Nov 23 06:40:58 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Nov 23 06:40:58 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 23 06:40:58 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 23 06:40:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 23 06:40:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 23 06:40:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 23 06:40:58 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 23 06:40:58 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Nov 23 06:40:58 localhost kernel: signal: max sigframe size: 1776
Nov 23 06:40:58 localhost kernel: BIOS-provided physical RAM map:
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 23 06:40:58 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Nov 23 06:40:58 localhost kernel: NX (Execute Disable) protection: active
Nov 23 06:40:58 localhost kernel: SMBIOS 2.8 present.
Nov 23 06:40:58 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 23 06:40:58 localhost kernel: Hypervisor detected: KVM
Nov 23 06:40:58 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 23 06:40:58 localhost kernel: kvm-clock: using sched offset of 2569527257 cycles
Nov 23 06:40:58 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 23 06:40:58 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 23 06:40:58 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 23 06:40:58 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 23 06:40:58 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Nov 23 06:40:58 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 23 06:40:58 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 23 06:40:58 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 23 06:40:58 localhost kernel: Using GB pages for direct mapping
Nov 23 06:40:58 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Nov 23 06:40:58 localhost kernel: ACPI: Early table checksum verification disabled
Nov 23 06:40:58 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 23 06:40:58 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 06:40:58 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 06:40:58 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 06:40:58 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 23 06:40:58 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 06:40:58 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 06:40:58 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 23 06:40:58 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 23 06:40:58 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 23 06:40:58 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 23 06:40:58 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 23 06:40:58 localhost kernel: No NUMA configuration found
Nov 23 06:40:58 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Nov 23 06:40:58 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Nov 23 06:40:58 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Nov 23 06:40:58 localhost kernel: Zone ranges:
Nov 23 06:40:58 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 23 06:40:58 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 23 06:40:58 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Nov 23 06:40:58 localhost kernel:   Device   empty
Nov 23 06:40:58 localhost kernel: Movable zone start for each node
Nov 23 06:40:58 localhost kernel: Early memory node ranges
Nov 23 06:40:58 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 23 06:40:58 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 23 06:40:58 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Nov 23 06:40:58 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Nov 23 06:40:58 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 23 06:40:58 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 23 06:40:58 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 23 06:40:58 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 23 06:40:58 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 23 06:40:58 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 23 06:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 23 06:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 23 06:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 23 06:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 23 06:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 23 06:40:58 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 23 06:40:58 localhost kernel: TSC deadline timer available
Nov 23 06:40:58 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 23 06:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 23 06:40:58 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 23 06:40:58 localhost kernel: Booting paravirtualized kernel on KVM
Nov 23 06:40:58 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 23 06:40:58 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 23 06:40:58 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Nov 23 06:40:58 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Nov 23 06:40:58 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 23 06:40:58 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 23 06:40:58 localhost kernel: Fallback order for Node 0: 0 
Nov 23 06:40:58 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Nov 23 06:40:58 localhost kernel: Policy zone: Normal
Nov 23 06:40:58 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 23 06:40:58 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Nov 23 06:40:58 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Nov 23 06:40:58 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 23 06:40:58 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 23 06:40:58 localhost kernel: software IO TLB: area num 8.
Nov 23 06:40:58 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Nov 23 06:40:58 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Nov 23 06:40:58 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 23 06:40:58 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Nov 23 06:40:58 localhost kernel: ftrace: allocated 176 pages with 3 groups
Nov 23 06:40:58 localhost kernel: Dynamic Preempt: voluntary
Nov 23 06:40:58 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 23 06:40:58 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 23 06:40:58 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 23 06:40:58 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 23 06:40:58 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 23 06:40:58 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 23 06:40:58 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 23 06:40:58 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 23 06:40:58 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 23 06:40:58 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 23 06:40:58 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Nov 23 06:40:58 localhost kernel: Console: colour VGA+ 80x25
Nov 23 06:40:58 localhost kernel: printk: console [tty0] enabled
Nov 23 06:40:58 localhost kernel: printk: console [ttyS0] enabled
Nov 23 06:40:58 localhost kernel: ACPI: Core revision 20211217
Nov 23 06:40:58 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 23 06:40:58 localhost kernel: x2apic enabled
Nov 23 06:40:58 localhost kernel: Switched APIC routing to physical x2apic.
Nov 23 06:40:58 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 23 06:40:58 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 23 06:40:58 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 23 06:40:58 localhost kernel: LSM: Security Framework initializing
Nov 23 06:40:58 localhost kernel: Yama: becoming mindful.
Nov 23 06:40:58 localhost kernel: SELinux:  Initializing.
Nov 23 06:40:58 localhost kernel: LSM support for eBPF active
Nov 23 06:40:58 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 23 06:40:58 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 23 06:40:58 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 23 06:40:58 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 23 06:40:58 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 23 06:40:58 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 23 06:40:58 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 23 06:40:58 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Nov 23 06:40:58 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Nov 23 06:40:58 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 23 06:40:58 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 23 06:40:58 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 23 06:40:58 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 23 06:40:58 localhost kernel: Freeing SMP alternatives memory: 36K
Nov 23 06:40:58 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 23 06:40:58 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Nov 23 06:40:58 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 23 06:40:58 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 23 06:40:58 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 23 06:40:58 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 23 06:40:58 localhost kernel: ... version:                0
Nov 23 06:40:58 localhost kernel: ... bit width:              48
Nov 23 06:40:58 localhost kernel: ... generic registers:      6
Nov 23 06:40:58 localhost kernel: ... value mask:             0000ffffffffffff
Nov 23 06:40:58 localhost kernel: ... max period:             00007fffffffffff
Nov 23 06:40:58 localhost kernel: ... fixed-purpose events:   0
Nov 23 06:40:58 localhost kernel: ... event mask:             000000000000003f
Nov 23 06:40:58 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 23 06:40:58 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 23 06:40:58 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 23 06:40:58 localhost kernel: x86: Booting SMP configuration:
Nov 23 06:40:58 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 23 06:40:58 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 23 06:40:58 localhost kernel: smpboot: Max logical packages: 8
Nov 23 06:40:58 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 23 06:40:58 localhost kernel: node 0 deferred pages initialised in 20ms
Nov 23 06:40:58 localhost kernel: devtmpfs: initialized
Nov 23 06:40:58 localhost kernel: x86/mm: Memory block size: 128MB
Nov 23 06:40:58 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 23 06:40:58 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 23 06:40:58 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 23 06:40:58 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 23 06:40:58 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Nov 23 06:40:58 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 23 06:40:58 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 23 06:40:58 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 23 06:40:58 localhost kernel: audit: type=2000 audit(1763880056.876:1): state=initialized audit_enabled=0 res=1
Nov 23 06:40:58 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 23 06:40:58 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 23 06:40:58 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 23 06:40:58 localhost kernel: cpuidle: using governor menu
Nov 23 06:40:58 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Nov 23 06:40:58 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 23 06:40:58 localhost kernel: PCI: Using configuration type 1 for base access
Nov 23 06:40:58 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 23 06:40:58 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 23 06:40:58 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Nov 23 06:40:58 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Nov 23 06:40:58 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Nov 23 06:40:58 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Nov 23 06:40:58 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Nov 23 06:40:58 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 23 06:40:58 localhost kernel: ACPI: Interpreter enabled
Nov 23 06:40:58 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 23 06:40:58 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 23 06:40:58 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 23 06:40:58 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 23 06:40:58 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 23 06:40:58 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 23 06:40:58 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [3] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [4] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [5] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [6] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [7] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [8] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [9] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [10] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [11] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [12] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [13] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [14] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [15] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [16] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [17] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [18] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [19] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [20] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [21] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [22] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [23] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [24] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [25] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [26] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [27] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [28] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [29] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [30] registered
Nov 23 06:40:58 localhost kernel: acpiphp: Slot [31] registered
Nov 23 06:40:58 localhost kernel: PCI host bridge to bus 0000:00
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Nov 23 06:40:58 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Nov 23 06:40:58 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Nov 23 06:40:58 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Nov 23 06:40:58 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Nov 23 06:40:58 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 23 06:40:58 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Nov 23 06:40:58 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Nov 23 06:40:58 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 23 06:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 23 06:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 23 06:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 23 06:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 23 06:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 23 06:40:58 localhost kernel: iommu: Default domain type: Translated 
Nov 23 06:40:58 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Nov 23 06:40:58 localhost kernel: SCSI subsystem initialized
Nov 23 06:40:58 localhost kernel: ACPI: bus type USB registered
Nov 23 06:40:58 localhost kernel: usbcore: registered new interface driver usbfs
Nov 23 06:40:58 localhost kernel: usbcore: registered new interface driver hub
Nov 23 06:40:58 localhost kernel: usbcore: registered new device driver usb
Nov 23 06:40:58 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 23 06:40:58 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 23 06:40:58 localhost kernel: PTP clock support registered
Nov 23 06:40:58 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 23 06:40:58 localhost kernel: NetLabel: Initializing
Nov 23 06:40:58 localhost kernel: NetLabel:  domain hash size = 128
Nov 23 06:40:58 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 23 06:40:58 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 23 06:40:58 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 23 06:40:58 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 23 06:40:58 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 23 06:40:58 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 23 06:40:58 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 23 06:40:58 localhost kernel: vgaarb: loaded
Nov 23 06:40:58 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 23 06:40:58 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 23 06:40:58 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 23 06:40:58 localhost kernel: pnp: PnP ACPI init
Nov 23 06:40:58 localhost kernel: pnp 00:03: [dma 2]
Nov 23 06:40:58 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 23 06:40:58 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 23 06:40:58 localhost kernel: NET: Registered PF_INET protocol family
Nov 23 06:40:58 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 23 06:40:58 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Nov 23 06:40:58 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 23 06:40:58 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 23 06:40:58 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 23 06:40:58 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Nov 23 06:40:58 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Nov 23 06:40:58 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 23 06:40:58 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 23 06:40:58 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 23 06:40:58 localhost kernel: NET: Registered PF_XDP protocol family
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 23 06:40:58 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 23 06:40:58 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 23 06:40:58 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 23 06:40:58 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27411 usecs
Nov 23 06:40:58 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 23 06:40:58 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 23 06:40:58 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 23 06:40:58 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 23 06:40:58 localhost kernel: ACPI: bus type thunderbolt registered
Nov 23 06:40:58 localhost kernel: Initialise system trusted keyrings
Nov 23 06:40:58 localhost kernel: Key type blacklist registered
Nov 23 06:40:58 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Nov 23 06:40:58 localhost kernel: zbud: loaded
Nov 23 06:40:58 localhost kernel: integrity: Platform Keyring initialized
Nov 23 06:40:58 localhost kernel: NET: Registered PF_ALG protocol family
Nov 23 06:40:58 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 23 06:40:58 localhost kernel: Key type asymmetric registered
Nov 23 06:40:58 localhost kernel: Asymmetric key parser 'x509' registered
Nov 23 06:40:58 localhost kernel: Running certificate verification selftests
Nov 23 06:40:58 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 23 06:40:58 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 23 06:40:58 localhost kernel: io scheduler mq-deadline registered
Nov 23 06:40:58 localhost kernel: io scheduler kyber registered
Nov 23 06:40:58 localhost kernel: io scheduler bfq registered
Nov 23 06:40:58 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 23 06:40:58 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 23 06:40:58 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 23 06:40:58 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 23 06:40:58 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 23 06:40:58 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 23 06:40:58 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 23 06:40:58 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 23 06:40:58 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 23 06:40:58 localhost kernel: Non-volatile memory driver v1.3
Nov 23 06:40:58 localhost kernel: rdac: device handler registered
Nov 23 06:40:58 localhost kernel: hp_sw: device handler registered
Nov 23 06:40:58 localhost kernel: emc: device handler registered
Nov 23 06:40:58 localhost kernel: alua: device handler registered
Nov 23 06:40:58 localhost kernel: libphy: Fixed MDIO Bus: probed
Nov 23 06:40:58 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Nov 23 06:40:58 localhost kernel: ehci-pci: EHCI PCI platform driver
Nov 23 06:40:58 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Nov 23 06:40:58 localhost kernel: ohci-pci: OHCI PCI platform driver
Nov 23 06:40:58 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Nov 23 06:40:58 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 23 06:40:58 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 23 06:40:58 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 23 06:40:58 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 23 06:40:58 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 23 06:40:58 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 23 06:40:58 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 23 06:40:58 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Nov 23 06:40:58 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 23 06:40:58 localhost kernel: hub 1-0:1.0: USB hub found
Nov 23 06:40:58 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 23 06:40:58 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 23 06:40:58 localhost kernel: usbserial: USB Serial support registered for generic
Nov 23 06:40:58 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 23 06:40:58 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 23 06:40:58 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 23 06:40:58 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 23 06:40:58 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 23 06:40:58 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 23 06:40:58 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 23 06:40:58 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-23T06:40:57 UTC (1763880057)
Nov 23 06:40:58 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 23 06:40:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 23 06:40:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 23 06:40:58 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 23 06:40:58 localhost kernel: usbcore: registered new interface driver usbhid
Nov 23 06:40:58 localhost kernel: usbhid: USB HID core driver
Nov 23 06:40:58 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 23 06:40:58 localhost kernel: Initializing XFRM netlink socket
Nov 23 06:40:58 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 23 06:40:58 localhost kernel: Segment Routing with IPv6
Nov 23 06:40:58 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 23 06:40:58 localhost kernel: mpls_gso: MPLS GSO support
Nov 23 06:40:58 localhost kernel: IPI shorthand broadcast: enabled
Nov 23 06:40:58 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 23 06:40:58 localhost kernel: AES CTR mode by8 optimization enabled
Nov 23 06:40:58 localhost kernel: sched_clock: Marking stable (745683194, 175432733)->(1040568262, -119452335)
Nov 23 06:40:58 localhost kernel: registered taskstats version 1
Nov 23 06:40:58 localhost kernel: Loading compiled-in X.509 certificates
Nov 23 06:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 23 06:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 23 06:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 23 06:40:58 localhost kernel: zswap: loaded using pool lzo/zbud
Nov 23 06:40:58 localhost kernel: page_owner is disabled
Nov 23 06:40:58 localhost kernel: Key type big_key registered
Nov 23 06:40:58 localhost kernel: Freeing initrd memory: 74232K
Nov 23 06:40:58 localhost kernel: Key type encrypted registered
Nov 23 06:40:58 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 23 06:40:58 localhost kernel: Loading compiled-in module X.509 certificates
Nov 23 06:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 23 06:40:58 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 23 06:40:58 localhost kernel: ima: No architecture policies found
Nov 23 06:40:58 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 23 06:40:58 localhost kernel: evm: Initialising EVM extended attributes:
Nov 23 06:40:58 localhost kernel: evm: security.selinux
Nov 23 06:40:58 localhost kernel: evm: security.SMACK64 (disabled)
Nov 23 06:40:58 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 23 06:40:58 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 23 06:40:58 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 23 06:40:58 localhost kernel: evm: security.apparmor (disabled)
Nov 23 06:40:58 localhost kernel: evm: security.ima
Nov 23 06:40:58 localhost kernel: evm: security.capability
Nov 23 06:40:58 localhost kernel: evm: HMAC attrs: 0x1
Nov 23 06:40:58 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 23 06:40:58 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 23 06:40:58 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 23 06:40:58 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 23 06:40:58 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 23 06:40:58 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 23 06:40:58 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 23 06:40:58 localhost kernel: Freeing unused decrypted memory: 2036K
Nov 23 06:40:58 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Nov 23 06:40:58 localhost kernel: Write protecting the kernel read-only data: 26624k
Nov 23 06:40:58 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Nov 23 06:40:58 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Nov 23 06:40:58 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 23 06:40:58 localhost kernel: Run /init as init process
Nov 23 06:40:58 localhost kernel:   with arguments:
Nov 23 06:40:58 localhost kernel:     /init
Nov 23 06:40:58 localhost kernel:   with environment:
Nov 23 06:40:58 localhost kernel:     HOME=/
Nov 23 06:40:58 localhost kernel:     TERM=linux
Nov 23 06:40:58 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Nov 23 06:40:58 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 06:40:58 localhost systemd[1]: Detected virtualization kvm.
Nov 23 06:40:58 localhost systemd[1]: Detected architecture x86-64.
Nov 23 06:40:58 localhost systemd[1]: Running in initrd.
Nov 23 06:40:58 localhost systemd[1]: No hostname configured, using default hostname.
Nov 23 06:40:58 localhost systemd[1]: Hostname set to <localhost>.
Nov 23 06:40:58 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 23 06:40:58 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 23 06:40:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 23 06:40:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 23 06:40:58 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 23 06:40:58 localhost systemd[1]: Reached target Local File Systems.
Nov 23 06:40:58 localhost systemd[1]: Reached target Path Units.
Nov 23 06:40:58 localhost systemd[1]: Reached target Slice Units.
Nov 23 06:40:58 localhost systemd[1]: Reached target Swaps.
Nov 23 06:40:58 localhost systemd[1]: Reached target Timer Units.
Nov 23 06:40:58 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 23 06:40:58 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 23 06:40:58 localhost systemd[1]: Listening on Journal Socket.
Nov 23 06:40:58 localhost systemd[1]: Listening on udev Control Socket.
Nov 23 06:40:58 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 23 06:40:58 localhost systemd[1]: Reached target Socket Units.
Nov 23 06:40:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 23 06:40:58 localhost systemd[1]: Starting Journal Service...
Nov 23 06:40:58 localhost systemd[1]: Starting Load Kernel Modules...
Nov 23 06:40:58 localhost systemd[1]: Starting Create System Users...
Nov 23 06:40:58 localhost systemd[1]: Starting Setup Virtual Console...
Nov 23 06:40:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 23 06:40:58 localhost systemd[1]: Finished Load Kernel Modules.
Nov 23 06:40:58 localhost systemd-journald[284]: Journal started
Nov 23 06:40:58 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/43895cafe6c247af84a56194e901da5c) is 8.0M, max 314.7M, 306.7M free.
Nov 23 06:40:58 localhost systemd-modules-load[285]: Module 'msr' is built in
Nov 23 06:40:58 localhost systemd[1]: Started Journal Service.
Nov 23 06:40:58 localhost systemd[1]: Finished Setup Virtual Console.
Nov 23 06:40:58 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 23 06:40:58 localhost systemd[1]: Starting dracut cmdline hook...
Nov 23 06:40:58 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 23 06:40:58 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Nov 23 06:40:58 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Nov 23 06:40:58 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Nov 23 06:40:58 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 23 06:40:58 localhost systemd[1]: Finished Create System Users.
Nov 23 06:40:58 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 23 06:40:58 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Nov 23 06:40:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 23 06:40:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 23 06:40:58 localhost dracut-cmdline[289]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 23 06:40:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 23 06:40:58 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 23 06:40:58 localhost systemd[1]: Finished dracut cmdline hook.
Nov 23 06:40:58 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 23 06:40:58 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 23 06:40:58 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 23 06:40:58 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Nov 23 06:40:58 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 23 06:40:58 localhost kernel: RPC: Registered udp transport module.
Nov 23 06:40:58 localhost kernel: RPC: Registered tcp transport module.
Nov 23 06:40:58 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 23 06:40:58 localhost rpc.statd[408]: Version 2.5.4 starting
Nov 23 06:40:58 localhost rpc.statd[408]: Initializing NSM state
Nov 23 06:40:58 localhost rpc.idmapd[413]: Setting log level to 0
Nov 23 06:40:58 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 23 06:40:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 06:40:58 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 06:40:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 06:40:58 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 23 06:40:58 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 23 06:40:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 23 06:40:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 23 06:40:58 localhost systemd[1]: Reached target System Initialization.
Nov 23 06:40:58 localhost systemd[1]: Reached target Basic System.
Nov 23 06:40:58 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 23 06:40:58 localhost systemd[1]: Reached target Network.
Nov 23 06:40:58 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 23 06:40:58 localhost systemd[1]: Starting dracut initqueue hook...
Nov 23 06:40:58 localhost kernel: libata version 3.00 loaded.
Nov 23 06:40:58 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Nov 23 06:40:58 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 23 06:40:58 localhost kernel: scsi host0: ata_piix
Nov 23 06:40:58 localhost kernel: scsi host1: ata_piix
Nov 23 06:40:58 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Nov 23 06:40:58 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Nov 23 06:40:58 localhost systemd-udevd[438]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 06:40:58 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Nov 23 06:40:58 localhost kernel: GPT:20971519 != 838860799
Nov 23 06:40:58 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Nov 23 06:40:58 localhost kernel: GPT:20971519 != 838860799
Nov 23 06:40:58 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Nov 23 06:40:58 localhost kernel:  vda: vda1 vda2 vda3 vda4
Nov 23 06:40:59 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 23 06:40:59 localhost systemd[1]: Reached target Initrd Root Device.
Nov 23 06:40:59 localhost kernel: ata1: found unknown device (class 0)
Nov 23 06:40:59 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 23 06:40:59 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 23 06:40:59 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 23 06:40:59 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 23 06:40:59 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 23 06:40:59 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 23 06:40:59 localhost systemd[1]: Finished dracut initqueue hook.
Nov 23 06:40:59 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 23 06:40:59 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 23 06:40:59 localhost systemd[1]: Reached target Remote File Systems.
Nov 23 06:40:59 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 23 06:40:59 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 23 06:40:59 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Nov 23 06:40:59 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Nov 23 06:40:59 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 23 06:40:59 localhost systemd[1]: Mounting /sysroot...
Nov 23 06:40:59 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 23 06:40:59 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Nov 23 06:40:59 localhost kernel: XFS (vda4): Ending clean mount
Nov 23 06:40:59 localhost systemd[1]: Mounted /sysroot.
Nov 23 06:40:59 localhost systemd[1]: Reached target Initrd Root File System.
Nov 23 06:40:59 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 23 06:40:59 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 23 06:40:59 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 23 06:40:59 localhost systemd[1]: Reached target Initrd File Systems.
Nov 23 06:40:59 localhost systemd[1]: Reached target Initrd Default Target.
Nov 23 06:40:59 localhost systemd[1]: Starting dracut mount hook...
Nov 23 06:40:59 localhost systemd[1]: Finished dracut mount hook.
Nov 23 06:40:59 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 23 06:40:59 localhost rpc.idmapd[413]: exiting on signal 15
Nov 23 06:40:59 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 23 06:40:59 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 23 06:40:59 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 23 06:40:59 localhost systemd[1]: Stopped target Network.
Nov 23 06:40:59 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 23 06:40:59 localhost systemd[1]: Stopped target Timer Units.
Nov 23 06:40:59 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 23 06:40:59 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 23 06:40:59 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 23 06:40:59 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 23 06:40:59 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 23 06:40:59 localhost systemd[1]: Stopped target Basic System.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Path Units.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Remote File Systems.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Slice Units.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Socket Units.
Nov 23 06:41:00 localhost systemd[1]: Stopped target System Initialization.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Local File Systems.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Swaps.
Nov 23 06:41:00 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped dracut mount hook.
Nov 23 06:41:00 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 23 06:41:00 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 23 06:41:00 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 23 06:41:00 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 23 06:41:00 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 23 06:41:00 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 23 06:41:00 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 23 06:41:00 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 23 06:41:00 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 23 06:41:00 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 23 06:41:00 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 23 06:41:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 23 06:41:00 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 23 06:41:00 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Closed udev Control Socket.
Nov 23 06:41:00 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Closed udev Kernel Socket.
Nov 23 06:41:00 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 23 06:41:00 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 23 06:41:00 localhost systemd[1]: Starting Cleanup udev Database...
Nov 23 06:41:00 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 23 06:41:00 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 23 06:41:00 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Stopped Create System Users.
Nov 23 06:41:00 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 23 06:41:00 localhost systemd[1]: Finished Cleanup udev Database.
Nov 23 06:41:00 localhost systemd[1]: Reached target Switch Root.
Nov 23 06:41:00 localhost systemd[1]: Starting Switch Root...
Nov 23 06:41:00 localhost systemd[1]: Switching root.
Nov 23 06:41:00 localhost systemd-journald[284]: Journal stopped
Nov 23 06:41:01 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Nov 23 06:41:01 localhost kernel: audit: type=1404 audit(1763880060.262:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability open_perms=1
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 06:41:01 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 06:41:01 localhost kernel: audit: type=1403 audit(1763880060.365:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 23 06:41:01 localhost systemd[1]: Successfully loaded SELinux policy in 107.169ms.
Nov 23 06:41:01 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.238ms.
Nov 23 06:41:01 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 06:41:01 localhost systemd[1]: Detected virtualization kvm.
Nov 23 06:41:01 localhost systemd[1]: Detected architecture x86-64.
Nov 23 06:41:01 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 06:41:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 06:41:01 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd[1]: Stopped Switch Root.
Nov 23 06:41:01 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 23 06:41:01 localhost systemd[1]: Created slice Slice /system/getty.
Nov 23 06:41:01 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 23 06:41:01 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 23 06:41:01 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 23 06:41:01 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Nov 23 06:41:01 localhost systemd[1]: Created slice User and Session Slice.
Nov 23 06:41:01 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 23 06:41:01 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 23 06:41:01 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 23 06:41:01 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 23 06:41:01 localhost systemd[1]: Stopped target Switch Root.
Nov 23 06:41:01 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 23 06:41:01 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 23 06:41:01 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 23 06:41:01 localhost systemd[1]: Reached target Path Units.
Nov 23 06:41:01 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 23 06:41:01 localhost systemd[1]: Reached target Slice Units.
Nov 23 06:41:01 localhost systemd[1]: Reached target Swaps.
Nov 23 06:41:01 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 23 06:41:01 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 23 06:41:01 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 23 06:41:01 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 23 06:41:01 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 23 06:41:01 localhost systemd[1]: Listening on udev Control Socket.
Nov 23 06:41:01 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 23 06:41:01 localhost systemd[1]: Mounting Huge Pages File System...
Nov 23 06:41:01 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 23 06:41:01 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 23 06:41:01 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 23 06:41:01 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 23 06:41:01 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 23 06:41:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 23 06:41:01 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 23 06:41:01 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 23 06:41:01 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 23 06:41:01 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 23 06:41:01 localhost systemd[1]: Stopped Journal Service.
Nov 23 06:41:01 localhost systemd[1]: Starting Journal Service...
Nov 23 06:41:01 localhost systemd[1]: Starting Load Kernel Modules...
Nov 23 06:41:01 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 23 06:41:01 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 23 06:41:01 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 23 06:41:01 localhost kernel: fuse: init (API version 7.36)
Nov 23 06:41:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 23 06:41:01 localhost systemd[1]: Mounted Huge Pages File System.
Nov 23 06:41:01 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 23 06:41:01 localhost systemd-journald[618]: Journal started
Nov 23 06:41:01 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/6e0090cd4cf296f54418e234b90f721c) is 8.0M, max 314.7M, 306.7M free.
Nov 23 06:41:00 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 23 06:41:00 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd-modules-load[619]: Module 'msr' is built in
Nov 23 06:41:01 localhost systemd[1]: Started Journal Service.
Nov 23 06:41:01 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 23 06:41:01 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 23 06:41:01 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 23 06:41:01 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 23 06:41:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 23 06:41:01 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 23 06:41:01 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 23 06:41:01 localhost kernel: ACPI: bus type drm_connector registered
Nov 23 06:41:01 localhost systemd[1]: Finished Load Kernel Modules.
Nov 23 06:41:01 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 23 06:41:01 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 23 06:41:01 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 23 06:41:01 localhost systemd[1]: Mounting FUSE Control File System...
Nov 23 06:41:01 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 23 06:41:01 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 23 06:41:01 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 23 06:41:01 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 23 06:41:01 localhost systemd[1]: Starting Load/Save Random Seed...
Nov 23 06:41:01 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 23 06:41:01 localhost systemd[1]: Starting Create System Users...
Nov 23 06:41:01 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/6e0090cd4cf296f54418e234b90f721c) is 8.0M, max 314.7M, 306.7M free.
Nov 23 06:41:01 localhost systemd-journald[618]: Received client request to flush runtime journal.
Nov 23 06:41:01 localhost systemd[1]: Mounted FUSE Control File System.
Nov 23 06:41:01 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 23 06:41:01 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 23 06:41:01 localhost systemd[1]: Finished Load/Save Random Seed.
Nov 23 06:41:01 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 23 06:41:01 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 23 06:41:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 23 06:41:01 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Nov 23 06:41:01 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Nov 23 06:41:01 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Nov 23 06:41:01 localhost systemd[1]: Finished Create System Users.
Nov 23 06:41:01 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 23 06:41:01 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 23 06:41:01 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 23 06:41:01 localhost systemd[1]: Set up automount EFI System Partition Automount.
Nov 23 06:41:01 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 23 06:41:01 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 06:41:01 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 06:41:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 06:41:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 23 06:41:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 06:41:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 23 06:41:01 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 23 06:41:01 localhost systemd-udevd[647]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 06:41:01 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Nov 23 06:41:01 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 23 06:41:01 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Nov 23 06:41:01 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Nov 23 06:41:01 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 23 06:41:01 localhost systemd-fsck[685]: fsck.fat 4.2 (2021-01-31)
Nov 23 06:41:01 localhost systemd-fsck[685]: /dev/vda2: 12 files, 1782/51145 clusters
Nov 23 06:41:01 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Nov 23 06:41:01 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 23 06:41:01 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 23 06:41:01 localhost kernel: Console: switching to colour dummy device 80x25
Nov 23 06:41:01 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 23 06:41:01 localhost kernel: [drm] features: -context_init
Nov 23 06:41:01 localhost kernel: [drm] number of scanouts: 1
Nov 23 06:41:01 localhost kernel: [drm] number of cap sets: 0
Nov 23 06:41:01 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Nov 23 06:41:01 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Nov 23 06:41:01 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 23 06:41:01 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 23 06:41:01 localhost kernel: SVM: TSC scaling supported
Nov 23 06:41:01 localhost kernel: kvm: Nested Virtualization enabled
Nov 23 06:41:01 localhost kernel: SVM: kvm: Nested Paging enabled
Nov 23 06:41:01 localhost kernel: SVM: LBR virtualization supported
Nov 23 06:41:02 localhost systemd[1]: Mounting /boot...
Nov 23 06:41:02 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Nov 23 06:41:02 localhost kernel: XFS (vda3): Ending clean mount
Nov 23 06:41:02 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Nov 23 06:41:02 localhost systemd[1]: Mounted /boot.
Nov 23 06:41:02 localhost systemd[1]: Mounting /boot/efi...
Nov 23 06:41:02 localhost systemd[1]: Mounted /boot/efi.
Nov 23 06:41:02 localhost systemd[1]: Reached target Local File Systems.
Nov 23 06:41:02 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 23 06:41:02 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 23 06:41:02 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 23 06:41:02 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 06:41:02 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 23 06:41:02 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 23 06:41:02 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 23 06:41:02 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 718 (bootctl)
Nov 23 06:41:02 localhost systemd[1]: Starting File System Check on /dev/vda2...
Nov 23 06:41:02 localhost systemd[1]: Finished File System Check on /dev/vda2.
Nov 23 06:41:02 localhost systemd[1]: Mounting EFI System Partition Automount...
Nov 23 06:41:02 localhost systemd[1]: Mounted EFI System Partition Automount.
Nov 23 06:41:02 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 23 06:41:02 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 23 06:41:02 localhost systemd[1]: Starting Security Auditing Service...
Nov 23 06:41:02 localhost systemd[1]: Starting RPC Bind...
Nov 23 06:41:02 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 23 06:41:02 localhost auditd[727]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Nov 23 06:41:02 localhost auditd[727]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Nov 23 06:41:02 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 23 06:41:02 localhost systemd[1]: Started RPC Bind.
Nov 23 06:41:02 localhost augenrules[732]: /sbin/augenrules: No change
Nov 23 06:41:02 localhost augenrules[742]: No rules
Nov 23 06:41:02 localhost augenrules[742]: enabled 1
Nov 23 06:41:02 localhost augenrules[742]: failure 1
Nov 23 06:41:02 localhost augenrules[742]: pid 727
Nov 23 06:41:02 localhost augenrules[742]: rate_limit 0
Nov 23 06:41:02 localhost augenrules[742]: backlog_limit 8192
Nov 23 06:41:02 localhost augenrules[742]: lost 0
Nov 23 06:41:02 localhost augenrules[742]: backlog 0
Nov 23 06:41:02 localhost augenrules[742]: backlog_wait_time 60000
Nov 23 06:41:02 localhost augenrules[742]: backlog_wait_time_actual 0
Nov 23 06:41:02 localhost systemd[1]: Started Security Auditing Service.
Nov 23 06:41:02 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 23 06:41:02 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 23 06:41:02 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 23 06:41:02 localhost systemd[1]: Starting Update is Completed...
Nov 23 06:41:02 localhost systemd[1]: Finished Update is Completed.
Nov 23 06:41:02 localhost systemd[1]: Reached target System Initialization.
Nov 23 06:41:02 localhost systemd[1]: Started dnf makecache --timer.
Nov 23 06:41:02 localhost systemd[1]: Started Daily rotation of log files.
Nov 23 06:41:02 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 23 06:41:02 localhost systemd[1]: Reached target Timer Units.
Nov 23 06:41:02 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 23 06:41:02 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 23 06:41:02 localhost systemd[1]: Reached target Socket Units.
Nov 23 06:41:02 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Nov 23 06:41:02 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 23 06:41:02 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 06:41:02 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 23 06:41:02 localhost systemd[1]: Reached target Basic System.
Nov 23 06:41:02 localhost systemd[1]: Starting NTP client/server...
Nov 23 06:41:02 localhost dbus-broker-lau[752]: Ready
Nov 23 06:41:02 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 23 06:41:02 localhost systemd[1]: Started irqbalance daemon.
Nov 23 06:41:02 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 23 06:41:02 localhost systemd[1]: Starting System Logging Service...
Nov 23 06:41:02 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 06:41:02 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 06:41:02 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 06:41:02 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 23 06:41:02 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 23 06:41:02 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 23 06:41:02 localhost systemd[1]: Starting User Login Management...
Nov 23 06:41:02 localhost rsyslogd[760]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="760" x-info="https://www.rsyslog.com"] start
Nov 23 06:41:02 localhost rsyslogd[760]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Nov 23 06:41:02 localhost systemd[1]: Started System Logging Service.
Nov 23 06:41:02 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 23 06:41:02 localhost chronyd[767]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 06:41:02 localhost chronyd[767]: Using right/UTC timezone to obtain leap second data
Nov 23 06:41:02 localhost chronyd[767]: Loaded seccomp filter (level 2)
Nov 23 06:41:02 localhost systemd[1]: Started NTP client/server.
Nov 23 06:41:02 localhost systemd-logind[761]: New seat seat0.
Nov 23 06:41:02 localhost systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 23 06:41:02 localhost systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 23 06:41:02 localhost systemd[1]: Started User Login Management.
Nov 23 06:41:02 localhost rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 06:41:03 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sun, 23 Nov 2025 06:41:03 +0000. Up 6.37 seconds.
Nov 23 06:41:03 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 23 06:41:03 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 23 06:41:03 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpqqzl606r.mount: Deactivated successfully.
Nov 23 06:41:03 localhost systemd[1]: Starting Hostname Service...
Nov 23 06:41:03 localhost systemd[1]: Started Hostname Service.
Nov 23 06:41:03 np0005532585.novalocal systemd-hostnamed[785]: Hostname set to <np0005532585.novalocal> (static)
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Reached target Preparation for Network.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting Network Manager...
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7770] NetworkManager (version 1.42.2-1.el9) is starting... (boot:2e694857-c83c-42a3-a300-fcad2ba2b06e)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7775] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Started Network Manager.
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7823] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Reached target Network.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7906] manager[0x5622fc1ef020]: monitoring kernel firmware directory '/lib/firmware'.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7977] hostname: hostname: using hostnamed
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7978] hostname: static hostname changed from (none) to "np0005532585.novalocal"
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.7994] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Reached target NFS client services.
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8123] manager[0x5622fc1ef020]: rfkill: Wi-Fi hardware radio set enabled
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8124] manager[0x5622fc1ef020]: rfkill: WWAN hardware radio set enabled
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Reached target Remote File Systems.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8212] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8213] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8222] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8223] manager: Networking is enabled by state file
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8270] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8271] settings: Loaded settings plugin: keyfile (internal)
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8329] dhcp: init: Using DHCP client 'internal'
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8335] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8373] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8389] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8405] device (lo): Activation: starting connection 'lo' (75f95e5d-82c2-442b-91f8-cfa8260985bf)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8422] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8434] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8500] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8519] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8520] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8522] device (eth0): carrier: link connected
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8539] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8542] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8547] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8552] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8553] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8555] manager: NetworkManager state is now CONNECTING
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8556] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8561] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8564] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8566] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8571] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8575] device (lo): Activation: successful, device activated.
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8616] dhcp4 (eth0): state changed new lease, address=38.102.83.198
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8619] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8637] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8650] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8651] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8654] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8657] device (eth0): Activation: successful, device activated.
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8660] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 23 06:41:03 np0005532585.novalocal NetworkManager[790]: <info>  [1763880063.8664] manager: startup complete
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 23 06:41:03 np0005532585.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: Cloud-init v. 22.1-9.el9 running 'init' at Sun, 23 Nov 2025 06:41:04 +0000. Up 7.28 seconds.
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |  eth0  | True |        38.102.83.198         | 255.255.255.0 | global | fa:16:3e:72:a3:51 |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |  eth0  | True | fe80::f816:3eff:fe72:a351/64 |       .       |  link  | fa:16:3e:72:a3:51 |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 23 06:41:04 np0005532585.novalocal cloud-init[944]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 23 06:41:04 np0005532585.novalocal systemd[1]: Starting Authorization Manager...
Nov 23 06:41:04 np0005532585.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 06:41:04 np0005532585.novalocal polkitd[1037]: Started polkitd version 0.117
Nov 23 06:41:04 np0005532585.novalocal polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Nov 23 06:41:04 np0005532585.novalocal polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 23 06:41:04 np0005532585.novalocal polkitd[1037]: Finished loading, compiling and executing 4 rules
Nov 23 06:41:04 np0005532585.novalocal systemd[1]: Started Authorization Manager.
Nov 23 06:41:04 np0005532585.novalocal polkitd[1037]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 23 06:41:06 np0005532585.novalocal useradd[1120]: new group: name=cloud-user, GID=1001
Nov 23 06:41:06 np0005532585.novalocal useradd[1120]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 23 06:41:06 np0005532585.novalocal useradd[1120]: add 'cloud-user' to group 'adm'
Nov 23 06:41:06 np0005532585.novalocal useradd[1120]: add 'cloud-user' to group 'systemd-journal'
Nov 23 06:41:06 np0005532585.novalocal useradd[1120]: add 'cloud-user' to shadow group 'adm'
Nov 23 06:41:06 np0005532585.novalocal useradd[1120]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Generating public/private rsa key pair.
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: The key fingerprint is:
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: SHA256:ZQrZ/pYycRFaGUbuepttPoytYaqMQuJ642qdOqq+xXI root@np0005532585.novalocal
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: The key's randomart image is:
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: +---[RSA 3072]----+
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |         .*o     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |       o =..     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |      o o =      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |       o = .     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |        S o      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |. o      = .     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |.+.E.   + B+     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: | +Bo o   B.==    |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |@B=o. o.. ++o.   |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: +----[SHA256]-----+
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Generating public/private ecdsa key pair.
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: The key fingerprint is:
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: SHA256:uLBD0n17Ltuhg1Ary+3sqo/DK6blrJs9Mbvc6BVR/xw root@np0005532585.novalocal
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: The key's randomart image is:
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: +---[ECDSA 256]---+
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |      .          |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |     . .         |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |    .   . E      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |   . o.. o .     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |  . =.o.S o      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |  ooo+.o .       |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: | ..=+=... o      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |.O==ooo o= .     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |B=%B+++ o+o      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: +----[SHA256]-----+
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Generating public/private ed25519 key pair.
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: The key fingerprint is:
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: SHA256:tYIAbgiV17JZ+ywyFV1S7S0smdEHVQupK5wep/i3maw root@np0005532585.novalocal
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: The key's randomart image is:
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: +--[ED25519 256]--+
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |..o. . ..ooo.+o..|
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |.o..o + ... o.o .|
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |. o..= o  .*.o . |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: | .  o.o. .+o+ .  |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |     ..oS o...   |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |    o . o* o     |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |     o .o =      |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |       . o..o    |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: |        .Eo=.    |
Nov 23 06:41:08 np0005532585.novalocal cloud-init[944]: +----[SHA256]-----+
Nov 23 06:41:08 np0005532585.novalocal sm-notify[1133]: Version 2.5.4 starting
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Nov 23 06:41:08 np0005532585.novalocal sshd[1134]: Server listening on 0.0.0.0 port 22.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 23 06:41:08 np0005532585.novalocal sshd[1134]: Server listening on :: port 22.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Reached target Network is Online.
Nov 23 06:41:08 np0005532585.novalocal crond[1141]: (CRON) STARTUP (1.5.7)
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Nov 23 06:41:08 np0005532585.novalocal crond[1141]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 23 06:41:08 np0005532585.novalocal sshd[1134]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Nov 23 06:41:08 np0005532585.novalocal crond[1141]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 37% if used.)
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 23 06:41:08 np0005532585.novalocal crond[1141]: (CRON) INFO (running with inotify support)
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 23 06:41:08 np0005532585.novalocal sshd[1145]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 23 06:41:08 np0005532585.novalocal sshd[1156]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting Permit User Sessions...
Nov 23 06:41:08 np0005532585.novalocal sshd[1156]: Unable to negotiate with 38.102.83.114 port 60982: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 23 06:41:08 np0005532585.novalocal sshd[1167]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Finished Permit User Sessions.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Started Command Scheduler.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Started Getty on tty1.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Reached target Login Prompts.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Reached target Multi-User System.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 23 06:41:08 np0005532585.novalocal sshd[1174]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal sshd[1174]: Unable to negotiate with 38.102.83.114 port 32776: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 23 06:41:08 np0005532585.novalocal sshd[1182]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal sshd[1182]: Unable to negotiate with 38.102.83.114 port 32780: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 23 06:41:08 np0005532585.novalocal kdumpctl[1137]: kdump: No kdump initial ramdisk found.
Nov 23 06:41:08 np0005532585.novalocal kdumpctl[1137]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Nov 23 06:41:08 np0005532585.novalocal sshd[1199]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal sshd[1145]: Connection closed by 38.102.83.114 port 60978 [preauth]
Nov 23 06:41:08 np0005532585.novalocal sshd[1212]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal sshd[1167]: Connection closed by 38.102.83.114 port 60996 [preauth]
Nov 23 06:41:08 np0005532585.novalocal cloud-init[1263]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sun, 23 Nov 2025 06:41:08 +0000. Up 11.65 seconds.
Nov 23 06:41:08 np0005532585.novalocal sshd[1262]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal sshd[1262]: fatal: mm_answer_sign: sign: error in libcrypto
Nov 23 06:41:08 np0005532585.novalocal sshd[1199]: Connection closed by 38.102.83.114 port 32786 [preauth]
Nov 23 06:41:08 np0005532585.novalocal sshd[1279]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:08 np0005532585.novalocal sshd[1279]: Unable to negotiate with 38.102.83.114 port 32820: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 23 06:41:08 np0005532585.novalocal sshd[1212]: Connection closed by 38.102.83.114 port 32796 [preauth]
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Nov 23 06:41:08 np0005532585.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Nov 23 06:41:08 np0005532585.novalocal chronyd[767]: Selected source 167.160.187.12 (2.rhel.pool.ntp.org)
Nov 23 06:41:08 np0005532585.novalocal chronyd[767]: System clock TAI offset set to 37 seconds
Nov 23 06:41:08 np0005532585.novalocal dracut[1436]: dracut-057-21.git20230214.el9
Nov 23 06:41:08 np0005532585.novalocal cloud-init[1456]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sun, 23 Nov 2025 06:41:08 +0000. Up 12.05 seconds.
Nov 23 06:41:08 np0005532585.novalocal dracut[1438]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1513]: #############################################################
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1518]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1532]: 256 SHA256:uLBD0n17Ltuhg1Ary+3sqo/DK6blrJs9Mbvc6BVR/xw root@np0005532585.novalocal (ECDSA)
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1540]: 256 SHA256:tYIAbgiV17JZ+ywyFV1S7S0smdEHVQupK5wep/i3maw root@np0005532585.novalocal (ED25519)
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1546]: 3072 SHA256:ZQrZ/pYycRFaGUbuepttPoytYaqMQuJ642qdOqq+xXI root@np0005532585.novalocal (RSA)
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1549]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1552]: #############################################################
Nov 23 06:41:09 np0005532585.novalocal cloud-init[1456]: Cloud-init v. 22.1-9.el9 finished at Sun, 23 Nov 2025 06:41:09 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.30 seconds
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 23 06:41:09 np0005532585.novalocal systemd[1]: Reloading Network Manager...
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 23 06:41:09 np0005532585.novalocal NetworkManager[790]: <info>  [1763880069.2572] audit: op="reload" arg="0" pid=1629 uid=0 result="success"
Nov 23 06:41:09 np0005532585.novalocal NetworkManager[790]: <info>  [1763880069.2580] config: signal: SIGHUP (no changes from disk)
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 23 06:41:09 np0005532585.novalocal systemd[1]: Reloaded Network Manager.
Nov 23 06:41:09 np0005532585.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Nov 23 06:41:09 np0005532585.novalocal systemd[1]: Reached target Cloud-init target.
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: memstrack is not available
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: memstrack is not available
Nov 23 06:41:09 np0005532585.novalocal dracut[1438]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: *** Including module: systemd ***
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: *** Including module: systemd-initrd ***
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: *** Including module: i18n ***
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: No KEYMAP configured.
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: *** Including module: drm ***
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: *** Including module: prefixdevname ***
Nov 23 06:41:10 np0005532585.novalocal dracut[1438]: *** Including module: kernel-modules ***
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]: *** Including module: kernel-modules-extra ***
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]: *** Including module: qemu ***
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]: *** Including module: fstab-sys ***
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]: *** Including module: rootfs-block ***
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]: *** Including module: terminfo ***
Nov 23 06:41:11 np0005532585.novalocal dracut[1438]: *** Including module: udev-rules ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: Skipping udev rule: 91-permissions.rules
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: virtiofs ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: dracut-systemd ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: usrmount ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: base ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: fs-lib ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: kdumpbase ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:   microcode_ctl module: mangling fw_dir
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 23 06:41:12 np0005532585.novalocal dracut[1438]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]: *** Including module: shutdown ***
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]: *** Including module: squash ***
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]: *** Including modules done ***
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]: *** Installing kernel module dependencies ***
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]: *** Installing kernel module dependencies done ***
Nov 23 06:41:13 np0005532585.novalocal dracut[1438]: *** Resolving executable dependencies ***
Nov 23 06:41:13 np0005532585.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 06:41:14 np0005532585.novalocal dracut[1438]: *** Resolving executable dependencies done ***
Nov 23 06:41:14 np0005532585.novalocal dracut[1438]: *** Hardlinking files ***
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Mode:           real
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Files:          1099
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Linked:         3 files
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Compared:       0 xattrs
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Compared:       373 files
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Saved:          61.04 KiB
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Duration:       0.048811 seconds
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: *** Hardlinking files done ***
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Could not find 'strip'. Not stripping the initramfs.
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: *** Generating early-microcode cpio image ***
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: *** Constructing AuthenticAMD.bin ***
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: *** Store current command line parameters ***
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: Stored kernel commandline:
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: No dracut internal kernel commandline stored in the initramfs
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: *** Install squash loader ***
Nov 23 06:41:15 np0005532585.novalocal dracut[1438]: *** Squashing the files inside the initramfs ***
Nov 23 06:41:17 np0005532585.novalocal dracut[1438]: *** Squashing the files inside the initramfs done ***
Nov 23 06:41:17 np0005532585.novalocal dracut[1438]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Nov 23 06:41:17 np0005532585.novalocal dracut[1438]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Nov 23 06:41:17 np0005532585.novalocal kdumpctl[1137]: kdump: kexec: loaded kdump kernel
Nov 23 06:41:17 np0005532585.novalocal kdumpctl[1137]: kdump: Starting kdump: [OK]
Nov 23 06:41:17 np0005532585.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 23 06:41:17 np0005532585.novalocal systemd[1]: Startup finished in 1.232s (kernel) + 2.196s (initrd) + 17.645s (userspace) = 21.075s.
Nov 23 06:41:33 np0005532585.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 06:41:43 np0005532585.novalocal sshd[4177]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:41:43 np0005532585.novalocal sshd[4177]: Accepted publickey for zuul from 38.102.83.114 port 38404 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 23 06:41:43 np0005532585.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 23 06:41:43 np0005532585.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 23 06:41:43 np0005532585.novalocal systemd-logind[761]: New session 1 of user zuul.
Nov 23 06:41:43 np0005532585.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 23 06:41:43 np0005532585.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Queued start job for default target Main User Target.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Created slice User Application Slice.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Reached target Paths.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Reached target Timers.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Starting D-Bus User Message Bus Socket...
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Starting Create User's Volatile Files and Directories...
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Finished Create User's Volatile Files and Directories.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Listening on D-Bus User Message Bus Socket.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Reached target Sockets.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Reached target Basic System.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Reached target Main User Target.
Nov 23 06:41:43 np0005532585.novalocal systemd[4181]: Startup finished in 135ms.
Nov 23 06:41:43 np0005532585.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 23 06:41:43 np0005532585.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 23 06:41:43 np0005532585.novalocal sshd[4177]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:41:43 np0005532585.novalocal python3[4233]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 06:41:53 np0005532585.novalocal python3[4251]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 06:42:00 np0005532585.novalocal python3[4304]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 06:42:01 np0005532585.novalocal python3[4334]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 23 06:42:04 np0005532585.novalocal python3[4350]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:04 np0005532585.novalocal python3[4364]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:06 np0005532585.novalocal python3[4423]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:06 np0005532585.novalocal python3[4464]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763880125.9268944-391-182221978638100/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa follow=False checksum=4877a9422cfa308f85d093f4f170aa8e2f5129bc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:07 np0005532585.novalocal python3[4537]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:08 np0005532585.novalocal python3[4578]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763880127.6385386-491-67572255985701/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa.pub follow=False checksum=9e6358c9dcdfe108c4f779a3b698bd3c9d97da46 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:10 np0005532585.novalocal python3[4606]: ansible-ping Invoked with data=pong
Nov 23 06:42:12 np0005532585.novalocal python3[4620]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 06:42:15 np0005532585.novalocal python3[4673]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 23 06:42:18 np0005532585.novalocal python3[4695]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:18 np0005532585.novalocal python3[4709]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:18 np0005532585.novalocal python3[4723]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:20 np0005532585.novalocal python3[4737]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:20 np0005532585.novalocal python3[4751]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:20 np0005532585.novalocal python3[4765]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:23 np0005532585.novalocal sudo[4780]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkscdmcvvycuvhsqvkdtnvfjvcflztmk ; /usr/bin/python3
Nov 23 06:42:23 np0005532585.novalocal sudo[4780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:23 np0005532585.novalocal python3[4782]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:23 np0005532585.novalocal sudo[4780]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:24 np0005532585.novalocal sudo[4828]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvloybbfhjqyxfiwpmwlabepuyhpobgz ; /usr/bin/python3
Nov 23 06:42:24 np0005532585.novalocal sudo[4828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:24 np0005532585.novalocal python3[4830]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:24 np0005532585.novalocal sudo[4828]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:25 np0005532585.novalocal sudo[4871]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyqlrygrhswogxatmfuqvbmqxpophibb ; /usr/bin/python3
Nov 23 06:42:25 np0005532585.novalocal sudo[4871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:25 np0005532585.novalocal python3[4873]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880144.6079686-102-52469507161254/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:25 np0005532585.novalocal sudo[4871]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:32 np0005532585.novalocal python3[4901]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:33 np0005532585.novalocal python3[4915]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:33 np0005532585.novalocal python3[4929]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:33 np0005532585.novalocal python3[4943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:33 np0005532585.novalocal python3[4957]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:34 np0005532585.novalocal python3[4971]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:34 np0005532585.novalocal python3[4985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:34 np0005532585.novalocal python3[4999]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:34 np0005532585.novalocal python3[5013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:35 np0005532585.novalocal python3[5027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:35 np0005532585.novalocal python3[5041]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:35 np0005532585.novalocal python3[5055]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:36 np0005532585.novalocal python3[5069]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:36 np0005532585.novalocal python3[5083]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:36 np0005532585.novalocal python3[5097]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:36 np0005532585.novalocal python3[5111]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:37 np0005532585.novalocal python3[5125]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:37 np0005532585.novalocal python3[5139]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:37 np0005532585.novalocal python3[5153]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:37 np0005532585.novalocal python3[5167]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:38 np0005532585.novalocal python3[5181]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:38 np0005532585.novalocal python3[5195]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:38 np0005532585.novalocal python3[5209]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:39 np0005532585.novalocal python3[5223]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:39 np0005532585.novalocal python3[5237]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:39 np0005532585.novalocal python3[5251]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:42:40 np0005532585.novalocal sudo[5265]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gojzkvegiogagttffdoyrlwmtmnidzts ; /usr/bin/python3
Nov 23 06:42:40 np0005532585.novalocal sudo[5265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:40 np0005532585.novalocal python3[5267]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 23 06:42:40 np0005532585.novalocal systemd[1]: Starting Time & Date Service...
Nov 23 06:42:41 np0005532585.novalocal systemd[1]: Started Time & Date Service.
Nov 23 06:42:41 np0005532585.novalocal systemd-timedated[5269]: Changed time zone to 'UTC' (UTC).
Nov 23 06:42:41 np0005532585.novalocal sudo[5265]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:42 np0005532585.novalocal sudo[5286]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btccfcjnwpfkxpqdusqglffhaipuhfit ; /usr/bin/python3
Nov 23 06:42:42 np0005532585.novalocal sudo[5286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:42 np0005532585.novalocal python3[5288]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:42 np0005532585.novalocal sudo[5286]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:43 np0005532585.novalocal python3[5334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:43 np0005532585.novalocal python3[5375]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763880163.269609-494-154951226261599/source _original_basename=tmp0pv_ioa6 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:45 np0005532585.novalocal python3[5435]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:45 np0005532585.novalocal python3[5476]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763880164.7656982-583-138643582818316/source _original_basename=tmp1tdcb_wm follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:46 np0005532585.novalocal sshd[5518]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:42:47 np0005532585.novalocal sudo[5537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsgtgwezhoazkybwtxpvimnpqqpwkkxu ; /usr/bin/python3
Nov 23 06:42:47 np0005532585.novalocal sudo[5537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:47 np0005532585.novalocal python3[5539]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:47 np0005532585.novalocal sudo[5537]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:47 np0005532585.novalocal sudo[5581]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xluzgmdtgxpdtldbbvccilwidddomoca ; /usr/bin/python3
Nov 23 06:42:47 np0005532585.novalocal sudo[5581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:47 np0005532585.novalocal python3[5583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763880166.8770776-728-49137748679588/source _original_basename=tmpdm6587bx follow=False checksum=fd315655f47fc5fc6a83eb387067284e4325b6ee backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:47 np0005532585.novalocal sudo[5581]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:48 np0005532585.novalocal sshd[5518]: Invalid user centos from 24.187.213.29 port 51168
Nov 23 06:42:48 np0005532585.novalocal python3[5611]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:42:48 np0005532585.novalocal sshd[5518]: Connection closed by invalid user centos 24.187.213.29 port 51168 [preauth]
Nov 23 06:42:48 np0005532585.novalocal python3[5627]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:42:49 np0005532585.novalocal sudo[5675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxtplytcccfhiglnqqpeiqtxvzzuibdj ; /usr/bin/python3
Nov 23 06:42:49 np0005532585.novalocal sudo[5675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:50 np0005532585.novalocal python3[5677]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:42:50 np0005532585.novalocal sudo[5675]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:50 np0005532585.novalocal sudo[5718]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxpjdixcpogngjqcamkcyrebvxuwbise ; /usr/bin/python3
Nov 23 06:42:50 np0005532585.novalocal sudo[5718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:50 np0005532585.novalocal python3[5720]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880169.7945817-853-200842931884732/source _original_basename=tmpolctoykt follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:42:50 np0005532585.novalocal sudo[5718]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:51 np0005532585.novalocal sudo[5749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdsuxbyqxzziugzhetieldsbsusjoveh ; /usr/bin/python3
Nov 23 06:42:51 np0005532585.novalocal sudo[5749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:42:51 np0005532585.novalocal python3[5751]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-656c-65aa-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:42:51 np0005532585.novalocal sudo[5749]: pam_unix(sudo:session): session closed for user root
Nov 23 06:42:52 np0005532585.novalocal python3[5769]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-656c-65aa-000000000024-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 23 06:42:54 np0005532585.novalocal python3[5787]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:43:11 np0005532585.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 06:43:14 np0005532585.novalocal sudo[5804]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ougxsueyxtchupuivukrklfewzlwlayk ; /usr/bin/python3
Nov 23 06:43:14 np0005532585.novalocal sudo[5804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:43:14 np0005532585.novalocal python3[5806]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:43:14 np0005532585.novalocal sudo[5804]: pam_unix(sudo:session): session closed for user root
Nov 23 06:44:14 np0005532585.novalocal sshd[4190]: Received disconnect from 38.102.83.114 port 38404:11: disconnected by user
Nov 23 06:44:14 np0005532585.novalocal sshd[4190]: Disconnected from user zuul 38.102.83.114 port 38404
Nov 23 06:44:14 np0005532585.novalocal sshd[4177]: pam_unix(sshd:session): session closed for user zuul
Nov 23 06:44:14 np0005532585.novalocal systemd-logind[761]: Session 1 logged out. Waiting for processes to exit.
Nov 23 06:44:36 np0005532585.novalocal systemd[4181]: Starting Mark boot as successful...
Nov 23 06:44:36 np0005532585.novalocal systemd[4181]: Finished Mark boot as successful.
Nov 23 06:45:03 np0005532585.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Nov 23 06:45:03 np0005532585.novalocal systemd[1]: efi.mount: Deactivated successfully.
Nov 23 06:45:03 np0005532585.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Nov 23 06:46:13 np0005532585.novalocal sshd[5812]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:46:13 np0005532585.novalocal sshd[5813]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:46:13 np0005532585.novalocal sshd[5813]: error: kex_exchange_identification: read: Connection reset by peer
Nov 23 06:46:13 np0005532585.novalocal sshd[5813]: Connection reset by 45.140.17.97 port 1476
Nov 23 06:46:42 np0005532585.novalocal sshd[5817]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:46:44 np0005532585.novalocal sshd[5817]: Invalid user test from 185.156.73.233 port 53308
Nov 23 06:46:44 np0005532585.novalocal sshd[5817]: Connection closed by invalid user test 185.156.73.233 port 53308 [preauth]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Nov 23 06:46:54 np0005532585.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Nov 23 06:46:55 np0005532585.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Nov 23 06:46:55 np0005532585.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0353] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 06:46:55 np0005532585.novalocal systemd-udevd[5819]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0536] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 23 06:46:55 np0005532585.novalocal systemd[4181]: Created slice User Background Tasks Slice.
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0578] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0583] device (eth1): carrier: link connected
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0586] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0592] policy: auto-activating connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd)
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0599] device (eth1): Activation: starting connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd)
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0601] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0605] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 23 06:46:55 np0005532585.novalocal systemd[4181]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0611] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 23 06:46:55 np0005532585.novalocal NetworkManager[790]: <info>  [1763880415.0616] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 06:46:55 np0005532585.novalocal systemd[4181]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 06:46:55 np0005532585.novalocal sshd[5823]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:46:56 np0005532585.novalocal sshd[5823]: Accepted publickey for zuul from 38.102.83.114 port 38222 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 06:46:56 np0005532585.novalocal systemd-logind[761]: New session 3 of user zuul.
Nov 23 06:46:56 np0005532585.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 23 06:46:56 np0005532585.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Nov 23 06:46:56 np0005532585.novalocal sshd[5823]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:46:56 np0005532585.novalocal python3[5840]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-7c02-8043-000000000408-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:47:09 np0005532585.novalocal sudo[5888]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqjmxebftiuchymrrsqgtpsgewwraoqc ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 23 06:47:09 np0005532585.novalocal sudo[5888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:47:09 np0005532585.novalocal python3[5890]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:47:09 np0005532585.novalocal sudo[5888]: pam_unix(sudo:session): session closed for user root
Nov 23 06:47:09 np0005532585.novalocal sudo[5931]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifnfjetknwjxhewjwuzlhrlpcurmhabh ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 23 06:47:09 np0005532585.novalocal sudo[5931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:47:10 np0005532585.novalocal python3[5933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763880429.3437893-486-217729593099398/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=985a126fed576251819cd9adb198f0955a128dcd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:47:10 np0005532585.novalocal sudo[5931]: pam_unix(sudo:session): session closed for user root
Nov 23 06:47:10 np0005532585.novalocal sudo[5961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdazospraukcajghuzjoftnjktkrqhio ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 23 06:47:10 np0005532585.novalocal sudo[5961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:47:10 np0005532585.novalocal python3[5963]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7113] caught SIGTERM, shutting down normally.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Stopping Network Manager...
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7219] dhcp4 (eth0): canceled DHCP transaction
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7221] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7221] dhcp4 (eth0): state changed no lease
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7226] manager: NetworkManager state is now CONNECTING
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7320] dhcp4 (eth1): canceled DHCP transaction
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7321] dhcp4 (eth1): state changed no lease
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[790]: <info>  [1763880430.7387] exiting (success)
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Stopped Network Manager.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: NetworkManager.service: Consumed 2.331s CPU time.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Starting Network Manager...
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.7919] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:2e694857-c83c-42a3-a300-fcad2ba2b06e)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.7923] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.7947] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Started Network Manager.
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8015] manager[0x55bcf6d8a090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Starting Hostname Service...
Nov 23 06:47:10 np0005532585.novalocal sudo[5961]: pam_unix(sudo:session): session closed for user root
Nov 23 06:47:10 np0005532585.novalocal systemd[1]: Started Hostname Service.
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8868] hostname: hostname: using hostnamed
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8869] hostname: static hostname changed from (none) to "np0005532585.novalocal"
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8875] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8882] manager[0x55bcf6d8a090]: rfkill: Wi-Fi hardware radio set enabled
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8882] manager[0x55bcf6d8a090]: rfkill: WWAN hardware radio set enabled
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8921] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8922] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8923] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8923] manager: Networking is enabled by state file
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8930] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8931] settings: Loaded settings plugin: keyfile (internal)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8984] dhcp: init: Using DHCP client 'internal'
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.8989] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9000] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9009] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9022] device (lo): Activation: starting connection 'lo' (75f95e5d-82c2-442b-91f8-cfa8260985bf)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9033] device (eth0): carrier: link connected
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9042] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9057] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9058] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9072] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9084] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9093] device (eth1): carrier: link connected
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9100] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9108] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd) (indicated)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9108] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9116] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9127] device (eth1): Activation: starting connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9156] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9164] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9170] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9174] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9179] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9183] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9187] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9192] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9201] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9206] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9225] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9231] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9256] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9291] dhcp4 (eth0): state changed new lease, address=38.102.83.198
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9302] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9311] device (lo): Activation: successful, device activated.
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9320] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9418] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9487] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9491] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9496] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9501] device (eth0): Activation: successful, device activated.
Nov 23 06:47:10 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880430.9507] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 23 06:47:11 np0005532585.novalocal python3[6036]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-7c02-8043-00000000012b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:47:21 np0005532585.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 06:47:40 np0005532585.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 06:47:55 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880475.8228] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:55 np0005532585.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 06:47:55 np0005532585.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 06:47:55 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880475.8443] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:55 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880475.8449] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 23 06:47:55 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880475.8461] device (eth1): Activation: successful, device activated.
Nov 23 06:47:55 np0005532585.novalocal NetworkManager[5975]: <info>  [1763880475.8470] manager: startup complete
Nov 23 06:47:55 np0005532585.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 23 06:48:05 np0005532585.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 06:48:11 np0005532585.novalocal sshd[5826]: Received disconnect from 38.102.83.114 port 38222:11: disconnected by user
Nov 23 06:48:11 np0005532585.novalocal sshd[5826]: Disconnected from user zuul 38.102.83.114 port 38222
Nov 23 06:48:11 np0005532585.novalocal sshd[5823]: pam_unix(sshd:session): session closed for user zuul
Nov 23 06:48:11 np0005532585.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 23 06:48:11 np0005532585.novalocal systemd[1]: session-3.scope: Consumed 1.492s CPU time.
Nov 23 06:48:11 np0005532585.novalocal systemd-logind[761]: Session 3 logged out. Waiting for processes to exit.
Nov 23 06:48:11 np0005532585.novalocal systemd-logind[761]: Removed session 3.
Nov 23 06:48:57 np0005532585.novalocal sshd[6063]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:48:57 np0005532585.novalocal sshd[6063]: Accepted publickey for zuul from 38.102.83.114 port 34628 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 06:48:57 np0005532585.novalocal systemd-logind[761]: New session 4 of user zuul.
Nov 23 06:48:57 np0005532585.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 23 06:48:57 np0005532585.novalocal sshd[6063]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:48:57 np0005532585.novalocal sudo[6112]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sonduzejzlkyzoocgoohzfmyqqcbrssb ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 23 06:48:57 np0005532585.novalocal sudo[6112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:48:58 np0005532585.novalocal python3[6114]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:48:58 np0005532585.novalocal sudo[6112]: pam_unix(sudo:session): session closed for user root
Nov 23 06:48:58 np0005532585.novalocal sudo[6155]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdsijnbltnutcyzwcmosdphaqwgaafju ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 23 06:48:58 np0005532585.novalocal sudo[6155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:48:58 np0005532585.novalocal python3[6157]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880537.7609437-628-269896251414129/source _original_basename=tmpe0spzpgd follow=False checksum=492625bb7c06d655281f511b293f3f3edc954e6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:48:58 np0005532585.novalocal sudo[6155]: pam_unix(sudo:session): session closed for user root
Nov 23 06:49:01 np0005532585.novalocal sshd[6063]: pam_unix(sshd:session): session closed for user zuul
Nov 23 06:49:01 np0005532585.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 23 06:49:01 np0005532585.novalocal systemd-logind[761]: Session 4 logged out. Waiting for processes to exit.
Nov 23 06:49:01 np0005532585.novalocal systemd-logind[761]: Removed session 4.
Nov 23 06:49:59 np0005532585.novalocal sshd[6173]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:50:00 np0005532585.novalocal sshd[6173]: Received disconnect from 187.72.57.82 port 36578:11: Bye Bye [preauth]
Nov 23 06:50:00 np0005532585.novalocal sshd[6173]: Disconnected from authenticating user root 187.72.57.82 port 36578 [preauth]
Nov 23 06:52:49 np0005532585.novalocal sshd[6177]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:52:51 np0005532585.novalocal sshd[6177]: Received disconnect from 34.124.148.87 port 46224:11: Bye Bye [preauth]
Nov 23 06:52:51 np0005532585.novalocal sshd[6177]: Disconnected from authenticating user root 34.124.148.87 port 46224 [preauth]
Nov 23 06:53:07 np0005532585.novalocal sshd[6179]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:53:08 np0005532585.novalocal sshd[6179]: Received disconnect from 187.72.57.82 port 57961:11: Bye Bye [preauth]
Nov 23 06:53:08 np0005532585.novalocal sshd[6179]: Disconnected from authenticating user root 187.72.57.82 port 57961 [preauth]
Nov 23 06:54:43 np0005532585.novalocal sshd[6181]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:54:44 np0005532585.novalocal sshd[6181]: Invalid user ec2-user from 187.72.57.82 port 43525
Nov 23 06:54:45 np0005532585.novalocal sshd[6181]: Received disconnect from 187.72.57.82 port 43525:11: Bye Bye [preauth]
Nov 23 06:54:45 np0005532585.novalocal sshd[6181]: Disconnected from invalid user ec2-user 187.72.57.82 port 43525 [preauth]
Nov 23 06:54:45 np0005532585.novalocal sshd[6184]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:54:45 np0005532585.novalocal sshd[6184]: Accepted publickey for zuul from 38.102.83.114 port 57894 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 06:54:45 np0005532585.novalocal systemd-logind[761]: New session 5 of user zuul.
Nov 23 06:54:45 np0005532585.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 23 06:54:45 np0005532585.novalocal sshd[6184]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:54:45 np0005532585.novalocal sudo[6201]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsdtoaimmpoaihwpeezxypoanrhnxxnb ; /usr/bin/python3
Nov 23 06:54:45 np0005532585.novalocal sudo[6201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:45 np0005532585.novalocal python3[6203]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-9f63-9a03-000000001cfc-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:54:46 np0005532585.novalocal sudo[6201]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:47 np0005532585.novalocal sudo[6220]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcdpubvhcnpxvnvkjatdbpehitwampkm ; /usr/bin/python3
Nov 23 06:54:47 np0005532585.novalocal sudo[6220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:47 np0005532585.novalocal python3[6222]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:54:47 np0005532585.novalocal sudo[6220]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:47 np0005532585.novalocal sudo[6236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnexodbudrxyvzpundgiyiajigisewrn ; /usr/bin/python3
Nov 23 06:54:47 np0005532585.novalocal sudo[6236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:47 np0005532585.novalocal python3[6238]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:54:47 np0005532585.novalocal sudo[6236]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:47 np0005532585.novalocal sudo[6252]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztmhfgapjuckhjjzprexnzxrvkuozucw ; /usr/bin/python3
Nov 23 06:54:47 np0005532585.novalocal sudo[6252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:47 np0005532585.novalocal python3[6254]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:54:47 np0005532585.novalocal sudo[6252]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:48 np0005532585.novalocal sudo[6268]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlwjsmhmdxjnnxxsiccenglkpvteuerd ; /usr/bin/python3
Nov 23 06:54:48 np0005532585.novalocal sudo[6268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:48 np0005532585.novalocal python3[6270]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:54:48 np0005532585.novalocal sudo[6268]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:48 np0005532585.novalocal sudo[6284]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvdrjfddrzghuzjfmylfutsdicaljliu ; /usr/bin/python3
Nov 23 06:54:48 np0005532585.novalocal sudo[6284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:48 np0005532585.novalocal python3[6286]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:54:48 np0005532585.novalocal sudo[6284]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:49 np0005532585.novalocal sudo[6332]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfdbftsdxcgsqhsqahhhmksxszgzxreg ; /usr/bin/python3
Nov 23 06:54:49 np0005532585.novalocal sudo[6332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:50 np0005532585.novalocal python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 06:54:50 np0005532585.novalocal sudo[6332]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:50 np0005532585.novalocal sudo[6375]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkmfocuvqpqiivizcouovsiiyyfwhkop ; /usr/bin/python3
Nov 23 06:54:50 np0005532585.novalocal sudo[6375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:50 np0005532585.novalocal python3[6377]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880889.8740711-642-221984898280675/source _original_basename=tmppfvd9hk7 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 06:54:50 np0005532585.novalocal sudo[6375]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:51 np0005532585.novalocal sudo[6405]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwiwfuhfljbapwwxlfauufersbthuyvk ; /usr/bin/python3
Nov 23 06:54:51 np0005532585.novalocal sudo[6405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:52 np0005532585.novalocal python3[6407]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 06:54:52 np0005532585.novalocal systemd[1]: Reloading.
Nov 23 06:54:52 np0005532585.novalocal systemd-rc-local-generator[6427]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 06:54:52 np0005532585.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 06:54:52 np0005532585.novalocal sudo[6405]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:53 np0005532585.novalocal sudo[6451]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdhsmwcdfunormmsdaxgpeljcnskbtfa ; /usr/bin/python3
Nov 23 06:54:53 np0005532585.novalocal sudo[6451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:53 np0005532585.novalocal python3[6453]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 23 06:54:53 np0005532585.novalocal sudo[6451]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:54 np0005532585.novalocal sudo[6467]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmhavghbhnaoalucblwlrpkeqviammnf ; /usr/bin/python3
Nov 23 06:54:54 np0005532585.novalocal sudo[6467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:54 np0005532585.novalocal python3[6469]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:54:54 np0005532585.novalocal sudo[6467]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:55 np0005532585.novalocal sudo[6485]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgcdrkpnsfljzmfkqdhlmyzzvbymxfuq ; /usr/bin/python3
Nov 23 06:54:55 np0005532585.novalocal sudo[6485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:55 np0005532585.novalocal python3[6487]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:54:55 np0005532585.novalocal sudo[6485]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:55 np0005532585.novalocal sudo[6503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neeefykdmjutkzsmpbtwdqmzcclzoger ; /usr/bin/python3
Nov 23 06:54:55 np0005532585.novalocal sudo[6503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:55 np0005532585.novalocal python3[6505]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:54:55 np0005532585.novalocal sudo[6503]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:55 np0005532585.novalocal sudo[6521]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgktyygmqznnlmcdycxapugmmuqifpsr ; /usr/bin/python3
Nov 23 06:54:55 np0005532585.novalocal sudo[6521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:54:55 np0005532585.novalocal python3[6523]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:54:55 np0005532585.novalocal sudo[6521]: pam_unix(sudo:session): session closed for user root
Nov 23 06:54:57 np0005532585.novalocal python3[6540]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ef9-e89a-9f63-9a03-000000001d03-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:54:57 np0005532585.novalocal python3[6560]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 06:55:00 np0005532585.novalocal sshd[6184]: pam_unix(sshd:session): session closed for user zuul
Nov 23 06:55:00 np0005532585.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 23 06:55:00 np0005532585.novalocal systemd[1]: session-5.scope: Consumed 3.969s CPU time.
Nov 23 06:55:00 np0005532585.novalocal systemd-logind[761]: Session 5 logged out. Waiting for processes to exit.
Nov 23 06:55:00 np0005532585.novalocal systemd-logind[761]: Removed session 5.
Nov 23 06:55:47 np0005532585.novalocal sshd[6566]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:55:49 np0005532585.novalocal sshd[6566]: Invalid user reza from 34.124.148.87 port 42210
Nov 23 06:55:49 np0005532585.novalocal sshd[6566]: Received disconnect from 34.124.148.87 port 42210:11: Bye Bye [preauth]
Nov 23 06:55:49 np0005532585.novalocal sshd[6566]: Disconnected from invalid user reza 34.124.148.87 port 42210 [preauth]
Nov 23 06:56:14 np0005532585.novalocal sshd[6569]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:56:14 np0005532585.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Nov 23 06:56:14 np0005532585.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 23 06:56:14 np0005532585.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Nov 23 06:56:14 np0005532585.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 23 06:56:14 np0005532585.novalocal sshd[6569]: Accepted publickey for zuul from 38.102.83.114 port 38732 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 06:56:14 np0005532585.novalocal systemd-logind[761]: New session 6 of user zuul.
Nov 23 06:56:14 np0005532585.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 23 06:56:14 np0005532585.novalocal sshd[6569]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:56:14 np0005532585.novalocal sudo[6588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqfowyxuitpmxchikxycrgonpuitaidr ; /usr/bin/python3
Nov 23 06:56:14 np0005532585.novalocal sudo[6588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:56:15 np0005532585.novalocal systemd[1]: Starting RHSM dbus service...
Nov 23 06:56:15 np0005532585.novalocal systemd[1]: Started RHSM dbus service.
Nov 23 06:56:15 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 06:56:15 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 06:56:15 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 06:56:15 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005532585.novalocal (805f8986-28c6-49b9-9c1d-56700caa6ca2)
Nov 23 06:56:16 np0005532585.novalocal subscription-manager[6595]: Registered system with identity: 805f8986-28c6-49b9-9c1d-56700caa6ca2
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.entcertlib:131] certs updated:
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]: Total updates: 1
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]: Found (local) serial# []
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]: Expected (UEP) serial# [5107503348184781599]
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]: Added (new)
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]:   [sn:5107503348184781599 ( Content Access,) @ /etc/pki/entitlement/5107503348184781599.pem]
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]: Deleted (rogue):
Nov 23 06:56:16 np0005532585.novalocal rhsm-service[6595]:   <NONE>
Nov 23 06:56:16 np0005532585.novalocal subscription-manager[6595]: Added subscription for 'Content Access' contract 'None'
Nov 23 06:56:16 np0005532585.novalocal subscription-manager[6595]: Added subscription for product ' Content Access'
Nov 23 06:56:17 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 06:56:17 np0005532585.novalocal rhsm-service[6595]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 06:56:18 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:56:18 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:56:18 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:56:18 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:56:18 np0005532585.novalocal sshd[6667]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:56:18 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:56:19 np0005532585.novalocal sudo[6588]: pam_unix(sudo:session): session closed for user root
Nov 23 06:56:19 np0005532585.novalocal sshd[6667]: Received disconnect from 187.72.57.82 port 57325:11: Bye Bye [preauth]
Nov 23 06:56:19 np0005532585.novalocal sshd[6667]: Disconnected from authenticating user root 187.72.57.82 port 57325 [preauth]
Nov 23 06:56:26 np0005532585.novalocal python3[6688]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-d6e0-fafb-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 06:56:27 np0005532585.novalocal sudo[6705]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exsihxjfmxgcgbjnojnerrblxjhyrnrf ; /usr/bin/python3
Nov 23 06:56:27 np0005532585.novalocal sudo[6705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:56:27 np0005532585.novalocal python3[6707]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 06:56:34 np0005532585.novalocal sshd[6714]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:56:37 np0005532585.novalocal sshd[6714]: Connection closed by authenticating user nobody 72.179.206.6 port 44922 [preauth]
Nov 23 06:56:58 np0005532585.novalocal setsebool[6784]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 23 06:56:58 np0005532585.novalocal setsebool[6784]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  Converting 406 SID table entries...
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 06:57:07 np0005532585.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 06:57:20 np0005532585.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Nov 23 06:57:21 np0005532585.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 06:57:21 np0005532585.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 23 06:57:21 np0005532585.novalocal systemd[1]: Reloading.
Nov 23 06:57:21 np0005532585.novalocal systemd-rc-local-generator[7619]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 06:57:21 np0005532585.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 06:57:21 np0005532585.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 06:57:22 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:57:22 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 06:57:22 np0005532585.novalocal sudo[6705]: pam_unix(sudo:session): session closed for user root
Nov 23 06:57:30 np0005532585.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 06:57:30 np0005532585.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 23 06:57:30 np0005532585.novalocal systemd[1]: man-db-cache-update.service: Consumed 10.932s CPU time.
Nov 23 06:57:30 np0005532585.novalocal systemd[1]: run-r33d149de15094182a0e3920cebbfb370.service: Deactivated successfully.
Nov 23 06:57:34 np0005532585.novalocal sshd[18363]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:57:36 np0005532585.novalocal sshd[18363]: Received disconnect from 34.124.148.87 port 52922:11: Bye Bye [preauth]
Nov 23 06:57:36 np0005532585.novalocal sshd[18363]: Disconnected from authenticating user root 34.124.148.87 port 52922 [preauth]
Nov 23 06:57:56 np0005532585.novalocal sshd[18365]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:57:57 np0005532585.novalocal sshd[18365]: Received disconnect from 187.72.57.82 port 42891:11: Bye Bye [preauth]
Nov 23 06:57:57 np0005532585.novalocal sshd[18365]: Disconnected from authenticating user root 187.72.57.82 port 42891 [preauth]
Nov 23 06:58:00 np0005532585.novalocal sshd[18367]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:02 np0005532585.novalocal sshd[18367]: Invalid user postgres from 185.156.73.233 port 37870
Nov 23 06:58:02 np0005532585.novalocal sshd[18367]: Connection closed by invalid user postgres 185.156.73.233 port 37870 [preauth]
Nov 23 06:58:07 np0005532585.novalocal sshd[18369]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:14 np0005532585.novalocal sudo[18383]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avsrndnmvppcmewzxakgxsxeieghhxfp ; /usr/bin/python3
Nov 23 06:58:14 np0005532585.novalocal sudo[18383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:58:14 np0005532585.novalocal systemd[1]: var-lib-containers-storage-overlay-compat3400401224-merged.mount: Deactivated successfully.
Nov 23 06:58:14 np0005532585.novalocal systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1223709566-merged.mount: Deactivated successfully.
Nov 23 06:58:14 np0005532585.novalocal podman[18386]: 2025-11-23 06:58:14.36612783 +0000 UTC m=+0.110736362 system refresh
Nov 23 06:58:14 np0005532585.novalocal sudo[18383]: pam_unix(sudo:session): session closed for user root
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: Starting D-Bus User Message Bus...
Nov 23 06:58:15 np0005532585.novalocal dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 23 06:58:15 np0005532585.novalocal dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: Started D-Bus User Message Bus.
Nov 23 06:58:15 np0005532585.novalocal dbus-broker-lau[18442]: Ready
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: Created slice Slice /user.
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: podman-18426.scope: unit configures an IP firewall, but not running as root.
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: (This warning is only shown for the first unit using IP firewalling.)
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: Started podman-18426.scope.
Nov 23 06:58:15 np0005532585.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 06:58:15 np0005532585.novalocal systemd[4181]: Started podman-pause-1ee92a94.scope.
Nov 23 06:58:17 np0005532585.novalocal sshd[6569]: pam_unix(sshd:session): session closed for user zuul
Nov 23 06:58:17 np0005532585.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Nov 23 06:58:17 np0005532585.novalocal systemd[1]: session-6.scope: Consumed 51.522s CPU time.
Nov 23 06:58:17 np0005532585.novalocal systemd-logind[761]: Session 6 logged out. Waiting for processes to exit.
Nov 23 06:58:17 np0005532585.novalocal systemd-logind[761]: Removed session 6.
Nov 23 06:58:33 np0005532585.novalocal sshd[18446]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:33 np0005532585.novalocal sshd[18449]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:33 np0005532585.novalocal sshd[18447]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:33 np0005532585.novalocal sshd[18450]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:33 np0005532585.novalocal sshd[18448]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:33 np0005532585.novalocal sshd[18449]: Connection closed by 38.102.83.233 port 40768 [preauth]
Nov 23 06:58:33 np0005532585.novalocal sshd[18448]: Connection closed by 38.102.83.233 port 40752 [preauth]
Nov 23 06:58:33 np0005532585.novalocal sshd[18446]: Unable to negotiate with 38.102.83.233 port 40782: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 23 06:58:33 np0005532585.novalocal sshd[18447]: Unable to negotiate with 38.102.83.233 port 40788: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 23 06:58:33 np0005532585.novalocal sshd[18450]: Unable to negotiate with 38.102.83.233 port 40802: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 23 06:58:38 np0005532585.novalocal sshd[18456]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:58:38 np0005532585.novalocal sshd[18456]: Accepted publickey for zuul from 38.102.83.114 port 54756 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 06:58:38 np0005532585.novalocal systemd-logind[761]: New session 7 of user zuul.
Nov 23 06:58:38 np0005532585.novalocal systemd[1]: Started Session 7 of User zuul.
Nov 23 06:58:38 np0005532585.novalocal sshd[18456]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 06:58:39 np0005532585.novalocal python3[18473]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFp5ffFMuM9GICal8e/QjT+yRcsIbGaMWRt/HA7rb1TB5YKChpgkSmzIFogHU4gX8uce12LB+CRf7ndL6kzcKrg= zuul@np0005532578.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:58:39 np0005532585.novalocal sudo[18487]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfuamyzarwiobyllnlulbunqfjarsgew ; /usr/bin/python3
Nov 23 06:58:39 np0005532585.novalocal sudo[18487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 06:58:39 np0005532585.novalocal python3[18489]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFp5ffFMuM9GICal8e/QjT+yRcsIbGaMWRt/HA7rb1TB5YKChpgkSmzIFogHU4gX8uce12LB+CRf7ndL6kzcKrg= zuul@np0005532578.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 06:58:39 np0005532585.novalocal sudo[18487]: pam_unix(sudo:session): session closed for user root
Nov 23 06:58:41 np0005532585.novalocal sshd[18456]: pam_unix(sshd:session): session closed for user zuul
Nov 23 06:58:41 np0005532585.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Nov 23 06:58:41 np0005532585.novalocal systemd-logind[761]: Session 7 logged out. Waiting for processes to exit.
Nov 23 06:58:41 np0005532585.novalocal systemd-logind[761]: Removed session 7.
Nov 23 06:59:17 np0005532585.novalocal sshd[18490]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:59:19 np0005532585.novalocal sshd[18490]: Invalid user user from 34.124.148.87 port 48626
Nov 23 06:59:19 np0005532585.novalocal sshd[18490]: Received disconnect from 34.124.148.87 port 48626:11: Bye Bye [preauth]
Nov 23 06:59:19 np0005532585.novalocal sshd[18490]: Disconnected from invalid user user 34.124.148.87 port 48626 [preauth]
Nov 23 06:59:34 np0005532585.novalocal sshd[18492]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 06:59:35 np0005532585.novalocal sshd[18492]: Received disconnect from 187.72.57.82 port 56690:11: Bye Bye [preauth]
Nov 23 06:59:35 np0005532585.novalocal sshd[18492]: Disconnected from authenticating user root 187.72.57.82 port 56690 [preauth]
Nov 23 07:00:07 np0005532585.novalocal sshd[18369]: fatal: Timeout before authentication for 98.181.137.75 port 35406
Nov 23 07:00:08 np0005532585.novalocal sshd[18495]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:00:08 np0005532585.novalocal sshd[18495]: Accepted publickey for zuul from 38.102.83.114 port 44062 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:00:08 np0005532585.novalocal systemd-logind[761]: New session 8 of user zuul.
Nov 23 07:00:08 np0005532585.novalocal systemd[1]: Started Session 8 of User zuul.
Nov 23 07:00:08 np0005532585.novalocal sshd[18495]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:00:08 np0005532585.novalocal sudo[18512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maawxmgundszsfnlxawbuwgplrurzokt ; /usr/bin/python3
Nov 23 07:00:08 np0005532585.novalocal sudo[18512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:08 np0005532585.novalocal python3[18514]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 07:00:08 np0005532585.novalocal sudo[18512]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:09 np0005532585.novalocal sudo[18528]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcgjcpxkfyozwijvsgentpxxlzqkewjd ; /usr/bin/python3
Nov 23 07:00:09 np0005532585.novalocal sudo[18528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:09 np0005532585.novalocal python3[18530]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 23 07:00:09 np0005532585.novalocal sudo[18528]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:10 np0005532585.novalocal sudo[18578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcpkivxnfwcdtbxbkdnruzxownqqhywl ; /usr/bin/python3
Nov 23 07:00:10 np0005532585.novalocal sudo[18578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:11 np0005532585.novalocal python3[18580]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:00:11 np0005532585.novalocal sudo[18578]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:11 np0005532585.novalocal sudo[18621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxozvvapcolmwlmbbxhwupwfasxjdjmb ; /usr/bin/python3
Nov 23 07:00:11 np0005532585.novalocal sudo[18621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:11 np0005532585.novalocal python3[18623]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763881210.7678437-135-250098084774401/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa follow=False checksum=4877a9422cfa308f85d093f4f170aa8e2f5129bc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:00:11 np0005532585.novalocal sudo[18621]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:12 np0005532585.novalocal sudo[18683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldvmussodlkjjrlfybymwdzxkfamopmr ; /usr/bin/python3
Nov 23 07:00:12 np0005532585.novalocal sudo[18683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:12 np0005532585.novalocal python3[18685]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:00:12 np0005532585.novalocal sudo[18683]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:12 np0005532585.novalocal sudo[18726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twtejzqmdovftrbjetrwpksnamjigtbp ; /usr/bin/python3
Nov 23 07:00:12 np0005532585.novalocal sudo[18726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:12 np0005532585.novalocal python3[18728]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763881212.3622243-224-85225766420296/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa.pub follow=False checksum=9e6358c9dcdfe108c4f779a3b698bd3c9d97da46 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:00:12 np0005532585.novalocal sudo[18726]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:14 np0005532585.novalocal sudo[18756]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecyawfcfnquqzrswalmazgvqtyfkwhji ; /usr/bin/python3
Nov 23 07:00:14 np0005532585.novalocal sudo[18756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:00:14 np0005532585.novalocal python3[18758]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:00:14 np0005532585.novalocal sudo[18756]: pam_unix(sudo:session): session closed for user root
Nov 23 07:00:15 np0005532585.novalocal python3[18804]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:00:16 np0005532585.novalocal python3[18820]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpxjho5avh recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:00:17 np0005532585.novalocal python3[18880]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:00:17 np0005532585.novalocal python3[18896]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpiq_mqcca recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:00:19 np0005532585.novalocal python3[18956]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:00:19 np0005532585.novalocal python3[18972]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmp4ipv2jsb recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:00:19 np0005532585.novalocal sshd[18495]: pam_unix(sshd:session): session closed for user zuul
Nov 23 07:00:19 np0005532585.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Nov 23 07:00:19 np0005532585.novalocal systemd[1]: session-8.scope: Consumed 3.400s CPU time.
Nov 23 07:00:19 np0005532585.novalocal systemd-logind[761]: Session 8 logged out. Waiting for processes to exit.
Nov 23 07:00:19 np0005532585.novalocal systemd-logind[761]: Removed session 8.
Nov 23 07:00:21 np0005532585.novalocal sshd[18988]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:00:22 np0005532585.novalocal sshd[18988]: Invalid user support from 78.128.112.74 port 38312
Nov 23 07:00:22 np0005532585.novalocal sshd[18988]: Connection closed by invalid user support 78.128.112.74 port 38312 [preauth]
Nov 23 07:00:55 np0005532585.novalocal sshd[18990]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:00:56 np0005532585.novalocal sshd[18990]: Invalid user abbott from 34.124.148.87 port 37846
Nov 23 07:00:56 np0005532585.novalocal sshd[18990]: Received disconnect from 34.124.148.87 port 37846:11: Bye Bye [preauth]
Nov 23 07:00:56 np0005532585.novalocal sshd[18990]: Disconnected from invalid user abbott 34.124.148.87 port 37846 [preauth]
Nov 23 07:01:01 np0005532585.novalocal CROND[18993]: (root) CMD (run-parts /etc/cron.hourly)
Nov 23 07:01:01 np0005532585.novalocal run-parts[18996]: (/etc/cron.hourly) starting 0anacron
Nov 23 07:01:01 np0005532585.novalocal anacron[19004]: Anacron started on 2025-11-23
Nov 23 07:01:01 np0005532585.novalocal anacron[19004]: Will run job `cron.daily' in 6 min.
Nov 23 07:01:01 np0005532585.novalocal anacron[19004]: Will run job `cron.weekly' in 26 min.
Nov 23 07:01:01 np0005532585.novalocal anacron[19004]: Will run job `cron.monthly' in 46 min.
Nov 23 07:01:01 np0005532585.novalocal anacron[19004]: Jobs will be executed sequentially
Nov 23 07:01:01 np0005532585.novalocal run-parts[19006]: (/etc/cron.hourly) finished 0anacron
Nov 23 07:01:01 np0005532585.novalocal CROND[18992]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 23 07:01:10 np0005532585.novalocal sshd[19008]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:01:11 np0005532585.novalocal sshd[19008]: Received disconnect from 187.72.57.82 port 42256:11: Bye Bye [preauth]
Nov 23 07:01:11 np0005532585.novalocal sshd[19008]: Disconnected from authenticating user root 187.72.57.82 port 42256 [preauth]
Nov 23 07:02:26 np0005532585.novalocal sshd[19010]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:02:28 np0005532585.novalocal sshd[19010]: Invalid user rancher from 34.124.148.87 port 38876
Nov 23 07:02:28 np0005532585.novalocal sshd[19010]: Received disconnect from 34.124.148.87 port 38876:11: Bye Bye [preauth]
Nov 23 07:02:28 np0005532585.novalocal sshd[19010]: Disconnected from invalid user rancher 34.124.148.87 port 38876 [preauth]
Nov 23 07:02:32 np0005532585.novalocal sshd[19012]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:02:32 np0005532585.novalocal sshd[19012]: Accepted publickey for zuul from 38.102.83.233 port 36272 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:02:32 np0005532585.novalocal systemd-logind[761]: New session 9 of user zuul.
Nov 23 07:02:32 np0005532585.novalocal systemd[1]: Started Session 9 of User zuul.
Nov 23 07:02:32 np0005532585.novalocal sshd[19012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:02:33 np0005532585.novalocal python3[19058]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:02:41 np0005532585.novalocal sshd[19060]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:02:42 np0005532585.novalocal sshd[19060]: Invalid user opsadmin from 187.72.57.82 port 56053
Nov 23 07:02:42 np0005532585.novalocal sshd[19060]: Received disconnect from 187.72.57.82 port 56053:11: Bye Bye [preauth]
Nov 23 07:02:42 np0005532585.novalocal sshd[19060]: Disconnected from invalid user opsadmin 187.72.57.82 port 56053 [preauth]
Nov 23 07:04:03 np0005532585.novalocal sshd[19063]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:04:04 np0005532585.novalocal sshd[19063]: Invalid user sanjay from 34.124.148.87 port 59348
Nov 23 07:04:04 np0005532585.novalocal sshd[19063]: Received disconnect from 34.124.148.87 port 59348:11: Bye Bye [preauth]
Nov 23 07:04:04 np0005532585.novalocal sshd[19063]: Disconnected from invalid user sanjay 34.124.148.87 port 59348 [preauth]
Nov 23 07:04:15 np0005532585.novalocal sshd[19066]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:04:16 np0005532585.novalocal sshd[19066]: Invalid user backupuser from 187.72.57.82 port 41618
Nov 23 07:04:16 np0005532585.novalocal sshd[19066]: Received disconnect from 187.72.57.82 port 41618:11: Bye Bye [preauth]
Nov 23 07:04:16 np0005532585.novalocal sshd[19066]: Disconnected from invalid user backupuser 187.72.57.82 port 41618 [preauth]
Nov 23 07:04:35 np0005532585.novalocal sshd[19068]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:04:36 np0005532585.novalocal sshd[19068]: Invalid user admin from 80.94.95.116 port 23272
Nov 23 07:04:36 np0005532585.novalocal sshd[19068]: Connection closed by invalid user admin 80.94.95.116 port 23272 [preauth]
Nov 23 07:05:38 np0005532585.novalocal sshd[19070]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:05:39 np0005532585.novalocal sshd[19070]: Invalid user test from 34.124.148.87 port 34038
Nov 23 07:05:39 np0005532585.novalocal sshd[19070]: Received disconnect from 34.124.148.87 port 34038:11: Bye Bye [preauth]
Nov 23 07:05:39 np0005532585.novalocal sshd[19070]: Disconnected from invalid user test 34.124.148.87 port 34038 [preauth]
Nov 23 07:05:52 np0005532585.novalocal sshd[19072]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:05:53 np0005532585.novalocal sshd[19072]: Received disconnect from 187.72.57.82 port 55415:11: Bye Bye [preauth]
Nov 23 07:05:53 np0005532585.novalocal sshd[19072]: Disconnected from authenticating user root 187.72.57.82 port 55415 [preauth]
Nov 23 07:06:56 np0005532585.novalocal sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:07:00 np0005532585.novalocal sshd[19074]: Invalid user test from 49.124.152.144 port 56542
Nov 23 07:07:00 np0005532585.novalocal sshd[19074]: Connection closed by invalid user test 49.124.152.144 port 56542 [preauth]
Nov 23 07:07:01 np0005532585.novalocal anacron[19004]: Job `cron.daily' started
Nov 23 07:07:01 np0005532585.novalocal anacron[19004]: Job `cron.daily' terminated
Nov 23 07:07:09 np0005532585.novalocal sshd[19078]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:07:10 np0005532585.novalocal sshd[19078]: Invalid user suporte from 34.124.148.87 port 32868
Nov 23 07:07:11 np0005532585.novalocal sshd[19078]: Received disconnect from 34.124.148.87 port 32868:11: Bye Bye [preauth]
Nov 23 07:07:11 np0005532585.novalocal sshd[19078]: Disconnected from invalid user suporte 34.124.148.87 port 32868 [preauth]
Nov 23 07:07:24 np0005532585.novalocal sshd[19080]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:07:26 np0005532585.novalocal sshd[19080]: Received disconnect from 187.72.57.82 port 40979:11: Bye Bye [preauth]
Nov 23 07:07:26 np0005532585.novalocal sshd[19080]: Disconnected from authenticating user root 187.72.57.82 port 40979 [preauth]
Nov 23 07:07:33 np0005532585.novalocal sshd[19015]: Received disconnect from 38.102.83.233 port 36272:11: disconnected by user
Nov 23 07:07:33 np0005532585.novalocal sshd[19015]: Disconnected from user zuul 38.102.83.233 port 36272
Nov 23 07:07:33 np0005532585.novalocal sshd[19012]: pam_unix(sshd:session): session closed for user zuul
Nov 23 07:07:33 np0005532585.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Nov 23 07:07:33 np0005532585.novalocal systemd-logind[761]: Session 9 logged out. Waiting for processes to exit.
Nov 23 07:07:33 np0005532585.novalocal systemd-logind[761]: Removed session 9.
Nov 23 07:08:39 np0005532585.novalocal sshd[19084]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:08:41 np0005532585.novalocal sshd[19084]: Received disconnect from 34.124.148.87 port 41348:11: Bye Bye [preauth]
Nov 23 07:08:41 np0005532585.novalocal sshd[19084]: Disconnected from authenticating user root 34.124.148.87 port 41348 [preauth]
Nov 23 07:09:00 np0005532585.novalocal sshd[19086]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:09:01 np0005532585.novalocal sshd[19086]: Received disconnect from 187.72.57.82 port 54777:11: Bye Bye [preauth]
Nov 23 07:09:01 np0005532585.novalocal sshd[19086]: Disconnected from authenticating user root 187.72.57.82 port 54777 [preauth]
Nov 23 07:10:15 np0005532585.novalocal sshd[19088]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:10:16 np0005532585.novalocal sshd[19088]: Invalid user showroom from 34.124.148.87 port 37688
Nov 23 07:10:17 np0005532585.novalocal sshd[19088]: Received disconnect from 34.124.148.87 port 37688:11: Bye Bye [preauth]
Nov 23 07:10:17 np0005532585.novalocal sshd[19088]: Disconnected from invalid user showroom 34.124.148.87 port 37688 [preauth]
Nov 23 07:10:41 np0005532585.novalocal sshd[19090]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:10:42 np0005532585.novalocal sshd[19090]: Received disconnect from 187.72.57.82 port 40343:11: Bye Bye [preauth]
Nov 23 07:10:42 np0005532585.novalocal sshd[19090]: Disconnected from authenticating user root 187.72.57.82 port 40343 [preauth]
Nov 23 07:11:54 np0005532585.novalocal sshd[19092]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:11:56 np0005532585.novalocal sshd[19092]: Invalid user user from 34.124.148.87 port 33350
Nov 23 07:11:56 np0005532585.novalocal sshd[19092]: Received disconnect from 34.124.148.87 port 33350:11: Bye Bye [preauth]
Nov 23 07:11:56 np0005532585.novalocal sshd[19092]: Disconnected from invalid user user 34.124.148.87 port 33350 [preauth]
Nov 23 07:12:23 np0005532585.novalocal sshd[19095]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:12:24 np0005532585.novalocal sshd[19095]: Invalid user ubuntu from 187.72.57.82 port 54141
Nov 23 07:12:24 np0005532585.novalocal sshd[19095]: Received disconnect from 187.72.57.82 port 54141:11: Bye Bye [preauth]
Nov 23 07:12:24 np0005532585.novalocal sshd[19095]: Disconnected from invalid user ubuntu 187.72.57.82 port 54141 [preauth]
Nov 23 07:13:38 np0005532585.novalocal sshd[19098]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:13:38 np0005532585.novalocal sshd[19100]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:13:38 np0005532585.novalocal sshd[19098]: Accepted publickey for zuul from 38.102.83.114 port 58512 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:13:38 np0005532585.novalocal systemd-logind[761]: New session 10 of user zuul.
Nov 23 07:13:38 np0005532585.novalocal systemd[1]: Started Session 10 of User zuul.
Nov 23 07:13:38 np0005532585.novalocal sshd[19098]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:13:38 np0005532585.novalocal python3[19117]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:13:40 np0005532585.novalocal sshd[19100]: Received disconnect from 34.124.148.87 port 33778:11: Bye Bye [preauth]
Nov 23 07:13:40 np0005532585.novalocal sshd[19100]: Disconnected from authenticating user root 34.124.148.87 port 33778 [preauth]
Nov 23 07:13:40 np0005532585.novalocal sudo[19135]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjxhbgqzjcuupzxjrhqytrwvbysadkcv ; /usr/bin/python3
Nov 23 07:13:40 np0005532585.novalocal sudo[19135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:13:40 np0005532585.novalocal python3[19137]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:13:42 np0005532585.novalocal sudo[19135]: pam_unix(sudo:session): session closed for user root
Nov 23 07:13:45 np0005532585.novalocal sudo[19154]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynidlrxgycxehvkalnjdrpqrkcbwqjfh ; /usr/bin/python3
Nov 23 07:13:45 np0005532585.novalocal sudo[19154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:13:45 np0005532585.novalocal python3[19156]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Nov 23 07:13:48 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:13:48 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:13:59 np0005532585.novalocal sshd[19291]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:14:00 np0005532585.novalocal sshd[19291]: Invalid user helen from 187.72.57.82 port 39705
Nov 23 07:14:00 np0005532585.novalocal sshd[19291]: Received disconnect from 187.72.57.82 port 39705:11: Bye Bye [preauth]
Nov 23 07:14:00 np0005532585.novalocal sshd[19291]: Disconnected from invalid user helen 187.72.57.82 port 39705 [preauth]
Nov 23 07:14:07 np0005532585.novalocal sshd[19301]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:14:13 np0005532585.novalocal sudo[19154]: pam_unix(sudo:session): session closed for user root
Nov 23 07:14:26 np0005532585.novalocal sshd[19301]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 07:14:26 np0005532585.novalocal sshd[19301]: Connection closed by 221.234.10.211 port 53494
Nov 23 07:14:42 np0005532585.novalocal sudo[19315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbfcqmebebuoboxdpofurrkwtyzxbssy ; /usr/bin/python3
Nov 23 07:14:42 np0005532585.novalocal sudo[19315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:14:42 np0005532585.novalocal python3[19317]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Nov 23 07:14:45 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:14:45 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:14:47 np0005532585.novalocal sudo[19315]: pam_unix(sudo:session): session closed for user root
Nov 23 07:14:53 np0005532585.novalocal sshd[19443]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:14:53 np0005532585.novalocal sudo[19457]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwyhyxrsdbmvnncyrrlzdogqyhsycgkx ; /usr/bin/python3
Nov 23 07:14:53 np0005532585.novalocal sudo[19457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:14:53 np0005532585.novalocal python3[19459]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Nov 23 07:14:54 np0005532585.novalocal sshd[19443]: Connection closed by authenticating user root 80.94.95.116 port 21894 [preauth]
Nov 23 07:14:56 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:14:57 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:02 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:02 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:09 np0005532585.novalocal sudo[19457]: pam_unix(sudo:session): session closed for user root
Nov 23 07:15:12 np0005532585.novalocal sshd[19839]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:15:14 np0005532585.novalocal sshd[19839]: Received disconnect from 34.124.148.87 port 43706:11: Bye Bye [preauth]
Nov 23 07:15:14 np0005532585.novalocal sshd[19839]: Disconnected from authenticating user root 34.124.148.87 port 43706 [preauth]
Nov 23 07:15:25 np0005532585.novalocal sudo[19854]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olwnzzgapzxyjkacfvnlqyxesephaecs ; /usr/bin/python3
Nov 23 07:15:25 np0005532585.novalocal systemd[1]: Starting dnf makecache...
Nov 23 07:15:25 np0005532585.novalocal sudo[19854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:15:25 np0005532585.novalocal python3[19857]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 23 07:15:25 np0005532585.novalocal dnf[19856]: Updating Subscription Management repositories.
Nov 23 07:15:27 np0005532585.novalocal dnf[19856]: Failed determining last makecache time.
Nov 23 07:15:27 np0005532585.novalocal dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  44 MB/s |  24 MB     00:00
Nov 23 07:15:28 np0005532585.novalocal dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - High Av 6.0 MB/s | 2.5 MB     00:00
Nov 23 07:15:28 np0005532585.novalocal dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   62 MB/s |  44 MB     00:00
Nov 23 07:15:29 np0005532585.novalocal dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   27 MB/s |  14 MB     00:00
Nov 23 07:15:29 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:29 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:30 np0005532585.novalocal dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  57 MB/s |  42 MB     00:00
Nov 23 07:15:30 np0005532585.novalocal dnf[19856]: Metadata cache created.
Nov 23 07:15:30 np0005532585.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 23 07:15:30 np0005532585.novalocal systemd[1]: Finished dnf makecache.
Nov 23 07:15:30 np0005532585.novalocal systemd[1]: dnf-makecache.service: Consumed 3.822s CPU time.
Nov 23 07:15:34 np0005532585.novalocal sshd[20006]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:15:34 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:35 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:35 np0005532585.novalocal sshd[20006]: Received disconnect from 187.72.57.82 port 53513:11: Bye Bye [preauth]
Nov 23 07:15:35 np0005532585.novalocal sshd[20006]: Disconnected from authenticating user root 187.72.57.82 port 53513 [preauth]
Nov 23 07:15:41 np0005532585.novalocal sudo[19854]: pam_unix(sudo:session): session closed for user root
Nov 23 07:15:56 np0005532585.novalocal sudo[20214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mimogouyyqsobejuvncxhurwikyoauqx ; /usr/bin/python3
Nov 23 07:15:56 np0005532585.novalocal sudo[20214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:15:56 np0005532585.novalocal python3[20216]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 23 07:15:59 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:15:59 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:16:04 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:16:04 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:16:12 np0005532585.novalocal sudo[20214]: pam_unix(sudo:session): session closed for user root
Nov 23 07:16:30 np0005532585.novalocal sudo[20493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdhwxiugpdrhylcezkfzxrdcguobfnoz ; /usr/bin/python3
Nov 23 07:16:30 np0005532585.novalocal sudo[20493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:16:30 np0005532585.novalocal python3[20495]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000013-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:16:32 np0005532585.novalocal sudo[20493]: pam_unix(sudo:session): session closed for user root
Nov 23 07:16:35 np0005532585.novalocal sudo[20512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjhhpomycdxfhnaxlwowoyynputuykez ; /usr/bin/python3
Nov 23 07:16:35 np0005532585.novalocal sudo[20512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:16:35 np0005532585.novalocal python3[20514]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:16:49 np0005532585.novalocal sshd[20589]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:16:50 np0005532585.novalocal sshd[20589]: Invalid user viktor from 34.124.148.87 port 56864
Nov 23 07:16:50 np0005532585.novalocal sshd[20589]: Received disconnect from 34.124.148.87 port 56864:11: Bye Bye [preauth]
Nov 23 07:16:50 np0005532585.novalocal sshd[20589]: Disconnected from invalid user viktor 34.124.148.87 port 56864 [preauth]
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  Converting 489 SID table entries...
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:16:53 np0005532585.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:16:53 np0005532585.novalocal groupadd[20600]: group added to /etc/group: name=unbound, GID=987
Nov 23 07:16:53 np0005532585.novalocal groupadd[20600]: group added to /etc/gshadow: name=unbound
Nov 23 07:16:53 np0005532585.novalocal groupadd[20600]: new group: name=unbound, GID=987
Nov 23 07:16:53 np0005532585.novalocal useradd[20607]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Nov 23 07:16:53 np0005532585.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Nov 23 07:16:53 np0005532585.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 23 07:16:53 np0005532585.novalocal groupadd[20620]: group added to /etc/group: name=openvswitch, GID=986
Nov 23 07:16:53 np0005532585.novalocal groupadd[20620]: group added to /etc/gshadow: name=openvswitch
Nov 23 07:16:53 np0005532585.novalocal groupadd[20620]: new group: name=openvswitch, GID=986
Nov 23 07:16:53 np0005532585.novalocal useradd[20627]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Nov 23 07:16:53 np0005532585.novalocal groupadd[20635]: group added to /etc/group: name=hugetlbfs, GID=985
Nov 23 07:16:53 np0005532585.novalocal groupadd[20635]: group added to /etc/gshadow: name=hugetlbfs
Nov 23 07:16:53 np0005532585.novalocal groupadd[20635]: new group: name=hugetlbfs, GID=985
Nov 23 07:16:53 np0005532585.novalocal usermod[20643]: add 'openvswitch' to group 'hugetlbfs'
Nov 23 07:16:53 np0005532585.novalocal usermod[20643]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 23 07:16:57 np0005532585.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:16:57 np0005532585.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 23 07:16:57 np0005532585.novalocal systemd[1]: Reloading.
Nov 23 07:16:57 np0005532585.novalocal systemd-sysv-generator[21177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:16:57 np0005532585.novalocal systemd-rc-local-generator[21174]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:16:57 np0005532585.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:16:57 np0005532585.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 07:16:58 np0005532585.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 07:16:58 np0005532585.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 23 07:16:58 np0005532585.novalocal systemd[1]: run-rbcb47c8f41e24d0192b20b7757a07e71.service: Deactivated successfully.
Nov 23 07:16:58 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:16:58 np0005532585.novalocal sudo[20512]: pam_unix(sudo:session): session closed for user root
Nov 23 07:16:58 np0005532585.novalocal rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:17:13 np0005532585.novalocal sshd[21811]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:17:14 np0005532585.novalocal sshd[21811]: Invalid user p from 187.72.57.82 port 39079
Nov 23 07:17:15 np0005532585.novalocal sshd[21811]: Received disconnect from 187.72.57.82 port 39079:11: Bye Bye [preauth]
Nov 23 07:17:15 np0005532585.novalocal sshd[21811]: Disconnected from invalid user p 187.72.57.82 port 39079 [preauth]
Nov 23 07:17:27 np0005532585.novalocal sudo[21827]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vluhrdatsknwliizmxbwktxsfkwlwdtc ; /usr/bin/python3
Nov 23 07:17:27 np0005532585.novalocal sudo[21827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:27 np0005532585.novalocal python3[21829]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000015-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:17:41 np0005532585.novalocal sudo[21827]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:54 np0005532585.novalocal sudo[21847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhumxilqtvwylpkiihxwchndqzjzwyzg ; /usr/bin/python3
Nov 23 07:17:54 np0005532585.novalocal sudo[21847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:55 np0005532585.novalocal python3[21849]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:17:55 np0005532585.novalocal sudo[21847]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:55 np0005532585.novalocal sudo[21895]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqenyytpvcakibjkftkbhmbmivytiahw ; /usr/bin/python3
Nov 23 07:17:55 np0005532585.novalocal sudo[21895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:55 np0005532585.novalocal python3[21897]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:17:55 np0005532585.novalocal sudo[21895]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:56 np0005532585.novalocal sudo[21938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmhccezsmntednprttolccmpxqcqnigq ; /usr/bin/python3
Nov 23 07:17:56 np0005532585.novalocal sudo[21938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:56 np0005532585.novalocal python3[21940]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763882275.5478146-292-75817046235764/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=9333f42ac4b9baf349a5c32f7bcba3335b5912e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:17:56 np0005532585.novalocal sudo[21938]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:57 np0005532585.novalocal sudo[21968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcearkcdchfexdrgixfozjarwnfknorw ; /usr/bin/python3
Nov 23 07:17:57 np0005532585.novalocal sudo[21968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:57 np0005532585.novalocal python3[21970]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 07:17:57 np0005532585.novalocal sudo[21968]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:57 np0005532585.novalocal systemd-journald[618]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Nov 23 07:17:57 np0005532585.novalocal systemd-journald[618]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 07:17:57 np0005532585.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 07:17:57 np0005532585.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 07:17:57 np0005532585.novalocal sudo[21989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hynwhlcowxqnjcnoiiuunfqmgkxcjtvv ; /usr/bin/python3
Nov 23 07:17:57 np0005532585.novalocal sudo[21989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:57 np0005532585.novalocal python3[21991]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 07:17:58 np0005532585.novalocal sudo[21989]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:59 np0005532585.novalocal sudo[22009]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzodruhytwbcwdyvfnrgasjxevaqlmrd ; /usr/bin/python3
Nov 23 07:17:59 np0005532585.novalocal sudo[22009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:59 np0005532585.novalocal python3[22011]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 07:17:59 np0005532585.novalocal sudo[22009]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:59 np0005532585.novalocal sudo[22029]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vansdpjfoqmhvmriyixfwwqxcmewrcwj ; /usr/bin/python3
Nov 23 07:17:59 np0005532585.novalocal sudo[22029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:59 np0005532585.novalocal python3[22031]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 07:17:59 np0005532585.novalocal sudo[22029]: pam_unix(sudo:session): session closed for user root
Nov 23 07:17:59 np0005532585.novalocal sudo[22049]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkgomsamhdjfcxdmxzgwzytclzfzvwkf ; /usr/bin/python3
Nov 23 07:17:59 np0005532585.novalocal sudo[22049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:17:59 np0005532585.novalocal python3[22051]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 07:17:59 np0005532585.novalocal sudo[22049]: pam_unix(sudo:session): session closed for user root
Nov 23 07:18:00 np0005532585.novalocal sudo[22069]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxbwnhqrgwtkicwrxxablprzcojziqoa ; /usr/bin/python3
Nov 23 07:18:00 np0005532585.novalocal sudo[22069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:18:01 np0005532585.novalocal python3[22071]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:18:01 np0005532585.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Nov 23 07:18:01 np0005532585.novalocal network[22074]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:01 np0005532585.novalocal network[22085]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:01 np0005532585.novalocal network[22074]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:01 np0005532585.novalocal network[22086]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:01 np0005532585.novalocal network[22074]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 07:18:01 np0005532585.novalocal network[22087]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 07:18:01 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882281.3081] audit: op="connections-reload" pid=22115 uid=0 result="success"
Nov 23 07:18:01 np0005532585.novalocal network[22074]: Bringing up loopback interface:  [  OK  ]
Nov 23 07:18:01 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882281.5033] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22203 uid=0 result="success"
Nov 23 07:18:01 np0005532585.novalocal network[22074]: Bringing up interface eth0:  [  OK  ]
Nov 23 07:18:01 np0005532585.novalocal systemd[1]: Started LSB: Bring up/down networking.
Nov 23 07:18:01 np0005532585.novalocal sudo[22069]: pam_unix(sudo:session): session closed for user root
Nov 23 07:18:01 np0005532585.novalocal sudo[22242]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfpyrfxnywbrwntslkedwbcmwlcspdng ; /usr/bin/python3
Nov 23 07:18:01 np0005532585.novalocal sudo[22242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:18:01 np0005532585.novalocal python3[22244]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Nov 23 07:18:02 np0005532585.novalocal chown[22248]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22253]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22253]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22253]: Starting ovsdb-server [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal ovs-vsctl[22302]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 23 07:18:02 np0005532585.novalocal ovs-vsctl[22322]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"26f986a7-6ac7-4ec2-887b-8da6da04a661\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22253]: Configuring Open vSwitch system IDs [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal ovs-vsctl[22328]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005532585.novalocal
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22253]: Enabling remote OVSDB managers [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Started Open vSwitch Database Unit.
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 23 07:18:02 np0005532585.novalocal kernel: openvswitch: Open vSwitch switching datapath
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22372]: Inserting openvswitch module [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22341]: Starting ovs-vswitchd [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal ovs-vsctl[22391]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005532585.novalocal
Nov 23 07:18:02 np0005532585.novalocal ovs-ctl[22341]: Enabling remote OVSDB managers [  OK  ]
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Starting Open vSwitch...
Nov 23 07:18:02 np0005532585.novalocal systemd[1]: Finished Open vSwitch.
Nov 23 07:18:02 np0005532585.novalocal sudo[22242]: pam_unix(sudo:session): session closed for user root
Nov 23 07:18:05 np0005532585.novalocal sudo[22407]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltvyjhyldlpmpgwgwzyighkxgbtpmajq ; /usr/bin/python3
Nov 23 07:18:05 np0005532585.novalocal sudo[22407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:18:05 np0005532585.novalocal python3[22409]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000001a-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:18:06 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882286.1673] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22567 uid=0 result="success"
Nov 23 07:18:06 np0005532585.novalocal ifup[22568]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:06 np0005532585.novalocal ifup[22569]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:06 np0005532585.novalocal ifup[22570]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:06 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882286.2017] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22576 uid=0 result="success"
Nov 23 07:18:06 np0005532585.novalocal ovs-vsctl[22578]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:15:fb:34 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Nov 23 07:18:06 np0005532585.novalocal kernel: device ovs-system entered promiscuous mode
Nov 23 07:18:06 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882286.2299] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Nov 23 07:18:06 np0005532585.novalocal systemd-udevd[22580]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 07:18:06 np0005532585.novalocal kernel: Timeout policy base is empty
Nov 23 07:18:06 np0005532585.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Nov 23 07:18:06 np0005532585.novalocal kernel: device br-ex entered promiscuous mode
Nov 23 07:18:06 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882286.2742] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Nov 23 07:18:06 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882286.2996] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22606 uid=0 result="success"
Nov 23 07:18:06 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882286.3199] device (br-ex): carrier: link connected
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.3735] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22635 uid=0 result="success"
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.4191] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22650 uid=0 result="success"
Nov 23 07:18:09 np0005532585.novalocal NET[22675]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.5084] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.5289] dhcp4 (eth1): canceled DHCP transaction
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.5289] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.5290] dhcp4 (eth1): state changed no lease
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.5331] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22684 uid=0 result="success"
Nov 23 07:18:09 np0005532585.novalocal ifup[22685]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:09 np0005532585.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 07:18:09 np0005532585.novalocal ifup[22686]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:09 np0005532585.novalocal ifup[22688]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:09 np0005532585.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.5702] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22701 uid=0 result="success"
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.6157] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22712 uid=0 result="success"
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.6227] device (eth1): carrier: link connected
Nov 23 07:18:09 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882289.6450] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22721 uid=0 result="success"
Nov 23 07:18:09 np0005532585.novalocal ipv6_wait_tentative[22733]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 23 07:18:10 np0005532585.novalocal ipv6_wait_tentative[22738]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.7151] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22747 uid=0 result="success"
Nov 23 07:18:11 np0005532585.novalocal ovs-vsctl[22762]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Nov 23 07:18:11 np0005532585.novalocal kernel: device eth1 entered promiscuous mode
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.7846] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22770 uid=0 result="success"
Nov 23 07:18:11 np0005532585.novalocal ifup[22771]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:11 np0005532585.novalocal ifup[22772]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:11 np0005532585.novalocal ifup[22773]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.8151] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22779 uid=0 result="success"
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.8554] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22789 uid=0 result="success"
Nov 23 07:18:11 np0005532585.novalocal ifup[22790]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:11 np0005532585.novalocal ifup[22791]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:11 np0005532585.novalocal ifup[22792]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.8850] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22798 uid=0 result="success"
Nov 23 07:18:11 np0005532585.novalocal ovs-vsctl[22801]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 23 07:18:11 np0005532585.novalocal kernel: device vlan20 entered promiscuous mode
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.9226] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Nov 23 07:18:11 np0005532585.novalocal systemd-udevd[22803]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.9480] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22812 uid=0 result="success"
Nov 23 07:18:11 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882291.9694] device (vlan20): carrier: link connected
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.0147] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22841 uid=0 result="success"
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.0544] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22856 uid=0 result="success"
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.1096] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22877 uid=0 result="success"
Nov 23 07:18:15 np0005532585.novalocal ifup[22878]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:15 np0005532585.novalocal ifup[22879]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:15 np0005532585.novalocal ifup[22880]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.1395] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22886 uid=0 result="success"
Nov 23 07:18:15 np0005532585.novalocal ovs-vsctl[22889]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 23 07:18:15 np0005532585.novalocal kernel: device vlan44 entered promiscuous mode
Nov 23 07:18:15 np0005532585.novalocal systemd-udevd[22891]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.1805] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.2087] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22901 uid=0 result="success"
Nov 23 07:18:15 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882295.2298] device (vlan44): carrier: link connected
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.2785] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22931 uid=0 result="success"
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.3270] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22946 uid=0 result="success"
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.3807] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22967 uid=0 result="success"
Nov 23 07:18:18 np0005532585.novalocal ifup[22968]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:18 np0005532585.novalocal ifup[22969]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:18 np0005532585.novalocal ifup[22970]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.4124] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22976 uid=0 result="success"
Nov 23 07:18:18 np0005532585.novalocal ovs-vsctl[22979]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 23 07:18:18 np0005532585.novalocal systemd-udevd[22981]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 07:18:18 np0005532585.novalocal kernel: device vlan22 entered promiscuous mode
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.4481] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.4727] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22991 uid=0 result="success"
Nov 23 07:18:18 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882298.4887] device (vlan22): carrier: link connected
Nov 23 07:18:19 np0005532585.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.5420] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23021 uid=0 result="success"
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.5885] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23036 uid=0 result="success"
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.6381] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23057 uid=0 result="success"
Nov 23 07:18:21 np0005532585.novalocal ifup[23058]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:21 np0005532585.novalocal ifup[23059]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:21 np0005532585.novalocal ifup[23060]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.6655] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23066 uid=0 result="success"
Nov 23 07:18:21 np0005532585.novalocal ovs-vsctl[23069]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 23 07:18:21 np0005532585.novalocal kernel: device vlan23 entered promiscuous mode
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.7311] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Nov 23 07:18:21 np0005532585.novalocal systemd-udevd[23071]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.7546] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23081 uid=0 result="success"
Nov 23 07:18:21 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882301.7738] device (vlan23): carrier: link connected
Nov 23 07:18:24 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882304.8230] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23111 uid=0 result="success"
Nov 23 07:18:24 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882304.8706] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23126 uid=0 result="success"
Nov 23 07:18:24 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882304.9294] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23147 uid=0 result="success"
Nov 23 07:18:24 np0005532585.novalocal ifup[23148]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:24 np0005532585.novalocal ifup[23149]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:24 np0005532585.novalocal ifup[23150]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:24 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882304.9614] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23156 uid=0 result="success"
Nov 23 07:18:24 np0005532585.novalocal ovs-vsctl[23159]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 23 07:18:25 np0005532585.novalocal kernel: device vlan21 entered promiscuous mode
Nov 23 07:18:25 np0005532585.novalocal systemd-udevd[23161]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 07:18:25 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882305.0011] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Nov 23 07:18:25 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882305.0270] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23171 uid=0 result="success"
Nov 23 07:18:25 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882305.0481] device (vlan21): carrier: link connected
Nov 23 07:18:28 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882308.1031] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23201 uid=0 result="success"
Nov 23 07:18:28 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882308.1430] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23216 uid=0 result="success"
Nov 23 07:18:28 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882308.1909] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23237 uid=0 result="success"
Nov 23 07:18:28 np0005532585.novalocal ifup[23238]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:28 np0005532585.novalocal ifup[23239]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:28 np0005532585.novalocal ifup[23240]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:28 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882308.2153] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23246 uid=0 result="success"
Nov 23 07:18:28 np0005532585.novalocal ovs-vsctl[23249]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 23 07:18:28 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882308.2955] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23256 uid=0 result="success"
Nov 23 07:18:29 np0005532585.novalocal sshd[23274]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:18:29 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882309.3565] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23285 uid=0 result="success"
Nov 23 07:18:29 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882309.4025] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23300 uid=0 result="success"
Nov 23 07:18:29 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882309.4593] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23321 uid=0 result="success"
Nov 23 07:18:29 np0005532585.novalocal ifup[23322]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:29 np0005532585.novalocal ifup[23323]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:29 np0005532585.novalocal ifup[23324]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:29 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882309.4912] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23330 uid=0 result="success"
Nov 23 07:18:29 np0005532585.novalocal ovs-vsctl[23333]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 23 07:18:29 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882309.5494] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23340 uid=0 result="success"
Nov 23 07:18:30 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882310.6114] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23368 uid=0 result="success"
Nov 23 07:18:30 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882310.6611] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23383 uid=0 result="success"
Nov 23 07:18:30 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882310.7207] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23404 uid=0 result="success"
Nov 23 07:18:30 np0005532585.novalocal ifup[23405]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:30 np0005532585.novalocal ifup[23406]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:30 np0005532585.novalocal ifup[23407]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:30 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882310.7540] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23413 uid=0 result="success"
Nov 23 07:18:30 np0005532585.novalocal ovs-vsctl[23416]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 23 07:18:30 np0005532585.novalocal sshd[23274]: Invalid user p from 34.124.148.87 port 37314
Nov 23 07:18:30 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882310.8120] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23423 uid=0 result="success"
Nov 23 07:18:31 np0005532585.novalocal sshd[23274]: Received disconnect from 34.124.148.87 port 37314:11: Bye Bye [preauth]
Nov 23 07:18:31 np0005532585.novalocal sshd[23274]: Disconnected from invalid user p 34.124.148.87 port 37314 [preauth]
Nov 23 07:18:31 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882311.8707] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23451 uid=0 result="success"
Nov 23 07:18:31 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882311.9164] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23466 uid=0 result="success"
Nov 23 07:18:31 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882311.9742] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23487 uid=0 result="success"
Nov 23 07:18:31 np0005532585.novalocal ifup[23488]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:31 np0005532585.novalocal ifup[23489]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:31 np0005532585.novalocal ifup[23490]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:32 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882312.0064] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23496 uid=0 result="success"
Nov 23 07:18:32 np0005532585.novalocal ovs-vsctl[23499]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 23 07:18:32 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882312.0986] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23506 uid=0 result="success"
Nov 23 07:18:33 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882313.1557] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23534 uid=0 result="success"
Nov 23 07:18:33 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882313.2009] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23549 uid=0 result="success"
Nov 23 07:18:33 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882313.2567] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23570 uid=0 result="success"
Nov 23 07:18:33 np0005532585.novalocal ifup[23571]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 07:18:33 np0005532585.novalocal ifup[23572]: 'network-scripts' will be removed from distribution in near future.
Nov 23 07:18:33 np0005532585.novalocal ifup[23573]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 23 07:18:33 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882313.2875] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23579 uid=0 result="success"
Nov 23 07:18:33 np0005532585.novalocal ovs-vsctl[23582]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 23 07:18:33 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882313.3785] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23589 uid=0 result="success"
Nov 23 07:18:34 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882314.4336] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23617 uid=0 result="success"
Nov 23 07:18:34 np0005532585.novalocal NetworkManager[5975]: <info>  [1763882314.4813] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23632 uid=0 result="success"
Nov 23 07:18:34 np0005532585.novalocal sudo[22407]: pam_unix(sudo:session): session closed for user root
Nov 23 07:18:49 np0005532585.novalocal sshd[23650]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:18:50 np0005532585.novalocal sshd[23650]: Invalid user itsupport from 187.72.57.82 port 52876
Nov 23 07:18:51 np0005532585.novalocal sshd[23650]: Received disconnect from 187.72.57.82 port 52876:11: Bye Bye [preauth]
Nov 23 07:18:51 np0005532585.novalocal sshd[23650]: Disconnected from invalid user itsupport 187.72.57.82 port 52876 [preauth]
Nov 23 07:19:27 np0005532585.novalocal python3[23667]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000001b-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:19:33 np0005532585.novalocal python3[23686]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 07:19:33 np0005532585.novalocal sudo[23700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgnoqrzanchhkjdsanbvusodlvtyjfsi ; /usr/bin/python3
Nov 23 07:19:33 np0005532585.novalocal sudo[23700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:19:33 np0005532585.novalocal python3[23702]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 07:19:33 np0005532585.novalocal sudo[23700]: pam_unix(sudo:session): session closed for user root
Nov 23 07:19:35 np0005532585.novalocal python3[23716]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 07:19:35 np0005532585.novalocal sudo[23730]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onrpgykjbwuqcbnoarbuvwescbrjavqt ; /usr/bin/python3
Nov 23 07:19:35 np0005532585.novalocal sudo[23730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:19:35 np0005532585.novalocal python3[23732]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 07:19:35 np0005532585.novalocal sudo[23730]: pam_unix(sudo:session): session closed for user root
Nov 23 07:19:36 np0005532585.novalocal python3[23746]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Nov 23 07:19:37 np0005532585.novalocal python3[23761]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005532585.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000022-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:19:38 np0005532585.novalocal sudo[23779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzuyctdjujyiczcdnernlgmrdjsggkai ; /usr/bin/python3
Nov 23 07:19:38 np0005532585.novalocal sudo[23779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:19:38 np0005532585.novalocal python3[23781]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:19:38 np0005532585.novalocal systemd[1]: Starting Hostname Service...
Nov 23 07:19:38 np0005532585.novalocal systemd[1]: Started Hostname Service.
Nov 23 07:19:38 np0005532585.localdomain systemd-hostnamed[23785]: Hostname set to <np0005532585.localdomain> (static)
Nov 23 07:19:38 np0005532585.localdomain NetworkManager[5975]: <info>  [1763882378.6039] hostname: static hostname changed from "np0005532585.novalocal" to "np0005532585.localdomain"
Nov 23 07:19:38 np0005532585.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 07:19:38 np0005532585.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 07:19:38 np0005532585.localdomain sudo[23779]: pam_unix(sudo:session): session closed for user root
Nov 23 07:19:40 np0005532585.localdomain sshd[19098]: pam_unix(sshd:session): session closed for user zuul
Nov 23 07:19:40 np0005532585.localdomain systemd-logind[761]: Session 10 logged out. Waiting for processes to exit.
Nov 23 07:19:40 np0005532585.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Nov 23 07:19:40 np0005532585.localdomain systemd[1]: session-10.scope: Consumed 1min 43.396s CPU time.
Nov 23 07:19:40 np0005532585.localdomain systemd-logind[761]: Removed session 10.
Nov 23 07:19:42 np0005532585.localdomain sshd[23796]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:19:43 np0005532585.localdomain sshd[23796]: Accepted publickey for zuul from 38.102.83.114 port 59196 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:19:43 np0005532585.localdomain systemd-logind[761]: New session 11 of user zuul.
Nov 23 07:19:43 np0005532585.localdomain systemd[1]: Started Session 11 of User zuul.
Nov 23 07:19:43 np0005532585.localdomain sshd[23796]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:19:43 np0005532585.localdomain python3[23813]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 23 07:19:45 np0005532585.localdomain sshd[23796]: pam_unix(sshd:session): session closed for user zuul
Nov 23 07:19:45 np0005532585.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Nov 23 07:19:45 np0005532585.localdomain systemd-logind[761]: Session 11 logged out. Waiting for processes to exit.
Nov 23 07:19:45 np0005532585.localdomain systemd-logind[761]: Removed session 11.
Nov 23 07:19:48 np0005532585.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 07:20:08 np0005532585.localdomain sshd[23815]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:20:08 np0005532585.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 07:20:11 np0005532585.localdomain sshd[23815]: Received disconnect from 34.124.148.87 port 46092:11: Bye Bye [preauth]
Nov 23 07:20:11 np0005532585.localdomain sshd[23815]: Disconnected from authenticating user root 34.124.148.87 port 46092 [preauth]
Nov 23 07:20:23 np0005532585.localdomain sshd[23820]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:20:24 np0005532585.localdomain sshd[23820]: Invalid user felix from 187.72.57.82 port 38439
Nov 23 07:20:24 np0005532585.localdomain sshd[23820]: Received disconnect from 187.72.57.82 port 38439:11: Bye Bye [preauth]
Nov 23 07:20:24 np0005532585.localdomain sshd[23820]: Disconnected from invalid user felix 187.72.57.82 port 38439 [preauth]
Nov 23 07:20:31 np0005532585.localdomain sshd[23822]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:20:32 np0005532585.localdomain sshd[23822]: Accepted publickey for zuul from 38.102.83.114 port 54134 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:20:32 np0005532585.localdomain systemd-logind[761]: New session 12 of user zuul.
Nov 23 07:20:32 np0005532585.localdomain systemd[1]: Started Session 12 of User zuul.
Nov 23 07:20:32 np0005532585.localdomain sshd[23822]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:20:32 np0005532585.localdomain sudo[23839]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-astougkbtedqixpuillwwsvsztwzghek ; /usr/bin/python3
Nov 23 07:20:32 np0005532585.localdomain sudo[23839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:20:32 np0005532585.localdomain python3[23841]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:20:36 np0005532585.localdomain systemd-rc-local-generator[23879]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:20:36 np0005532585.localdomain systemd-sysv-generator[23884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:20:36 np0005532585.localdomain systemd-rc-local-generator[23923]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:20:36 np0005532585.localdomain systemd-sysv-generator[23926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:20:36 np0005532585.localdomain systemd-sysv-generator[23969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:20:36 np0005532585.localdomain systemd-rc-local-generator[23966]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:20:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:20:37 np0005532585.localdomain systemd-sysv-generator[24015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:20:37 np0005532585.localdomain systemd-rc-local-generator[24012]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: run-rf59b1884ed1f41a0bec64077d7b2621b.service: Deactivated successfully.
Nov 23 07:20:37 np0005532585.localdomain systemd[1]: run-r97bfd8361b184554866928a6d999f465.service: Deactivated successfully.
Nov 23 07:20:38 np0005532585.localdomain sudo[23839]: pam_unix(sudo:session): session closed for user root
Nov 23 07:21:38 np0005532585.localdomain sshd[23825]: Received disconnect from 38.102.83.114 port 54134:11: disconnected by user
Nov 23 07:21:38 np0005532585.localdomain sshd[23825]: Disconnected from user zuul 38.102.83.114 port 54134
Nov 23 07:21:38 np0005532585.localdomain sshd[23822]: pam_unix(sshd:session): session closed for user zuul
Nov 23 07:21:38 np0005532585.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Nov 23 07:21:38 np0005532585.localdomain systemd[1]: session-12.scope: Consumed 4.652s CPU time.
Nov 23 07:21:38 np0005532585.localdomain systemd-logind[761]: Session 12 logged out. Waiting for processes to exit.
Nov 23 07:21:38 np0005532585.localdomain systemd-logind[761]: Removed session 12.
Nov 23 07:21:49 np0005532585.localdomain sshd[24615]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:21:50 np0005532585.localdomain sshd[24615]: Received disconnect from 34.124.148.87 port 46210:11: Bye Bye [preauth]
Nov 23 07:21:50 np0005532585.localdomain sshd[24615]: Disconnected from authenticating user root 34.124.148.87 port 46210 [preauth]
Nov 23 07:21:58 np0005532585.localdomain sshd[24617]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:21:59 np0005532585.localdomain sshd[24617]: Invalid user mohammad from 187.72.57.82 port 52239
Nov 23 07:21:59 np0005532585.localdomain sshd[24617]: Received disconnect from 187.72.57.82 port 52239:11: Bye Bye [preauth]
Nov 23 07:21:59 np0005532585.localdomain sshd[24617]: Disconnected from invalid user mohammad 187.72.57.82 port 52239 [preauth]
Nov 23 07:23:29 np0005532585.localdomain sshd[24619]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:23:32 np0005532585.localdomain sshd[24619]: Invalid user xt from 34.124.148.87 port 57876
Nov 23 07:23:32 np0005532585.localdomain sshd[24619]: Received disconnect from 34.124.148.87 port 57876:11: Bye Bye [preauth]
Nov 23 07:23:32 np0005532585.localdomain sshd[24619]: Disconnected from invalid user xt 34.124.148.87 port 57876 [preauth]
Nov 23 07:23:37 np0005532585.localdomain sshd[24621]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:23:38 np0005532585.localdomain sshd[24621]: Received disconnect from 187.72.57.82 port 37804:11: Bye Bye [preauth]
Nov 23 07:23:38 np0005532585.localdomain sshd[24621]: Disconnected from authenticating user root 187.72.57.82 port 37804 [preauth]
Nov 23 07:25:13 np0005532585.localdomain sshd[24624]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:25:14 np0005532585.localdomain sshd[24626]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:25:15 np0005532585.localdomain sshd[24624]: Received disconnect from 34.124.148.87 port 37212:11: Bye Bye [preauth]
Nov 23 07:25:15 np0005532585.localdomain sshd[24624]: Disconnected from authenticating user root 34.124.148.87 port 37212 [preauth]
Nov 23 07:25:15 np0005532585.localdomain sshd[24626]: Invalid user apolo from 187.72.57.82 port 51610
Nov 23 07:25:16 np0005532585.localdomain sshd[24626]: Received disconnect from 187.72.57.82 port 51610:11: Bye Bye [preauth]
Nov 23 07:25:16 np0005532585.localdomain sshd[24626]: Disconnected from invalid user apolo 187.72.57.82 port 51610 [preauth]
Nov 23 07:25:30 np0005532585.localdomain sshd[24628]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:25:31 np0005532585.localdomain sshd[24628]: Invalid user vpn from 185.156.73.233 port 31066
Nov 23 07:25:31 np0005532585.localdomain sshd[24628]: Connection closed by invalid user vpn 185.156.73.233 port 31066 [preauth]
Nov 23 07:26:50 np0005532585.localdomain sshd[24630]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:26:51 np0005532585.localdomain sshd[24630]: Invalid user user1 from 187.72.57.82 port 37176
Nov 23 07:26:51 np0005532585.localdomain sshd[24630]: Received disconnect from 187.72.57.82 port 37176:11: Bye Bye [preauth]
Nov 23 07:26:51 np0005532585.localdomain sshd[24630]: Disconnected from invalid user user1 187.72.57.82 port 37176 [preauth]
Nov 23 07:26:55 np0005532585.localdomain sshd[24632]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:26:57 np0005532585.localdomain sshd[24632]: Invalid user ec2-user from 34.124.148.87 port 39810
Nov 23 07:26:57 np0005532585.localdomain sshd[24632]: Received disconnect from 34.124.148.87 port 39810:11: Bye Bye [preauth]
Nov 23 07:26:57 np0005532585.localdomain sshd[24632]: Disconnected from invalid user ec2-user 34.124.148.87 port 39810 [preauth]
Nov 23 07:27:01 np0005532585.localdomain anacron[19004]: Job `cron.weekly' started
Nov 23 07:27:01 np0005532585.localdomain anacron[19004]: Job `cron.weekly' terminated
Nov 23 07:28:23 np0005532585.localdomain sshd[24636]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:28:24 np0005532585.localdomain sshd[24636]: Invalid user ftpuser from 187.72.57.82 port 50973
Nov 23 07:28:24 np0005532585.localdomain sshd[24636]: Received disconnect from 187.72.57.82 port 50973:11: Bye Bye [preauth]
Nov 23 07:28:24 np0005532585.localdomain sshd[24636]: Disconnected from invalid user ftpuser 187.72.57.82 port 50973 [preauth]
Nov 23 07:28:40 np0005532585.localdomain sshd[24638]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:28:42 np0005532585.localdomain sshd[24638]: Received disconnect from 34.124.148.87 port 50824:11: Bye Bye [preauth]
Nov 23 07:28:42 np0005532585.localdomain sshd[24638]: Disconnected from authenticating user root 34.124.148.87 port 50824 [preauth]
Nov 23 07:30:02 np0005532585.localdomain sshd[24642]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:30:03 np0005532585.localdomain sshd[24642]: Invalid user oracle from 187.72.57.82 port 36543
Nov 23 07:30:03 np0005532585.localdomain sshd[24642]: Received disconnect from 187.72.57.82 port 36543:11: Bye Bye [preauth]
Nov 23 07:30:03 np0005532585.localdomain sshd[24642]: Disconnected from invalid user oracle 187.72.57.82 port 36543 [preauth]
Nov 23 07:30:24 np0005532585.localdomain sshd[24645]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:30:26 np0005532585.localdomain sshd[24645]: Received disconnect from 34.124.148.87 port 58200:11: Bye Bye [preauth]
Nov 23 07:30:26 np0005532585.localdomain sshd[24645]: Disconnected from authenticating user root 34.124.148.87 port 58200 [preauth]
Nov 23 07:31:14 np0005532585.localdomain sshd[24647]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:31:18 np0005532585.localdomain sshd[24647]: Invalid user config from 59.10.56.130 port 46177
Nov 23 07:31:19 np0005532585.localdomain sshd[24647]: Connection closed by invalid user config 59.10.56.130 port 46177 [preauth]
Nov 23 07:31:41 np0005532585.localdomain sshd[24649]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:31:42 np0005532585.localdomain sshd[24649]: Received disconnect from 187.72.57.82 port 50340:11: Bye Bye [preauth]
Nov 23 07:31:42 np0005532585.localdomain sshd[24649]: Disconnected from authenticating user root 187.72.57.82 port 50340 [preauth]
Nov 23 07:32:06 np0005532585.localdomain sshd[24651]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:32:08 np0005532585.localdomain sshd[24651]: Received disconnect from 34.124.148.87 port 52070:11: Bye Bye [preauth]
Nov 23 07:32:08 np0005532585.localdomain sshd[24651]: Disconnected from authenticating user root 34.124.148.87 port 52070 [preauth]
Nov 23 07:33:15 np0005532585.localdomain sshd[24653]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:33:16 np0005532585.localdomain sshd[24653]: Invalid user ftpuser from 187.72.57.82 port 35903
Nov 23 07:33:16 np0005532585.localdomain sshd[24653]: Received disconnect from 187.72.57.82 port 35903:11: Bye Bye [preauth]
Nov 23 07:33:16 np0005532585.localdomain sshd[24653]: Disconnected from invalid user ftpuser 187.72.57.82 port 35903 [preauth]
Nov 23 07:33:39 np0005532585.localdomain sshd[24655]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:33:40 np0005532585.localdomain sshd[24655]: Received disconnect from 34.124.148.87 port 51972:11: Bye Bye [preauth]
Nov 23 07:33:40 np0005532585.localdomain sshd[24655]: Disconnected from authenticating user root 34.124.148.87 port 51972 [preauth]
Nov 23 07:34:49 np0005532585.localdomain sshd[24657]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:34:50 np0005532585.localdomain sshd[24657]: Received disconnect from 187.72.57.82 port 49701:11: Bye Bye [preauth]
Nov 23 07:34:50 np0005532585.localdomain sshd[24657]: Disconnected from authenticating user root 187.72.57.82 port 49701 [preauth]
Nov 23 07:35:12 np0005532585.localdomain sshd[24661]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:35:13 np0005532585.localdomain sshd[24661]: Received disconnect from 34.124.148.87 port 59204:11: Bye Bye [preauth]
Nov 23 07:35:13 np0005532585.localdomain sshd[24661]: Disconnected from authenticating user root 34.124.148.87 port 59204 [preauth]
Nov 23 07:36:24 np0005532585.localdomain sshd[24664]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:36:25 np0005532585.localdomain sshd[24664]: Received disconnect from 187.72.57.82 port 35264:11: Bye Bye [preauth]
Nov 23 07:36:25 np0005532585.localdomain sshd[24664]: Disconnected from authenticating user root 187.72.57.82 port 35264 [preauth]
Nov 23 07:36:38 np0005532585.localdomain sshd[24666]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:36:40 np0005532585.localdomain sshd[24666]: Invalid user ubuntu from 185.156.73.233 port 37756
Nov 23 07:36:40 np0005532585.localdomain sshd[24666]: Connection closed by invalid user ubuntu 185.156.73.233 port 37756 [preauth]
Nov 23 07:36:50 np0005532585.localdomain sshd[24668]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:36:51 np0005532585.localdomain sshd[24668]: Invalid user admin from 34.124.148.87 port 33928
Nov 23 07:36:51 np0005532585.localdomain sshd[24668]: Received disconnect from 34.124.148.87 port 33928:11: Bye Bye [preauth]
Nov 23 07:36:51 np0005532585.localdomain sshd[24668]: Disconnected from invalid user admin 34.124.148.87 port 33928 [preauth]
Nov 23 07:37:15 np0005532585.localdomain sshd[24670]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:37:15 np0005532585.localdomain sshd[24670]: Accepted publickey for zuul from 192.168.122.100 port 35880 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:37:15 np0005532585.localdomain systemd-logind[761]: New session 13 of user zuul.
Nov 23 07:37:15 np0005532585.localdomain systemd[1]: Started Session 13 of User zuul.
Nov 23 07:37:15 np0005532585.localdomain sshd[24670]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:37:15 np0005532585.localdomain sudo[24716]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akenxqdeqhkpfmvqjjahusvrzyuyunxa ; /usr/bin/python3
Nov 23 07:37:15 np0005532585.localdomain sudo[24716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:15 np0005532585.localdomain python3[24718]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 07:37:16 np0005532585.localdomain sudo[24716]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:17 np0005532585.localdomain sudo[24803]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqijjgzmhtoetdonsjgnaycpskfrhjcl ; /usr/bin/python3
Nov 23 07:37:17 np0005532585.localdomain sudo[24803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:17 np0005532585.localdomain python3[24805]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:37:20 np0005532585.localdomain sudo[24803]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:20 np0005532585.localdomain sudo[24820]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbbiayusjhhcpkkfyrogacsclocnjuox ; /usr/bin/python3
Nov 23 07:37:20 np0005532585.localdomain sudo[24820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:20 np0005532585.localdomain python3[24822]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:37:20 np0005532585.localdomain sudo[24820]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:21 np0005532585.localdomain sudo[24836]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdaebytqhdjfxkwjwwnqorgepzncwwgu ; /usr/bin/python3
Nov 23 07:37:21 np0005532585.localdomain sudo[24836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:21 np0005532585.localdomain python3[24838]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:21 np0005532585.localdomain kernel: loop: module loaded
Nov 23 07:37:21 np0005532585.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Nov 23 07:37:21 np0005532585.localdomain sudo[24836]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:21 np0005532585.localdomain sudo[24861]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqhasfwfmfqampgbcgkfxhrsovsbetwy ; /usr/bin/python3
Nov 23 07:37:21 np0005532585.localdomain sudo[24861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:21 np0005532585.localdomain python3[24863]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:21 np0005532585.localdomain lvm[24866]: PV /dev/loop3 not used.
Nov 23 07:37:21 np0005532585.localdomain lvm[24868]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 07:37:21 np0005532585.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 23 07:37:21 np0005532585.localdomain lvm[24878]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 07:37:21 np0005532585.localdomain lvm[24878]: VG ceph_vg0 finished
Nov 23 07:37:21 np0005532585.localdomain lvm[24877]:   1 logical volume(s) in volume group "ceph_vg0" now active
Nov 23 07:37:22 np0005532585.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 23 07:37:22 np0005532585.localdomain sudo[24861]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:22 np0005532585.localdomain sudo[24924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrixhyxirzvsbpgomzdbvnjcnpkwmotl ; /usr/bin/python3
Nov 23 07:37:22 np0005532585.localdomain sudo[24924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:22 np0005532585.localdomain python3[24926]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:37:22 np0005532585.localdomain sudo[24924]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:22 np0005532585.localdomain sudo[24967]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqfrigtfwjfeqfgpkxkequlsyyjbwxpg ; /usr/bin/python3
Nov 23 07:37:22 np0005532585.localdomain sudo[24967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:23 np0005532585.localdomain python3[24969]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883442.2949338-55245-19214359527875/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:23 np0005532585.localdomain sudo[24967]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:23 np0005532585.localdomain sudo[24997]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieluoyhrjltcwvlgzipzcnebsymdxact ; /usr/bin/python3
Nov 23 07:37:23 np0005532585.localdomain sudo[24997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:23 np0005532585.localdomain python3[24999]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:37:23 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:37:24 np0005532585.localdomain systemd-rc-local-generator[25023]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:37:24 np0005532585.localdomain systemd-sysv-generator[25027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:37:24 np0005532585.localdomain systemd[1]: Starting Ceph OSD losetup...
Nov 23 07:37:24 np0005532585.localdomain bash[25039]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Nov 23 07:37:24 np0005532585.localdomain systemd[1]: Finished Ceph OSD losetup.
Nov 23 07:37:24 np0005532585.localdomain sudo[24997]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:24 np0005532585.localdomain lvm[25040]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 07:37:24 np0005532585.localdomain lvm[25040]: VG ceph_vg0 finished
Nov 23 07:37:24 np0005532585.localdomain sudo[25055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npxkkelsdrexzvhiyjkvfekdfxtelivb ; /usr/bin/python3
Nov 23 07:37:24 np0005532585.localdomain sudo[25055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:24 np0005532585.localdomain python3[25057]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:37:27 np0005532585.localdomain sudo[25055]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:27 np0005532585.localdomain sudo[25072]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejpuveiczvttflprhhjfqzxdyvmzjjvr ; /usr/bin/python3
Nov 23 07:37:27 np0005532585.localdomain sudo[25072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:27 np0005532585.localdomain python3[25074]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:37:27 np0005532585.localdomain sudo[25072]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:28 np0005532585.localdomain sudo[25088]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiaxulyetmfgaqfvqofzjgsjsjvbqcps ; /usr/bin/python3
Nov 23 07:37:28 np0005532585.localdomain sudo[25088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:28 np0005532585.localdomain python3[25090]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:28 np0005532585.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Nov 23 07:37:28 np0005532585.localdomain sudo[25088]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:28 np0005532585.localdomain sudo[25110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkjnbwdpnbrvtsgscuktlkmsvvlctulo ; /usr/bin/python3
Nov 23 07:37:28 np0005532585.localdomain sudo[25110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:28 np0005532585.localdomain python3[25112]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:28 np0005532585.localdomain lvm[25115]: PV /dev/loop4 not used.
Nov 23 07:37:29 np0005532585.localdomain lvm[25125]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 07:37:29 np0005532585.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Nov 23 07:37:29 np0005532585.localdomain sudo[25110]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:29 np0005532585.localdomain lvm[25127]:   1 logical volume(s) in volume group "ceph_vg1" now active
Nov 23 07:37:29 np0005532585.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Nov 23 07:37:29 np0005532585.localdomain sudo[25173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpjjklfrtjcijyfkodnzpwsyxhtcvdbl ; /usr/bin/python3
Nov 23 07:37:29 np0005532585.localdomain sudo[25173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:29 np0005532585.localdomain python3[25175]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:37:29 np0005532585.localdomain sudo[25173]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:29 np0005532585.localdomain sudo[25216]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alvvxtndioivudeqhfeywamxhjqihkrp ; /usr/bin/python3
Nov 23 07:37:29 np0005532585.localdomain sudo[25216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:30 np0005532585.localdomain python3[25218]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883449.3683252-55449-238753956326735/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:30 np0005532585.localdomain sudo[25216]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:30 np0005532585.localdomain sudo[25246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aquuxlwhgrdkzdbzolwzyiqnbyrqwqzl ; /usr/bin/python3
Nov 23 07:37:30 np0005532585.localdomain sudo[25246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:30 np0005532585.localdomain python3[25248]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:37:30 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:37:30 np0005532585.localdomain systemd-sysv-generator[25274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:37:30 np0005532585.localdomain systemd-rc-local-generator[25271]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:37:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:37:31 np0005532585.localdomain systemd[1]: Starting Ceph OSD losetup...
Nov 23 07:37:31 np0005532585.localdomain bash[25289]: /dev/loop4: [64516]:8399529 (/var/lib/ceph-osd-1.img)
Nov 23 07:37:31 np0005532585.localdomain systemd[1]: Finished Ceph OSD losetup.
Nov 23 07:37:31 np0005532585.localdomain lvm[25290]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 07:37:31 np0005532585.localdomain lvm[25290]: VG ceph_vg1 finished
Nov 23 07:37:31 np0005532585.localdomain sudo[25246]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:40 np0005532585.localdomain sudo[25333]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cntyavwttyozuietweggkvepqzxyyahg ; /usr/bin/python3
Nov 23 07:37:40 np0005532585.localdomain sudo[25333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:40 np0005532585.localdomain python3[25335]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 07:37:40 np0005532585.localdomain sudo[25333]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:41 np0005532585.localdomain sudo[25353]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjthonieednjzvuktwqjrxofxjajxpxf ; /usr/bin/python3
Nov 23 07:37:41 np0005532585.localdomain sudo[25353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:41 np0005532585.localdomain python3[25355]: ansible-hostname Invoked with name=np0005532585.localdomain use=None
Nov 23 07:37:41 np0005532585.localdomain systemd[1]: Starting Hostname Service...
Nov 23 07:37:42 np0005532585.localdomain systemd[1]: Started Hostname Service.
Nov 23 07:37:42 np0005532585.localdomain sudo[25353]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:44 np0005532585.localdomain sudo[25376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xletccaostvmojodgdigxbencddqpufs ; /usr/bin/python3
Nov 23 07:37:44 np0005532585.localdomain sudo[25376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:44 np0005532585.localdomain python3[25378]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 23 07:37:44 np0005532585.localdomain sudo[25376]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:44 np0005532585.localdomain sudo[25424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhsmrwhwtrfontwggeakoltrzpbxzhaj ; /usr/bin/python3
Nov 23 07:37:44 np0005532585.localdomain sudo[25424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:44 np0005532585.localdomain python3[25426]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.kt75qvnutmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:44 np0005532585.localdomain sudo[25424]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:45 np0005532585.localdomain sudo[25454]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjebjbltkbemhgmyeikbcopbikvonbxc ; /usr/bin/python3
Nov 23 07:37:45 np0005532585.localdomain sudo[25454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:45 np0005532585.localdomain python3[25456]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.kt75qvnutmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:45 np0005532585.localdomain sudo[25454]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:45 np0005532585.localdomain sudo[25470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emyozuekqamnxkwdvznwfrkaaswdjeir ; /usr/bin/python3
Nov 23 07:37:45 np0005532585.localdomain sudo[25470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:45 np0005532585.localdomain python3[25472]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.kt75qvnutmphosts insertbefore=BOF block=192.168.122.106 np0005532584.localdomain np0005532584
                                                         192.168.122.106 np0005532584.ctlplane.localdomain np0005532584.ctlplane
                                                         192.168.122.107 np0005532585.localdomain np0005532585
                                                         192.168.122.107 np0005532585.ctlplane.localdomain np0005532585.ctlplane
                                                         192.168.122.108 np0005532586.localdomain np0005532586
                                                         192.168.122.108 np0005532586.ctlplane.localdomain np0005532586.ctlplane
                                                         192.168.122.103 np0005532581.localdomain np0005532581
                                                         192.168.122.103 np0005532581.ctlplane.localdomain np0005532581.ctlplane
                                                         192.168.122.104 np0005532582.localdomain np0005532582
                                                         192.168.122.104 np0005532582.ctlplane.localdomain np0005532582.ctlplane
                                                         192.168.122.105 np0005532583.localdomain np0005532583
                                                         192.168.122.105 np0005532583.ctlplane.localdomain np0005532583.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:45 np0005532585.localdomain sudo[25470]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:46 np0005532585.localdomain sudo[25486]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lviltqfzsgxxhrcavbuwdhxorebsoedm ; /usr/bin/python3
Nov 23 07:37:46 np0005532585.localdomain sudo[25486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:46 np0005532585.localdomain python3[25488]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.kt75qvnutmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:46 np0005532585.localdomain sudo[25486]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:46 np0005532585.localdomain sudo[25503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieoslimdorgdqstasulejbokyoxgqwao ; /usr/bin/python3
Nov 23 07:37:46 np0005532585.localdomain sudo[25503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:46 np0005532585.localdomain python3[25505]: ansible-file Invoked with path=/tmp/ansible.kt75qvnutmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:46 np0005532585.localdomain sudo[25503]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:48 np0005532585.localdomain sudo[25519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viadpcjjprmnanmukfjgmqehhdynxqjo ; /usr/bin/python3
Nov 23 07:37:48 np0005532585.localdomain sudo[25519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:48 np0005532585.localdomain python3[25521]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:48 np0005532585.localdomain sudo[25519]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:50 np0005532585.localdomain sudo[25537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xobbakqcskhzvoxrgfdnkfynmvlokapx ; /usr/bin/python3
Nov 23 07:37:50 np0005532585.localdomain sudo[25537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:50 np0005532585.localdomain python3[25539]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:37:53 np0005532585.localdomain sudo[25537]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:54 np0005532585.localdomain sudo[25586]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nluacotqkywkpstgdmetibuymfmpferr ; /usr/bin/python3
Nov 23 07:37:54 np0005532585.localdomain sudo[25586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:54 np0005532585.localdomain python3[25588]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:37:54 np0005532585.localdomain sudo[25586]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:54 np0005532585.localdomain sudo[25631]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqcradvlufizagbcdgsqqaxlekybbrev ; /usr/bin/python3
Nov 23 07:37:54 np0005532585.localdomain sudo[25631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:54 np0005532585.localdomain python3[25633]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883474.032404-56296-69872870508002/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:55 np0005532585.localdomain sudo[25631]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:56 np0005532585.localdomain sudo[25661]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucvqwpcqbnwydcdfbjfaxpbdskfqwzma ; /usr/bin/python3
Nov 23 07:37:56 np0005532585.localdomain sudo[25661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:56 np0005532585.localdomain python3[25663]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:37:56 np0005532585.localdomain sudo[25661]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:56 np0005532585.localdomain sudo[25679]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkjsmkmlzkzeigpoyptwhrxwhfjotdnj ; /usr/bin/python3
Nov 23 07:37:56 np0005532585.localdomain sudo[25679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:56 np0005532585.localdomain python3[25681]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:37:56 np0005532585.localdomain chronyd[767]: chronyd exiting
Nov 23 07:37:56 np0005532585.localdomain systemd[1]: Stopping NTP client/server...
Nov 23 07:37:56 np0005532585.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 23 07:37:56 np0005532585.localdomain systemd[1]: Stopped NTP client/server.
Nov 23 07:37:56 np0005532585.localdomain systemd[1]: chronyd.service: Consumed 116ms CPU time, read 1.9M from disk, written 0B to disk.
Nov 23 07:37:56 np0005532585.localdomain systemd[1]: Starting NTP client/server...
Nov 23 07:37:56 np0005532585.localdomain chronyd[25688]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 07:37:56 np0005532585.localdomain chronyd[25688]: Frequency -30.271 +/- 0.118 ppm read from /var/lib/chrony/drift
Nov 23 07:37:56 np0005532585.localdomain chronyd[25688]: Loaded seccomp filter (level 2)
Nov 23 07:37:56 np0005532585.localdomain systemd[1]: Started NTP client/server.
Nov 23 07:37:56 np0005532585.localdomain sudo[25679]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:57 np0005532585.localdomain sudo[25735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axifgopnkjukxjbjdimaexhlntiwetig ; /usr/bin/python3
Nov 23 07:37:57 np0005532585.localdomain sudo[25735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:57 np0005532585.localdomain python3[25737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:37:57 np0005532585.localdomain sudo[25735]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:57 np0005532585.localdomain sudo[25778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jklzigqabkuuacepdvplrgirewumwohu ; /usr/bin/python3
Nov 23 07:37:57 np0005532585.localdomain sudo[25778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:58 np0005532585.localdomain python3[25780]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883477.4283235-56485-280233936923020/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:37:58 np0005532585.localdomain sudo[25778]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:58 np0005532585.localdomain sudo[25808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dombomqlsvaqvljdmxigokdjfellrkht ; /usr/bin/python3
Nov 23 07:37:58 np0005532585.localdomain sudo[25808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:58 np0005532585.localdomain python3[25810]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:37:58 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:37:58 np0005532585.localdomain systemd-rc-local-generator[25834]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:37:58 np0005532585.localdomain systemd-sysv-generator[25839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:37:58 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:37:58 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:37:59 np0005532585.localdomain systemd-sysv-generator[25877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:37:59 np0005532585.localdomain systemd-rc-local-generator[25874]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:37:59 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:37:59 np0005532585.localdomain systemd[1]: Starting chronyd online sources service...
Nov 23 07:37:59 np0005532585.localdomain chronyc[25887]: 200 OK
Nov 23 07:37:59 np0005532585.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Nov 23 07:37:59 np0005532585.localdomain systemd[1]: Finished chronyd online sources service.
Nov 23 07:37:59 np0005532585.localdomain sudo[25808]: pam_unix(sudo:session): session closed for user root
Nov 23 07:37:59 np0005532585.localdomain sudo[25901]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blxydzupfeasvrkldmiwtqzurpkphuxj ; /usr/bin/python3
Nov 23 07:37:59 np0005532585.localdomain sudo[25901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:37:59 np0005532585.localdomain python3[25903]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:37:59 np0005532585.localdomain chronyd[25688]: System clock was stepped by 0.000000 seconds
Nov 23 07:37:59 np0005532585.localdomain sudo[25901]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:00 np0005532585.localdomain sudo[25918]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skbicganaoltadnrjrmrgvmxptixbrkm ; /usr/bin/python3
Nov 23 07:38:00 np0005532585.localdomain sudo[25918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:00 np0005532585.localdomain python3[25920]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:38:01 np0005532585.localdomain chronyd[25688]: Selected source 23.133.168.245 (pool.ntp.org)
Nov 23 07:38:10 np0005532585.localdomain sudo[25918]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:10 np0005532585.localdomain sudo[25935]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imrgafblzhursxkryvbucwwcijxwtgoq ; /usr/bin/python3
Nov 23 07:38:10 np0005532585.localdomain sudo[25935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:10 np0005532585.localdomain python3[25937]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 23 07:38:10 np0005532585.localdomain systemd[1]: Starting Time & Date Service...
Nov 23 07:38:10 np0005532585.localdomain systemd[1]: Started Time & Date Service.
Nov 23 07:38:10 np0005532585.localdomain sudo[25935]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:12 np0005532585.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 07:38:12 np0005532585.localdomain sudo[25958]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unekfdwhdtjyckugxtutvnpjpkujigyl ; /usr/bin/python3
Nov 23 07:38:12 np0005532585.localdomain sudo[25958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:12 np0005532585.localdomain python3[25960]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:38:12 np0005532585.localdomain chronyd[25688]: chronyd exiting
Nov 23 07:38:12 np0005532585.localdomain systemd[1]: Stopping NTP client/server...
Nov 23 07:38:12 np0005532585.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 23 07:38:12 np0005532585.localdomain systemd[1]: Stopped NTP client/server.
Nov 23 07:38:12 np0005532585.localdomain systemd[1]: Starting NTP client/server...
Nov 23 07:38:12 np0005532585.localdomain chronyd[25967]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 07:38:12 np0005532585.localdomain chronyd[25967]: Frequency -30.271 +/- 0.125 ppm read from /var/lib/chrony/drift
Nov 23 07:38:12 np0005532585.localdomain chronyd[25967]: Loaded seccomp filter (level 2)
Nov 23 07:38:12 np0005532585.localdomain systemd[1]: Started NTP client/server.
Nov 23 07:38:12 np0005532585.localdomain sudo[25958]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:17 np0005532585.localdomain chronyd[25967]: Selected source 23.133.168.245 (pool.ntp.org)
Nov 23 07:38:23 np0005532585.localdomain sshd[25969]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:38:25 np0005532585.localdomain sshd[25969]: Connection closed by authenticating user root 46.30.161.197 port 60614 [preauth]
Nov 23 07:38:25 np0005532585.localdomain sshd[25971]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:38:25 np0005532585.localdomain sshd[25973]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:38:26 np0005532585.localdomain sshd[25973]: Invalid user ftpuser from 187.72.57.82 port 49067
Nov 23 07:38:26 np0005532585.localdomain sshd[25973]: Received disconnect from 187.72.57.82 port 49067:11: Bye Bye [preauth]
Nov 23 07:38:26 np0005532585.localdomain sshd[25973]: Disconnected from invalid user ftpuser 187.72.57.82 port 49067 [preauth]
Nov 23 07:38:26 np0005532585.localdomain sshd[25971]: Invalid user boss from 34.124.148.87 port 49290
Nov 23 07:38:26 np0005532585.localdomain sshd[25971]: Received disconnect from 34.124.148.87 port 49290:11: Bye Bye [preauth]
Nov 23 07:38:26 np0005532585.localdomain sshd[25971]: Disconnected from invalid user boss 34.124.148.87 port 49290 [preauth]
Nov 23 07:38:28 np0005532585.localdomain sudo[25988]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awyqygboroqlkpfawbgauvflbmpxczii ; /usr/bin/python3
Nov 23 07:38:28 np0005532585.localdomain sudo[25988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:28 np0005532585.localdomain useradd[25992]: new group: name=ceph-admin, GID=1002
Nov 23 07:38:28 np0005532585.localdomain useradd[25992]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Nov 23 07:38:28 np0005532585.localdomain sudo[25988]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:29 np0005532585.localdomain sudo[26044]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcztztmjhoncmhxowwevciywcnipgqgw ; /usr/bin/python3
Nov 23 07:38:29 np0005532585.localdomain sudo[26044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:29 np0005532585.localdomain sudo[26044]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:29 np0005532585.localdomain sudo[26087]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhfhspdxwnjjxhhhuciupnvkpbmzrtzx ; /usr/bin/python3
Nov 23 07:38:29 np0005532585.localdomain sudo[26087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:29 np0005532585.localdomain sudo[26087]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:30 np0005532585.localdomain sudo[26117]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwmrhmrytpkfvgeaupdcebgawosznzfa ; /usr/bin/python3
Nov 23 07:38:30 np0005532585.localdomain sudo[26117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:30 np0005532585.localdomain sudo[26117]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:30 np0005532585.localdomain sudo[26133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdsyufgzudjleaysrfufyhzhkyywnhex ; /usr/bin/python3
Nov 23 07:38:30 np0005532585.localdomain sudo[26133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:30 np0005532585.localdomain sudo[26133]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:30 np0005532585.localdomain sudo[26149]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kctyxziowdhopbqqabfauvbugbzmleye ; /usr/bin/python3
Nov 23 07:38:30 np0005532585.localdomain sudo[26149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:31 np0005532585.localdomain sudo[26149]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:31 np0005532585.localdomain sudo[26165]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyopasgtuntsolojtqfizuovlebsijxp ; /usr/bin/python3
Nov 23 07:38:31 np0005532585.localdomain sudo[26165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:38:31 np0005532585.localdomain sudo[26165]: pam_unix(sudo:session): session closed for user root
Nov 23 07:38:40 np0005532585.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 07:39:56 np0005532585.localdomain sshd[26170]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:39:58 np0005532585.localdomain sshd[26170]: Received disconnect from 187.72.57.82 port 34629:11: Bye Bye [preauth]
Nov 23 07:39:58 np0005532585.localdomain sshd[26170]: Disconnected from authenticating user root 187.72.57.82 port 34629 [preauth]
Nov 23 07:40:04 np0005532585.localdomain sshd[26172]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:05 np0005532585.localdomain sshd[26172]: Invalid user daniel from 34.124.148.87 port 59672
Nov 23 07:40:05 np0005532585.localdomain sshd[26172]: Received disconnect from 34.124.148.87 port 59672:11: Bye Bye [preauth]
Nov 23 07:40:05 np0005532585.localdomain sshd[26172]: Disconnected from invalid user daniel 34.124.148.87 port 59672 [preauth]
Nov 23 07:40:14 np0005532585.localdomain sshd[26174]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:14 np0005532585.localdomain sshd[26174]: Accepted publickey for ceph-admin from 192.168.122.103 port 43540 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:14 np0005532585.localdomain systemd-logind[761]: New session 14 of user ceph-admin.
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 1002.
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Starting User Manager for UID 1002...
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:14 np0005532585.localdomain sshd[26191]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Queued start job for default target Main User Target.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Created slice User Application Slice.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Reached target Paths.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Reached target Timers.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Starting D-Bus User Message Bus Socket...
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Starting Create User's Volatile Files and Directories...
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Finished Create User's Volatile Files and Directories.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Listening on D-Bus User Message Bus Socket.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Reached target Sockets.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Reached target Basic System.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Reached target Main User Target.
Nov 23 07:40:14 np0005532585.localdomain systemd[26178]: Startup finished in 105ms.
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Started User Manager for UID 1002.
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Nov 23 07:40:14 np0005532585.localdomain sshd[26174]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:14 np0005532585.localdomain sshd[26191]: Accepted publickey for ceph-admin from 192.168.122.103 port 43544 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:14 np0005532585.localdomain systemd-logind[761]: New session 16 of user ceph-admin.
Nov 23 07:40:14 np0005532585.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Nov 23 07:40:14 np0005532585.localdomain sshd[26191]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:14 np0005532585.localdomain sudo[26198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:40:14 np0005532585.localdomain sudo[26198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:14 np0005532585.localdomain sudo[26198]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:15 np0005532585.localdomain sshd[26213]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:15 np0005532585.localdomain sshd[26213]: Accepted publickey for ceph-admin from 192.168.122.103 port 43552 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:15 np0005532585.localdomain systemd-logind[761]: New session 17 of user ceph-admin.
Nov 23 07:40:15 np0005532585.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Nov 23 07:40:15 np0005532585.localdomain sshd[26213]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:15 np0005532585.localdomain sudo[26217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005532585.localdomain
Nov 23 07:40:15 np0005532585.localdomain sudo[26217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:15 np0005532585.localdomain sudo[26217]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:15 np0005532585.localdomain sshd[26232]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:15 np0005532585.localdomain sshd[26232]: Accepted publickey for ceph-admin from 192.168.122.103 port 43566 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:15 np0005532585.localdomain systemd-logind[761]: New session 18 of user ceph-admin.
Nov 23 07:40:15 np0005532585.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Nov 23 07:40:15 np0005532585.localdomain sshd[26232]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:15 np0005532585.localdomain sudo[26236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Nov 23 07:40:15 np0005532585.localdomain sudo[26236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:15 np0005532585.localdomain sudo[26236]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:15 np0005532585.localdomain sshd[26251]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:15 np0005532585.localdomain sshd[26251]: Accepted publickey for ceph-admin from 192.168.122.103 port 43576 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:15 np0005532585.localdomain systemd-logind[761]: New session 19 of user ceph-admin.
Nov 23 07:40:15 np0005532585.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Nov 23 07:40:15 np0005532585.localdomain sshd[26251]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:16 np0005532585.localdomain sudo[26255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 07:40:16 np0005532585.localdomain sudo[26255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:16 np0005532585.localdomain sudo[26255]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:16 np0005532585.localdomain sshd[26270]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:16 np0005532585.localdomain sshd[26270]: Accepted publickey for ceph-admin from 192.168.122.103 port 43582 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:16 np0005532585.localdomain systemd-logind[761]: New session 20 of user ceph-admin.
Nov 23 07:40:16 np0005532585.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Nov 23 07:40:16 np0005532585.localdomain sshd[26270]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:16 np0005532585.localdomain sudo[26274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 07:40:16 np0005532585.localdomain sudo[26274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:16 np0005532585.localdomain sudo[26274]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:16 np0005532585.localdomain sshd[26289]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:16 np0005532585.localdomain sshd[26289]: Accepted publickey for ceph-admin from 192.168.122.103 port 43592 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:16 np0005532585.localdomain systemd-logind[761]: New session 21 of user ceph-admin.
Nov 23 07:40:16 np0005532585.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Nov 23 07:40:16 np0005532585.localdomain sshd[26289]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:16 np0005532585.localdomain sudo[26293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Nov 23 07:40:16 np0005532585.localdomain sudo[26293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:16 np0005532585.localdomain sudo[26293]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:16 np0005532585.localdomain sshd[26308]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:16 np0005532585.localdomain sshd[26308]: Accepted publickey for ceph-admin from 192.168.122.103 port 43600 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:17 np0005532585.localdomain systemd-logind[761]: New session 22 of user ceph-admin.
Nov 23 07:40:17 np0005532585.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Nov 23 07:40:17 np0005532585.localdomain sshd[26308]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:17 np0005532585.localdomain sudo[26312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 07:40:17 np0005532585.localdomain sudo[26312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:17 np0005532585.localdomain sudo[26312]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:17 np0005532585.localdomain sshd[26327]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:17 np0005532585.localdomain sshd[26327]: Accepted publickey for ceph-admin from 192.168.122.103 port 43606 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:17 np0005532585.localdomain systemd-logind[761]: New session 23 of user ceph-admin.
Nov 23 07:40:17 np0005532585.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Nov 23 07:40:17 np0005532585.localdomain sshd[26327]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:17 np0005532585.localdomain sudo[26331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Nov 23 07:40:17 np0005532585.localdomain sudo[26331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:17 np0005532585.localdomain sudo[26331]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:17 np0005532585.localdomain sshd[26346]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:17 np0005532585.localdomain sshd[26346]: Accepted publickey for ceph-admin from 192.168.122.103 port 43616 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:17 np0005532585.localdomain systemd-logind[761]: New session 24 of user ceph-admin.
Nov 23 07:40:17 np0005532585.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Nov 23 07:40:17 np0005532585.localdomain sshd[26346]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:18 np0005532585.localdomain sshd[26363]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:18 np0005532585.localdomain sshd[26363]: Accepted publickey for ceph-admin from 192.168.122.103 port 43632 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:18 np0005532585.localdomain systemd-logind[761]: New session 25 of user ceph-admin.
Nov 23 07:40:18 np0005532585.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Nov 23 07:40:18 np0005532585.localdomain sshd[26363]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:18 np0005532585.localdomain sudo[26367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Nov 23 07:40:18 np0005532585.localdomain sudo[26367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:18 np0005532585.localdomain sudo[26367]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:18 np0005532585.localdomain sshd[26382]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:40:18 np0005532585.localdomain sshd[26382]: Accepted publickey for ceph-admin from 192.168.122.103 port 43648 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 07:40:18 np0005532585.localdomain systemd-logind[761]: New session 26 of user ceph-admin.
Nov 23 07:40:18 np0005532585.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Nov 23 07:40:18 np0005532585.localdomain sshd[26382]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 07:40:18 np0005532585.localdomain sudo[26386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005532585.localdomain
Nov 23 07:40:18 np0005532585.localdomain sudo[26386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:19 np0005532585.localdomain sudo[26386]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:42 np0005532585.localdomain sudo[26423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:40:42 np0005532585.localdomain sudo[26423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:42 np0005532585.localdomain sudo[26423]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:42 np0005532585.localdomain sudo[26438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:40:42 np0005532585.localdomain sudo[26438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:42 np0005532585.localdomain sudo[26438]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:42 np0005532585.localdomain sudo[26453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 07:40:42 np0005532585.localdomain sudo[26453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:43 np0005532585.localdomain sudo[26453]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:43 np0005532585.localdomain sudo[26488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:40:43 np0005532585.localdomain sudo[26488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:43 np0005532585.localdomain sudo[26488]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:43 np0005532585.localdomain sudo[26503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 07:40:43 np0005532585.localdomain sudo[26503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:43 np0005532585.localdomain sudo[26503]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:43 np0005532585.localdomain sudo[26554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:40:43 np0005532585.localdomain sudo[26554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:43 np0005532585.localdomain sudo[26554]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:44 np0005532585.localdomain sudo[26569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:40:44 np0005532585.localdomain sudo[26569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:44 np0005532585.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26596 (sysctl)
Nov 23 07:40:44 np0005532585.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 23 07:40:44 np0005532585.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 23 07:40:44 np0005532585.localdomain sudo[26569]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:44 np0005532585.localdomain sudo[26618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:40:44 np0005532585.localdomain sudo[26618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:44 np0005532585.localdomain sudo[26618]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:44 np0005532585.localdomain sudo[26633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 07:40:44 np0005532585.localdomain sudo[26633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:45 np0005532585.localdomain sudo[26633]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:45 np0005532585.localdomain sudo[26668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:40:45 np0005532585.localdomain sudo[26668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:45 np0005532585.localdomain sudo[26668]: pam_unix(sudo:session): session closed for user root
Nov 23 07:40:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:45 np0005532585.localdomain sudo[26683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 07:40:45 np0005532585.localdomain sudo[26683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:40:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:40:49 np0005532585.localdomain kernel: VFS: idmapped mount is not enabled.
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 2025-11-23 07:41:07.82143409 +0000 UTC m=+22.107074887 container create 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=553, ceph=True)
Nov 23 07:41:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck12222527-merged.mount: Deactivated successfully.
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 2025-11-23 07:40:45.756348559 +0000 UTC m=+0.041989386 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:07 np0005532585.localdomain systemd[1]: Created slice Slice /machine.
Nov 23 07:41:07 np0005532585.localdomain systemd[1]: Started libpod-conmon-3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126.scope.
Nov 23 07:41:07 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 2025-11-23 07:41:07.926179671 +0000 UTC m=+22.211820488 container init 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 2025-11-23 07:41:07.936262796 +0000 UTC m=+22.221903593 container start 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph)
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 2025-11-23 07:41:07.936687629 +0000 UTC m=+22.222328506 container attach 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=)
Nov 23 07:41:07 np0005532585.localdomain nifty_ishizaka[26881]: 167 167
Nov 23 07:41:07 np0005532585.localdomain systemd[1]: libpod-3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126.scope: Deactivated successfully.
Nov 23 07:41:07 np0005532585.localdomain podman[26736]: 2025-11-23 07:41:07.94098737 +0000 UTC m=+22.226628217 container died 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 23 07:41:08 np0005532585.localdomain podman[26886]: 2025-11-23 07:41:08.022960059 +0000 UTC m=+0.073125441 container remove 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, architecture=x86_64)
Nov 23 07:41:08 np0005532585.localdomain systemd[1]: libpod-conmon-3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126.scope: Deactivated successfully.
Nov 23 07:41:08 np0005532585.localdomain podman[26908]: 
Nov 23 07:41:08 np0005532585.localdomain podman[26908]: 2025-11-23 07:41:08.252101376 +0000 UTC m=+0.062417656 container create a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 23 07:41:08 np0005532585.localdomain systemd[1]: Started libpod-conmon-a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647.scope.
Nov 23 07:41:08 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:08 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d501d3ca55a33bbd6dcd98428057573660dc63e7dd8e73144372362de04e8119/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:08 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d501d3ca55a33bbd6dcd98428057573660dc63e7dd8e73144372362de04e8119/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:08 np0005532585.localdomain podman[26908]: 2025-11-23 07:41:08.324971078 +0000 UTC m=+0.135287348 container init a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553)
Nov 23 07:41:08 np0005532585.localdomain podman[26908]: 2025-11-23 07:41:08.231344896 +0000 UTC m=+0.041661176 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:08 np0005532585.localdomain podman[26908]: 2025-11-23 07:41:08.33162318 +0000 UTC m=+0.141939450 container start a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=553, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public)
Nov 23 07:41:08 np0005532585.localdomain podman[26908]: 2025-11-23 07:41:08.331791705 +0000 UTC m=+0.142108035 container attach a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, release=553, GIT_CLEAN=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 23 07:41:08 np0005532585.localdomain systemd[1]: tmp-crun.sbBVmZ.mount: Deactivated successfully.
Nov 23 07:41:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e138ded2b6606e6b477798be8b9f6e3b9a918a0b2e76585a28268ec882051d3f-merged.mount: Deactivated successfully.
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]: [
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:     {
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "available": false,
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "ceph_device": false,
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "lsm_data": {},
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "lvs": [],
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "path": "/dev/sr0",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "rejected_reasons": [
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "Has a FileSystem",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "Insufficient space (<5GB)"
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         ],
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         "sys_api": {
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "actuators": null,
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "device_nodes": "sr0",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "human_readable_size": "482.00 KB",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "id_bus": "ata",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "model": "QEMU DVD-ROM",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "nr_requests": "2",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "partitions": {},
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "path": "/dev/sr0",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "removable": "1",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "rev": "2.5+",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "ro": "0",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "rotational": "1",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "sas_address": "",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "sas_device_handle": "",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "scheduler_mode": "mq-deadline",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "sectors": 0,
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "sectorsize": "2048",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "size": 493568.0,
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "support_discard": "0",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "type": "disk",
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:             "vendor": "QEMU"
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:         }
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]:     }
Nov 23 07:41:09 np0005532585.localdomain dreamy_faraday[26930]: ]
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: libpod-a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647.scope: Deactivated successfully.
Nov 23 07:41:09 np0005532585.localdomain podman[26908]: 2025-11-23 07:41:09.155846585 +0000 UTC m=+0.966162895 container died a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-09-24T08:57:55, release=553, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, architecture=x86_64)
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d501d3ca55a33bbd6dcd98428057573660dc63e7dd8e73144372362de04e8119-merged.mount: Deactivated successfully.
Nov 23 07:41:09 np0005532585.localdomain podman[28381]: 2025-11-23 07:41:09.248088636 +0000 UTC m=+0.079034252 container remove a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: libpod-conmon-a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647.scope: Deactivated successfully.
Nov 23 07:41:09 np0005532585.localdomain sudo[26683]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:09 np0005532585.localdomain sudo[28418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:41:09 np0005532585.localdomain sudo[28418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:09 np0005532585.localdomain sudo[28418]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:09 np0005532585.localdomain sudo[28433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b --coredump-max-size=32G
Nov 23 07:41:09 np0005532585.localdomain sudo[28433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: Closed Process Core Dump Socket.
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: Stopping Process Core Dump Socket...
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: Listening on Process Core Dump Socket.
Nov 23 07:41:09 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:09 np0005532585.localdomain systemd-sysv-generator[28488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:09 np0005532585.localdomain systemd-rc-local-generator[28485]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:10 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:10 np0005532585.localdomain systemd-rc-local-generator[28526]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:10 np0005532585.localdomain systemd-sysv-generator[28531]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:10 np0005532585.localdomain sudo[28433]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:11 np0005532585.localdomain rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 07:41:33 np0005532585.localdomain sshd[28794]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:41:34 np0005532585.localdomain sshd[28794]: Invalid user zzg from 187.72.57.82 port 48428
Nov 23 07:41:34 np0005532585.localdomain sshd[28794]: Received disconnect from 187.72.57.82 port 48428:11: Bye Bye [preauth]
Nov 23 07:41:34 np0005532585.localdomain sshd[28794]: Disconnected from invalid user zzg 187.72.57.82 port 48428 [preauth]
Nov 23 07:41:42 np0005532585.localdomain sudo[28796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:41:42 np0005532585.localdomain sudo[28796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:42 np0005532585.localdomain sudo[28796]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:42 np0005532585.localdomain sudo[28811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 07:41:42 np0005532585.localdomain sudo[28811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:41:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:41:43 np0005532585.localdomain sshd[28873]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 2025-11-23 07:41:43.059477627 +0000 UTC m=+0.062543223 container create 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git)
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Started libpod-conmon-88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6.scope.
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 2025-11-23 07:41:43.028339115 +0000 UTC m=+0.031404691 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 2025-11-23 07:41:43.143700861 +0000 UTC m=+0.146766417 container init 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 2025-11-23 07:41:43.153487082 +0000 UTC m=+0.156552638 container start 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True)
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 2025-11-23 07:41:43.153615056 +0000 UTC m=+0.156680622 container attach 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Nov 23 07:41:43 np0005532585.localdomain wonderful_austin[28882]: 167 167
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: libpod-88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6.scope: Deactivated successfully.
Nov 23 07:41:43 np0005532585.localdomain podman[28866]: 2025-11-23 07:41:43.157265079 +0000 UTC m=+0.160330655 container died 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, ceph=True, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 07:41:43 np0005532585.localdomain podman[28887]: 2025-11-23 07:41:43.216906762 +0000 UTC m=+0.049829893 container remove 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: libpod-conmon-88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6.scope: Deactivated successfully.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:43 np0005532585.localdomain systemd-sysv-generator[28933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:43 np0005532585.localdomain systemd-rc-local-generator[28927]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:43 np0005532585.localdomain systemd-rc-local-generator[28964]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:43 np0005532585.localdomain systemd-sysv-generator[28969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:43 np0005532585.localdomain sshd[28976]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Reached target All Ceph clusters and services.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:43 np0005532585.localdomain systemd-rc-local-generator[29005]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:43 np0005532585.localdomain systemd-sysv-generator[29010]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Reached target Ceph cluster 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 07:41:43 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:44 np0005532585.localdomain systemd-rc-local-generator[29045]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:44 np0005532585.localdomain systemd-sysv-generator[29049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:44 np0005532585.localdomain systemd-rc-local-generator[29086]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:44 np0005532585.localdomain systemd-sysv-generator[29090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: Created slice Slice /system/ceph-46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: Reached target System Time Set.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: Reached target System Time Synchronized.
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: Starting Ceph crash.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:41:44 np0005532585.localdomain sshd[28873]: Invalid user wireguard from 34.124.148.87 port 44806
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 07:41:44 np0005532585.localdomain podman[29146]: 
Nov 23 07:41:44 np0005532585.localdomain podman[29146]: 2025-11-23 07:41:44.743624465 +0000 UTC m=+0.058457115 container create 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=553, architecture=x86_64, io.buildah.version=1.33.12, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Nov 23 07:41:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54336affea3174046a26e5cc0b745b5cc6b278f9d4d0515655329cf08f000053/merged/etc/ceph/ceph.client.crash.np0005532585.keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54336affea3174046a26e5cc0b745b5cc6b278f9d4d0515655329cf08f000053/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:44 np0005532585.localdomain podman[29146]: 2025-11-23 07:41:44.713804658 +0000 UTC m=+0.028637328 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54336affea3174046a26e5cc0b745b5cc6b278f9d4d0515655329cf08f000053/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:44 np0005532585.localdomain podman[29146]: 2025-11-23 07:41:44.832225737 +0000 UTC m=+0.147058387 container init 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True)
Nov 23 07:41:44 np0005532585.localdomain podman[29146]: 2025-11-23 07:41:44.842580717 +0000 UTC m=+0.157413397 container start 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, name=rhceph, distribution-scope=public, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:41:44 np0005532585.localdomain bash[29146]: 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee
Nov 23 07:41:44 np0005532585.localdomain systemd[1]: Started Ceph crash.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 07:41:44 np0005532585.localdomain sudo[28811]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:44 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.005+0000 7fba80027640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.005+0000 7fba80027640 -1 AuthRegistry(0x7fba780680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.006+0000 7fba80027640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.006+0000 7fba80027640 -1 AuthRegistry(0x7fba80026000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 23 07:41:45 np0005532585.localdomain sshd[28873]: Received disconnect from 34.124.148.87 port 44806:11: Bye Bye [preauth]
Nov 23 07:41:45 np0005532585.localdomain sshd[28873]: Disconnected from invalid user wireguard 34.124.148.87 port 44806 [preauth]
Nov 23 07:41:45 np0005532585.localdomain sudo[29167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.015+0000 7fba7d59b640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.016+0000 7fba7e59d640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.017+0000 7fba7dd9c640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.017+0000 7fba80027640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 23 07:41:45 np0005532585.localdomain sudo[29167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 23 07:41:45 np0005532585.localdomain sudo[29167]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 23 07:41:45 np0005532585.localdomain sudo[29192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Nov 23 07:41:45 np0005532585.localdomain sudo[29192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 2025-11-23 07:41:45.628418041 +0000 UTC m=+0.074950942 container create baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=)
Nov 23 07:41:45 np0005532585.localdomain systemd[1]: Started libpod-conmon-baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb.scope.
Nov 23 07:41:45 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 2025-11-23 07:41:45.597223248 +0000 UTC m=+0.043756079 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 2025-11-23 07:41:45.70093409 +0000 UTC m=+0.147466951 container init baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.33.12, GIT_BRANCH=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 2025-11-23 07:41:45.711691253 +0000 UTC m=+0.158224084 container start baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 2025-11-23 07:41:45.711978503 +0000 UTC m=+0.158511334 container attach baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Nov 23 07:41:45 np0005532585.localdomain priceless_brahmagupta[29261]: 167 167
Nov 23 07:41:45 np0005532585.localdomain systemd[1]: libpod-baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb.scope: Deactivated successfully.
Nov 23 07:41:45 np0005532585.localdomain podman[29247]: 2025-11-23 07:41:45.716634331 +0000 UTC m=+0.163167162 container died baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 07:41:45 np0005532585.localdomain podman[29266]: 2025-11-23 07:41:45.793411263 +0000 UTC m=+0.068931949 container remove baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7)
Nov 23 07:41:45 np0005532585.localdomain systemd[1]: libpod-conmon-baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb.scope: Deactivated successfully.
Nov 23 07:41:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c77b4887970b0551a3b6368b29506d225a364e531320f0a81721dce2e9ab5323-merged.mount: Deactivated successfully.
Nov 23 07:41:45 np0005532585.localdomain podman[29285]: 
Nov 23 07:41:45 np0005532585.localdomain podman[29285]: 2025-11-23 07:41:45.992852347 +0000 UTC m=+0.068121491 container create 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 07:41:46 np0005532585.localdomain systemd[1]: Started libpod-conmon-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope.
Nov 23 07:41:46 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:46 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:46 np0005532585.localdomain podman[29285]: 2025-11-23 07:41:45.964728897 +0000 UTC m=+0.039998011 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:46 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:46 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:46 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:46 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:46 np0005532585.localdomain podman[29285]: 2025-11-23 07:41:46.101304729 +0000 UTC m=+0.176573813 container init 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, version=7, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:41:46 np0005532585.localdomain podman[29285]: 2025-11-23 07:41:46.11166964 +0000 UTC m=+0.186938774 container start 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, ceph=True, version=7, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:41:46 np0005532585.localdomain podman[29285]: 2025-11-23 07:41:46.11258614 +0000 UTC m=+0.187855244 container attach 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph)
Nov 23 07:41:46 np0005532585.localdomain focused_elgamal[29301]: --> passed data devices: 0 physical, 2 LVM
Nov 23 07:41:46 np0005532585.localdomain focused_elgamal[29301]: --> relative data size: 1.0
Nov 23 07:41:46 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 07:41:46 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 07:41:47 np0005532585.localdomain lvm[29355]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 07:41:47 np0005532585.localdomain lvm[29355]: VG ceph_vg0 finished
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]:  stderr: got monmap epoch 3
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: --> Creating keyring file for osd.0
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Nov 23 07:41:47 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90 --setuser ceph --setgroup ceph
Nov 23 07:41:49 np0005532585.localdomain focused_elgamal[29301]:  stderr: 2025-11-23T07:41:47.752+0000 7ff72052aa80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 23 07:41:49 np0005532585.localdomain focused_elgamal[29301]:  stderr: 2025-11-23T07:41:47.752+0000 7ff72052aa80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Nov 23 07:41:49 np0005532585.localdomain focused_elgamal[29301]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: --> ceph-volume lvm activate successful for osd ID: 0
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 445e9929-9fbe-437e-be2a-5f2d52ad535b
Nov 23 07:41:50 np0005532585.localdomain lvm[30303]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 07:41:50 np0005532585.localdomain lvm[30303]: VG ceph_vg1 finished
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Nov 23 07:41:50 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap
Nov 23 07:41:51 np0005532585.localdomain focused_elgamal[29301]:  stderr: got monmap epoch 3
Nov 23 07:41:51 np0005532585.localdomain focused_elgamal[29301]: --> Creating keyring file for osd.3
Nov 23 07:41:51 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring
Nov 23 07:41:51 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/
Nov 23 07:41:51 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 445e9929-9fbe-437e-be2a-5f2d52ad535b --setuser ceph --setgroup ceph
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]:  stderr: 2025-11-23T07:41:51.225+0000 7f84b7679a80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]:  stderr: 2025-11-23T07:41:51.227+0000 7f84b7679a80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: --> ceph-volume lvm activate successful for osd ID: 3
Nov 23 07:41:53 np0005532585.localdomain focused_elgamal[29301]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Nov 23 07:41:53 np0005532585.localdomain systemd[1]: libpod-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope: Deactivated successfully.
Nov 23 07:41:53 np0005532585.localdomain systemd[1]: libpod-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope: Consumed 3.604s CPU time.
Nov 23 07:41:53 np0005532585.localdomain podman[31218]: 2025-11-23 07:41:53.639371065 +0000 UTC m=+0.043710907 container died 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, release=553, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 07:41:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a-merged.mount: Deactivated successfully.
Nov 23 07:41:53 np0005532585.localdomain podman[31218]: 2025-11-23 07:41:53.674160519 +0000 UTC m=+0.078500311 container remove 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:41:53 np0005532585.localdomain systemd[1]: libpod-conmon-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope: Deactivated successfully.
Nov 23 07:41:53 np0005532585.localdomain sudo[29192]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:53 np0005532585.localdomain sudo[31232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:41:53 np0005532585.localdomain sudo[31232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:53 np0005532585.localdomain sudo[31232]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:53 np0005532585.localdomain sudo[31247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- lvm list --format json
Nov 23 07:41:53 np0005532585.localdomain sudo[31247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:54 np0005532585.localdomain podman[31299]: 2025-11-23 07:41:54.391393248 +0000 UTC m=+0.073133681 container create 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, version=7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: Started libpod-conmon-368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77.scope.
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:54 np0005532585.localdomain podman[31299]: 2025-11-23 07:41:54.361132866 +0000 UTC m=+0.042873289 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:54 np0005532585.localdomain podman[31299]: 2025-11-23 07:41:54.473246921 +0000 UTC m=+0.154987344 container init 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, version=7, name=rhceph, release=553, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 07:41:54 np0005532585.localdomain podman[31299]: 2025-11-23 07:41:54.483929832 +0000 UTC m=+0.165670255 container start 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55)
Nov 23 07:41:54 np0005532585.localdomain podman[31299]: 2025-11-23 07:41:54.484240443 +0000 UTC m=+0.165980876 container attach 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, version=7, release=553)
Nov 23 07:41:54 np0005532585.localdomain jolly_cori[31314]: 167 167
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: libpod-368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77.scope: Deactivated successfully.
Nov 23 07:41:54 np0005532585.localdomain podman[31299]: 2025-11-23 07:41:54.488861529 +0000 UTC m=+0.170601982 container died 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, release=553, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git)
Nov 23 07:41:54 np0005532585.localdomain podman[31319]: 2025-11-23 07:41:54.578057811 +0000 UTC m=+0.075767929 container remove 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, RELEASE=main, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: libpod-conmon-368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77.scope: Deactivated successfully.
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3682c77af88429b49254555f73f1b71d36e838694a62cdab207aa7c40ab69b3c-merged.mount: Deactivated successfully.
Nov 23 07:41:54 np0005532585.localdomain podman[31339]: 2025-11-23 07:41:54.794185879 +0000 UTC m=+0.084232025 container create c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: Started libpod-conmon-c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6.scope.
Nov 23 07:41:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:54 np0005532585.localdomain podman[31339]: 2025-11-23 07:41:54.765728418 +0000 UTC m=+0.055774564 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:54 np0005532585.localdomain podman[31339]: 2025-11-23 07:41:54.898114978 +0000 UTC m=+0.188161124 container init c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, ceph=True, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:41:54 np0005532585.localdomain podman[31339]: 2025-11-23 07:41:54.909320807 +0000 UTC m=+0.199366953 container start c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph)
Nov 23 07:41:54 np0005532585.localdomain podman[31339]: 2025-11-23 07:41:54.909536144 +0000 UTC m=+0.199582300 container attach c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., release=553, ceph=True)
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]: {
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:     "0": [
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:         {
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "devices": [
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "/dev/loop3"
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             ],
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_name": "ceph_lv0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_size": "7511998464",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=2tCPCp-iGLo-2Hfx-dJfg-tPJt-XdH9-0kGXBa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=46550e70-79cb-5f55-bf6d-1204b97e083b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_uuid": "2tCPCp-iGLo-2Hfx-dJfg-tPJt-XdH9-0kGXBa",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "name": "ceph_lv0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "path": "/dev/ceph_vg0/ceph_lv0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "tags": {
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.block_uuid": "2tCPCp-iGLo-2Hfx-dJfg-tPJt-XdH9-0kGXBa",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.cephx_lockbox_secret": "",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.cluster_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.cluster_name": "ceph",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.crush_device_class": "",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.encrypted": "0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.osd_fsid": "f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.osd_id": "0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.type": "block",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.vdo": "0"
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             },
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "type": "block",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "vg_name": "ceph_vg0"
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:         }
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:     ],
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:     "3": [
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:         {
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "devices": [
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "/dev/loop4"
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             ],
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_name": "ceph_lv1",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_size": "7511998464",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=46550e70-79cb-5f55-bf6d-1204b97e083b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=445e9929-9fbe-437e-be2a-5f2d52ad535b,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "lv_uuid": "2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "name": "ceph_lv1",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "path": "/dev/ceph_vg1/ceph_lv1",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "tags": {
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.block_uuid": "2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.cephx_lockbox_secret": "",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.cluster_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.cluster_name": "ceph",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.crush_device_class": "",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.encrypted": "0",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.osd_fsid": "445e9929-9fbe-437e-be2a-5f2d52ad535b",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.osd_id": "3",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.type": "block",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:                 "ceph.vdo": "0"
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             },
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "type": "block",
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:             "vg_name": "ceph_vg1"
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:         }
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]:     ]
Nov 23 07:41:55 np0005532585.localdomain lucid_fermi[31355]: }
Nov 23 07:41:55 np0005532585.localdomain systemd[1]: libpod-c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6.scope: Deactivated successfully.
Nov 23 07:41:55 np0005532585.localdomain podman[31339]: 2025-11-23 07:41:55.287950061 +0000 UTC m=+0.577996257 container died c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 23 07:41:55 np0005532585.localdomain podman[31364]: 2025-11-23 07:41:55.371067188 +0000 UTC m=+0.071841037 container remove c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55)
Nov 23 07:41:55 np0005532585.localdomain systemd[1]: libpod-conmon-c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6.scope: Deactivated successfully.
Nov 23 07:41:55 np0005532585.localdomain sudo[31247]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:55 np0005532585.localdomain sudo[31378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:41:55 np0005532585.localdomain sudo[31378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:55 np0005532585.localdomain sudo[31378]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:55 np0005532585.localdomain sudo[31393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 07:41:55 np0005532585.localdomain sudo[31393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df-merged.mount: Deactivated successfully.
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 2025-11-23 07:41:56.120210194 +0000 UTC m=+0.071457593 container create 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46.scope.
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 2025-11-23 07:41:56.160297308 +0000 UTC m=+0.111544717 container init 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 07:41:56 np0005532585.localdomain upbeat_mcclintock[31465]: 167 167
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: libpod-7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46.scope: Deactivated successfully.
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 2025-11-23 07:41:56.1831729 +0000 UTC m=+0.134420309 container start 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 2025-11-23 07:41:56.183379417 +0000 UTC m=+0.134626846 container attach 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 2025-11-23 07:41:56.18435981 +0000 UTC m=+0.135607219 container died 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, name=rhceph, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:41:56 np0005532585.localdomain podman[31449]: 2025-11-23 07:41:56.087930574 +0000 UTC m=+0.039178043 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:56 np0005532585.localdomain podman[31470]: 2025-11-23 07:41:56.21661768 +0000 UTC m=+0.044448702 container remove 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553)
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: libpod-conmon-7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46.scope: Deactivated successfully.
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 2025-11-23 07:41:56.495155595 +0000 UTC m=+0.064406296 container create 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, version=7, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad.scope.
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 2025-11-23 07:41:56.472349175 +0000 UTC m=+0.041599886 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 2025-11-23 07:41:56.595497523 +0000 UTC m=+0.164748204 container init 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55)
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 2025-11-23 07:41:56.60284051 +0000 UTC m=+0.172091181 container start 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main)
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 2025-11-23 07:41:56.603061579 +0000 UTC m=+0.172312270 container attach 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bf30e6521451ffeeba177399b7238bbe7bc12f36d09f657e456eb1b93cf13071-merged.mount: Deactivated successfully.
Nov 23 07:41:56 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test[31514]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 23 07:41:56 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test[31514]:                             [--no-systemd] [--no-tmpfs]
Nov 23 07:41:56 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test[31514]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: libpod-61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad.scope: Deactivated successfully.
Nov 23 07:41:56 np0005532585.localdomain podman[31498]: 2025-11-23 07:41:56.852259473 +0000 UTC m=+0.421510154 container died 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309-merged.mount: Deactivated successfully.
Nov 23 07:41:56 np0005532585.localdomain systemd-journald[618]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 23 07:41:56 np0005532585.localdomain systemd-journald[618]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 07:41:56 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 07:41:56 np0005532585.localdomain podman[31519]: 2025-11-23 07:41:56.947218599 +0000 UTC m=+0.082112133 container remove 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12)
Nov 23 07:41:56 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 07:41:56 np0005532585.localdomain systemd[1]: libpod-conmon-61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad.scope: Deactivated successfully.
Nov 23 07:41:57 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:57 np0005532585.localdomain systemd-rc-local-generator[31572]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:57 np0005532585.localdomain systemd-sysv-generator[31577]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:57 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:41:57 np0005532585.localdomain systemd-sysv-generator[31622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:41:57 np0005532585.localdomain systemd-rc-local-generator[31616]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:41:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:41:57 np0005532585.localdomain systemd[1]: Starting Ceph osd.0 for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 2025-11-23 07:41:58.112915371 +0000 UTC m=+0.074967232 container create af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 07:41:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:41:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 2025-11-23 07:41:58.082509405 +0000 UTC m=+0.044561296 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 2025-11-23 07:41:58.209611376 +0000 UTC m=+0.171663247 container init af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, io.buildah.version=1.33.12, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:41:58 np0005532585.localdomain systemd[1]: tmp-crun.3M9b1Q.mount: Deactivated successfully.
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 2025-11-23 07:41:58.221169706 +0000 UTC m=+0.183221567 container start af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=)
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 2025-11-23 07:41:58.221639542 +0000 UTC m=+0.183691423 container attach af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, io.openshift.expose-services=, release=553, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7)
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 23 07:41:58 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: --> ceph-volume raw activate successful for osd ID: 0
Nov 23 07:41:58 np0005532585.localdomain bash[31681]: --> ceph-volume raw activate successful for osd ID: 0
Nov 23 07:41:58 np0005532585.localdomain systemd[1]: libpod-af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709.scope: Deactivated successfully.
Nov 23 07:41:58 np0005532585.localdomain podman[31681]: 2025-11-23 07:41:58.8612579 +0000 UTC m=+0.823309761 container died af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, release=553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git)
Nov 23 07:41:58 np0005532585.localdomain podman[31825]: 2025-11-23 07:41:58.949913253 +0000 UTC m=+0.079642859 container remove af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55)
Nov 23 07:41:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e-merged.mount: Deactivated successfully.
Nov 23 07:41:59 np0005532585.localdomain podman[31887]: 
Nov 23 07:41:59 np0005532585.localdomain podman[31887]: 2025-11-23 07:41:59.238367404 +0000 UTC m=+0.069143306 container create 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, architecture=x86_64)
Nov 23 07:41:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:59 np0005532585.localdomain podman[31887]: 2025-11-23 07:41:59.211175086 +0000 UTC m=+0.041951018 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:41:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 23 07:41:59 np0005532585.localdomain podman[31887]: 2025-11-23 07:41:59.352214158 +0000 UTC m=+0.182990070 container init 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 23 07:41:59 np0005532585.localdomain systemd[1]: tmp-crun.BqUQ9W.mount: Deactivated successfully.
Nov 23 07:41:59 np0005532585.localdomain podman[31887]: 2025-11-23 07:41:59.364163041 +0000 UTC m=+0.194938923 container start 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0, CEPH_POINT_RELEASE=, release=553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:41:59 np0005532585.localdomain bash[31887]: 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962
Nov 23 07:41:59 np0005532585.localdomain systemd[1]: Started Ceph osd.0 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 07:41:59 np0005532585.localdomain sudo[31393]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: pidfile_write: ignore empty --pid-file
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 07:41:59 np0005532585.localdomain sudo[31918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:41:59 np0005532585.localdomain sudo[31918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:59 np0005532585.localdomain sudo[31918]: pam_unix(sudo:session): session closed for user root
Nov 23 07:41:59 np0005532585.localdomain sudo[31933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 07:41:59 np0005532585.localdomain sudo[31933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: load: jerasure load: lrc 
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:41:59 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 2025-11-23 07:42:00.147153861 +0000 UTC m=+0.069340473 container create ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 07:42:00 np0005532585.localdomain systemd[1]: Started libpod-conmon-ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25.scope.
Nov 23 07:42:00 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 2025-11-23 07:42:00.118973099 +0000 UTC m=+0.041159711 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 2025-11-23 07:42:00.219359818 +0000 UTC m=+0.141546430 container init ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True)
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 2025-11-23 07:42:00.229577283 +0000 UTC m=+0.151763885 container start ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, GIT_BRANCH=main, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., version=7, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 2025-11-23 07:42:00.22976758 +0000 UTC m=+0.151954222 container attach ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph)
Nov 23 07:42:00 np0005532585.localdomain confident_newton[32013]: 167 167
Nov 23 07:42:00 np0005532585.localdomain systemd[1]: libpod-ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25.scope: Deactivated successfully.
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:42:00 np0005532585.localdomain podman[31998]: 2025-11-23 07:42:00.235676289 +0000 UTC m=+0.157862951 container died ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=553, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 07:42:00 np0005532585.localdomain podman[32018]: 2025-11-23 07:42:00.32276448 +0000 UTC m=+0.078609756 container remove ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc.)
Nov 23 07:42:00 np0005532585.localdomain systemd[1]: libpod-conmon-ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25.scope: Deactivated successfully.
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs mount
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs mount shared_bdev_used = 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: RocksDB version: 7.9.2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Git sha 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DB SUMMARY
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DB Session ID:  F6DONMZYLUSPVQLEPLPJ
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: CURRENT file:  CURRENT
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.error_if_exists: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.create_if_missing: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                     Options.env: 0x55d7b83e2cb0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                Options.info_log: 0x55d7b90c85a0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                              Options.statistics: (nil)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.use_fsync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                              Options.db_log_dir: 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                 Options.wal_dir: db.wal
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.write_buffer_manager: 0x55d7b8138140
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.unordered_write: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.row_cache: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                              Options.wal_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.two_write_queues: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.wal_compression: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.atomic_flush: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_background_jobs: 4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_background_compactions: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_subcompactions: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.max_open_files: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Compression algorithms supported:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kZSTD supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kXpressCompression supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kBZip2Compression supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kLZ4Compression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kZlibCompression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kLZ4HCCompression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kSnappyCompression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8126850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720541017, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720541236, "job": 1, "event": "recovery_finished"}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: freelist init
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: freelist _read_cfg
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs umount
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 07:42:00 np0005532585.localdomain podman[32243]: 
Nov 23 07:42:00 np0005532585.localdomain podman[32243]: 2025-11-23 07:42:00.644438342 +0000 UTC m=+0.072018753 container create 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, release=553, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 07:42:00 np0005532585.localdomain systemd[1]: Started libpod-conmon-3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656.scope.
Nov 23 07:42:00 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:00 np0005532585.localdomain podman[32243]: 2025-11-23 07:42:00.615733163 +0000 UTC m=+0.043313564 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:00 np0005532585.localdomain podman[32243]: 2025-11-23 07:42:00.775429965 +0000 UTC m=+0.203010366 container init 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, vcs-type=git, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs mount
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluefs mount shared_bdev_used = 4718592
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: RocksDB version: 7.9.2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Git sha 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DB SUMMARY
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DB Session ID:  F6DONMZYLUSPVQLEPLPI
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: CURRENT file:  CURRENT
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.error_if_exists: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.create_if_missing: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                     Options.env: 0x55d7b8274460
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                Options.info_log: 0x55d7b914e100
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                              Options.statistics: (nil)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.use_fsync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                              Options.db_log_dir: 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                                 Options.wal_dir: db.wal
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.write_buffer_manager: 0x55d7b81395e0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.unordered_write: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.row_cache: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                              Options.wal_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.two_write_queues: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.wal_compression: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.atomic_flush: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_background_jobs: 4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_background_compactions: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_subcompactions: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.max_open_files: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Compression algorithms supported:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kZSTD supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kXpressCompression supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kBZip2Compression supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kLZ4Compression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kZlibCompression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kLZ4HCCompression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         kSnappyCompression supported: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: 
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain podman[32243]: 2025-11-23 07:42:00.78950511 +0000 UTC m=+0.217085561 container start 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 07:42:00 np0005532585.localdomain podman[32243]: 2025-11-23 07:42:00.793226756 +0000 UTC m=+0.220807227 container attach 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b81262d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e440)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8127610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e440)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8127610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e440)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d7b8127610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720807330, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720813538, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883720, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b", "db_session_id": "F6DONMZYLUSPVQLEPLPI", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720818477, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883720, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b", "db_session_id": "F6DONMZYLUSPVQLEPLPI", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720822806, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883720, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b", "db_session_id": "F6DONMZYLUSPVQLEPLPI", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720826981, "job": 1, "event": "recovery_finished"}
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d7b90cc700
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: DB pointer 0x55d7b901fa00
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: _get_class not permitted to load lua
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: _get_class not permitted to load sdk
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: _get_class not permitted to load test_remote_reads
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 load_pgs
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 load_pgs opened 0 pgs
Nov 23 07:42:00 np0005532585.localdomain ceph-osd[31905]: osd.0 0 log_to_monitors true
Nov 23 07:42:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0[31901]: 2025-11-23T07:42:00.866+0000 7ff7bc464a80 -1 osd.0 0 log_to_monitors true
Nov 23 07:42:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test[32260]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 23 07:42:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test[32260]:                             [--no-systemd] [--no-tmpfs]
Nov 23 07:42:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test[32260]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: libpod-3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656.scope: Deactivated successfully.
Nov 23 07:42:01 np0005532585.localdomain podman[32243]: 2025-11-23 07:42:01.013801714 +0000 UTC m=+0.441382135 container died 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, release=553, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph)
Nov 23 07:42:01 np0005532585.localdomain podman[32480]: 2025-11-23 07:42:01.088782096 +0000 UTC m=+0.066499197 container remove 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: libpod-conmon-3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656.scope: Deactivated successfully.
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3c0e124341823678b30f6dad76696bef77ca5ee037913337adc23f1cc972227e-merged.mount: Deactivated successfully.
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:42:01 np0005532585.localdomain systemd-rc-local-generator[32533]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:42:01 np0005532585.localdomain systemd-sysv-generator[32539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:42:01 np0005532585.localdomain systemd-rc-local-generator[32578]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:42:01 np0005532585.localdomain systemd-sysv-generator[32583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:42:01 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 23 07:42:01 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 23 07:42:01 np0005532585.localdomain systemd[1]: Starting Ceph osd.3 for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 2025-11-23 07:42:02.16007229 +0000 UTC m=+0.078013646 container create 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, release=553, architecture=x86_64)
Nov 23 07:42:02 np0005532585.localdomain systemd[1]: tmp-crun.h9V2kN.mount: Deactivated successfully.
Nov 23 07:42:02 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 2025-11-23 07:42:02.125499443 +0000 UTC m=+0.043440799 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 2025-11-23 07:42:02.265979276 +0000 UTC m=+0.183920602 container init 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, vcs-type=git, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55)
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 2025-11-23 07:42:02.273104567 +0000 UTC m=+0.191045923 container start 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 2025-11-23 07:42:02.273448749 +0000 UTC m=+0.191390095 container attach 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0 done with init, starting boot process
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0 start_boot
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 23 07:42:02 np0005532585.localdomain ceph-osd[31905]: osd.0 0  bench count 12288000 bsize 4 KiB
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Nov 23 07:42:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: --> ceph-volume raw activate successful for osd ID: 3
Nov 23 07:42:02 np0005532585.localdomain bash[32642]: --> ceph-volume raw activate successful for osd ID: 3
Nov 23 07:42:02 np0005532585.localdomain systemd[1]: libpod-2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e.scope: Deactivated successfully.
Nov 23 07:42:02 np0005532585.localdomain podman[32642]: 2025-11-23 07:42:02.953298044 +0000 UTC m=+0.871239430 container died 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 07:42:03 np0005532585.localdomain podman[32777]: 2025-11-23 07:42:03.052269617 +0000 UTC m=+0.088500200 container remove 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 23 07:42:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249-merged.mount: Deactivated successfully.
Nov 23 07:42:03 np0005532585.localdomain podman[32840]: 
Nov 23 07:42:03 np0005532585.localdomain podman[32840]: 2025-11-23 07:42:03.350125404 +0000 UTC m=+0.083171680 container create 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64)
Nov 23 07:42:03 np0005532585.localdomain podman[32840]: 2025-11-23 07:42:03.317651887 +0000 UTC m=+0.050698183 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:03 np0005532585.localdomain podman[32840]: 2025-11-23 07:42:03.492837053 +0000 UTC m=+0.225883329 container init 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 07:42:03 np0005532585.localdomain podman[32840]: 2025-11-23 07:42:03.501376641 +0000 UTC m=+0.234422907 container start 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, RELEASE=main, release=553, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 07:42:03 np0005532585.localdomain bash[32840]: 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918
Nov 23 07:42:03 np0005532585.localdomain systemd[1]: Started Ceph osd.3 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: pidfile_write: ignore empty --pid-file
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 07:42:03 np0005532585.localdomain sudo[31933]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:03 np0005532585.localdomain sudo[32871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:42:03 np0005532585.localdomain sudo[32871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:03 np0005532585.localdomain sudo[32871]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:03 np0005532585.localdomain sudo[32886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- raw list --format json
Nov 23 07:42:03 np0005532585.localdomain sudo[32886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:03 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: load: jerasure load: lrc 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 2025-11-23 07:42:04.249994629 +0000 UTC m=+0.066934581 container create 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, version=7, GIT_BRANCH=main, architecture=x86_64, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: Started libpod-conmon-3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0.scope.
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 2025-11-23 07:42:04.213549519 +0000 UTC m=+0.030489521 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 2025-11-23 07:42:04.330658803 +0000 UTC m=+0.147598765 container init 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, version=7, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 07:42:04 np0005532585.localdomain heuristic_darwin[32971]: 167 167
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 2025-11-23 07:42:04.342171642 +0000 UTC m=+0.159111604 container start 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: libpod-3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0.scope: Deactivated successfully.
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 2025-11-23 07:42:04.342375189 +0000 UTC m=+0.159315151 container attach 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, architecture=x86_64, RELEASE=main, version=7, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 07:42:04 np0005532585.localdomain podman[32955]: 2025-11-23 07:42:04.343501177 +0000 UTC m=+0.160441189 container died 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs mount
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs mount shared_bdev_used = 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: RocksDB version: 7.9.2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Git sha 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DB SUMMARY
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DB Session ID:  F8LN5U3KUYCOQ4JP56ZW
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: CURRENT file:  CURRENT
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.error_if_exists: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.create_if_missing: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                     Options.env: 0x56377a0bacb0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                Options.info_log: 0x56377ada0300
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                              Options.statistics: (nil)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.use_fsync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                              Options.db_log_dir: 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                 Options.wal_dir: db.wal
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.write_buffer_manager: 0x563779e10140
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.unordered_write: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.row_cache: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                              Options.wal_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.two_write_queues: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.wal_compression: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.atomic_flush: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_background_jobs: 4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_background_compactions: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_subcompactions: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.max_open_files: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Compression algorithms supported:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kZSTD supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kXpressCompression supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kBZip2Compression supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kLZ4Compression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kZlibCompression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kLZ4HCCompression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kSnappyCompression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d253d8ba80c0c9cde042619afdb1e293080e44cf6cffbd41363a62af66af008d-merged.mount: Deactivated successfully.
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada06e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada06e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada06e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f0255131-8bda-4abb-b8a5-2cf651f3fb8a
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724420048, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724420338, "job": 1, "event": "recovery_finished"}
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: freelist init
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: freelist _read_cfg
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs umount
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 07:42:04 np0005532585.localdomain podman[32977]: 2025-11-23 07:42:04.439926053 +0000 UTC m=+0.082817108 container remove 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: libpod-conmon-3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0.scope: Deactivated successfully.
Nov 23 07:42:04 np0005532585.localdomain podman[33191]: 
Nov 23 07:42:04 np0005532585.localdomain podman[33191]: 2025-11-23 07:42:04.620891354 +0000 UTC m=+0.055322089 container create 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, version=7, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True)
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: Started libpod-conmon-7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2.scope.
Nov 23 07:42:04 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs mount
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluefs mount shared_bdev_used = 4718592
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: RocksDB version: 7.9.2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Git sha 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DB SUMMARY
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DB Session ID:  F8LN5U3KUYCOQ4JP56ZX
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: CURRENT file:  CURRENT
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.error_if_exists: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.create_if_missing: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                     Options.env: 0x56377a0bbdc0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                Options.info_log: 0x56377ada1be0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                              Options.statistics: (nil)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.use_fsync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                              Options.db_log_dir: 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                                 Options.wal_dir: db.wal
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.write_buffer_manager: 0x563779e10140
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.unordered_write: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.row_cache: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                              Options.wal_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.two_write_queues: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.wal_compression: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.atomic_flush: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_background_jobs: 4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_background_compactions: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_subcompactions: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.max_open_files: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Compression algorithms supported:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kZSTD supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kXpressCompression supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kBZip2Compression supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kLZ4Compression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kZlibCompression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kLZ4HCCompression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         kSnappyCompression supported: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: 
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dfe2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain podman[33191]: 2025-11-23 07:42:04.594301855 +0000 UTC m=+0.028732580 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1fe0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dff610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1fe0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dff610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:           Options.merge_operator: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1fe0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x563779dff610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.compression: LZ4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.num_levels: 7
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.bloom_locality: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                               Options.ttl: 2592000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                       Options.enable_blob_files: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                           Options.min_blob_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f0255131-8bda-4abb-b8a5-2cf651f3fb8a
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724699645, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724709952, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883724, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f0255131-8bda-4abb-b8a5-2cf651f3fb8a", "db_session_id": "F8LN5U3KUYCOQ4JP56ZX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 23 07:42:04 np0005532585.localdomain podman[33191]: 2025-11-23 07:42:04.711509953 +0000 UTC m=+0.145940688 container init 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, io.openshift.expose-services=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.33.12, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724718501, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883724, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f0255131-8bda-4abb-b8a5-2cf651f3fb8a", "db_session_id": "F8LN5U3KUYCOQ4JP56ZX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 23 07:42:04 np0005532585.localdomain podman[33191]: 2025-11-23 07:42:04.719366699 +0000 UTC m=+0.153797464 container start 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64)
Nov 23 07:42:04 np0005532585.localdomain podman[33191]: 2025-11-23 07:42:04.719623297 +0000 UTC m=+0.154054062 container attach 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724722359, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883724, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f0255131-8bda-4abb-b8a5-2cf651f3fb8a", "db_session_id": "F8LN5U3KUYCOQ4JP56ZX", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724730714, "job": 1, "event": "recovery_finished"}
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x563779e6a700
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: DB pointer 0x56377acf7a00
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super done
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: _get_class not permitted to load lua
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: _get_class not permitted to load sdk
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: _get_class not permitted to load test_remote_reads
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 load_pgs
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 load_pgs opened 0 pgs
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[32858]: osd.3 0 log_to_monitors true
Nov 23 07:42:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3[32854]: 2025-11-23T07:42:04.784+0000 7f3b1596aa80 -1 osd.3 0 log_to_monitors true
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 35.258 iops: 9025.931 elapsed_sec: 0.332
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [WRN] : OSD bench result of 9025.930813 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 0 waiting for initial osdmap
Nov 23 07:42:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0[31901]: 2025-11-23T07:42:04.786+0000 7ff7b8bf8640 -1 osd.0 0 waiting for initial osdmap
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 crush map has features 288514050185494528, adjusting msgr requires for clients
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 crush map has features 3314932999778484224, adjusting msgr requires for osds
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 check_osdmap_features require_osd_release unknown -> reef
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 set_numa_affinity not setting numa affinity
Nov 23 07:42:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0[31901]: 2025-11-23T07:42:04.802+0000 7ff7b3a0d640 -1 osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 23 07:42:04 np0005532585.localdomain ceph-osd[31905]: osd.0 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]: {
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:     "445e9929-9fbe-437e-be2a-5f2d52ad535b": {
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "ceph_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b",
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "osd_id": 3,
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "osd_uuid": "445e9929-9fbe-437e-be2a-5f2d52ad535b",
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "type": "bluestore"
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:     },
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:     "f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90": {
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "ceph_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b",
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "osd_id": 0,
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "osd_uuid": "f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90",
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:         "type": "bluestore"
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]:     }
Nov 23 07:42:05 np0005532585.localdomain nifty_haibt[33206]: }
Nov 23 07:42:05 np0005532585.localdomain systemd[1]: libpod-7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2.scope: Deactivated successfully.
Nov 23 07:42:05 np0005532585.localdomain podman[33191]: 2025-11-23 07:42:05.274647479 +0000 UTC m=+0.709078234 container died 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12)
Nov 23 07:42:05 np0005532585.localdomain systemd[1]: tmp-crun.OpBkAI.mount: Deactivated successfully.
Nov 23 07:42:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f-merged.mount: Deactivated successfully.
Nov 23 07:42:05 np0005532585.localdomain podman[33458]: 2025-11-23 07:42:05.432665754 +0000 UTC m=+0.144915224 container remove 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 07:42:05 np0005532585.localdomain systemd[1]: libpod-conmon-7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2.scope: Deactivated successfully.
Nov 23 07:42:05 np0005532585.localdomain sudo[32886]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:05 np0005532585.localdomain ceph-osd[31905]: osd.0 12 state: booting -> active
Nov 23 07:42:05 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 23 07:42:05 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0 done with init, starting boot process
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0 start_boot
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 23 07:42:06 np0005532585.localdomain ceph-osd[32858]: osd.3 0  bench count 12288000 bsize 4 KiB
Nov 23 07:42:07 np0005532585.localdomain sudo[33473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:42:07 np0005532585.localdomain sudo[33473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:07 np0005532585.localdomain sudo[33473]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:07 np0005532585.localdomain sudo[33488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:42:07 np0005532585.localdomain sudo[33488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:07 np0005532585.localdomain sudo[33488]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:07 np0005532585.localdomain sudo[33503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 07:42:07 np0005532585.localdomain sudo[33503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:08 np0005532585.localdomain ceph-osd[31905]: osd.0 14 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 23 07:42:08 np0005532585.localdomain ceph-osd[31905]: osd.0 14 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Nov 23 07:42:08 np0005532585.localdomain ceph-osd[31905]: osd.0 14 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 23 07:42:08 np0005532585.localdomain podman[33584]: 2025-11-23 07:42:08.367925668 +0000 UTC m=+0.090592780 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, release=553)
Nov 23 07:42:08 np0005532585.localdomain podman[33584]: 2025-11-23 07:42:08.474055972 +0000 UTC m=+0.196723064 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 07:42:08 np0005532585.localdomain sudo[33503]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:08 np0005532585.localdomain sudo[33651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:42:08 np0005532585.localdomain sudo[33651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:08 np0005532585.localdomain sudo[33651]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:08 np0005532585.localdomain sudo[33666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:42:08 np0005532585.localdomain sudo[33666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:09 np0005532585.localdomain sudo[33666]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:09 np0005532585.localdomain sudo[33713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:42:09 np0005532585.localdomain sudo[33713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:09 np0005532585.localdomain sudo[33713]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 26.447 iops: 6770.493 elapsed_sec: 0.443
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [WRN] : OSD bench result of 6770.493371 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 0 waiting for initial osdmap
Nov 23 07:42:09 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3[32854]: 2025-11-23T07:42:09.570+0000 7f3b118e9640 -1 osd.3 0 waiting for initial osdmap
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 check_osdmap_features require_osd_release unknown -> reef
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 set_numa_affinity not setting numa affinity
Nov 23 07:42:09 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3[32854]: 2025-11-23T07:42:09.588+0000 7f3b0cf13640 -1 osd.3 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 15 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Nov 23 07:42:09 np0005532585.localdomain sudo[33730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 07:42:09 np0005532585.localdomain sudo[33730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:09 np0005532585.localdomain ceph-osd[32858]: osd.3 16 state: booting -> active
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 2025-11-23 07:42:10.215155823 +0000 UTC m=+0.072350124 container create 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, version=7, vcs-type=git, io.openshift.expose-services=, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970.scope.
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 2025-11-23 07:42:10.283334145 +0000 UTC m=+0.140528446 container init 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_BRANCH=main, version=7, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.33.12)
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 2025-11-23 07:42:10.185292455 +0000 UTC m=+0.042486766 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 2025-11-23 07:42:10.29533896 +0000 UTC m=+0.152533271 container start 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, version=7, GIT_CLEAN=True, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7)
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 2025-11-23 07:42:10.29564728 +0000 UTC m=+0.152841591 container attach 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 23 07:42:10 np0005532585.localdomain dreamy_easley[33800]: 167 167
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: libpod-7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970.scope: Deactivated successfully.
Nov 23 07:42:10 np0005532585.localdomain podman[33784]: 2025-11-23 07:42:10.299188711 +0000 UTC m=+0.156383072 container died 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, ceph=True, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8cc95b34d97b5e8919b6489676deb70e99b6f013ec8fed680172a938a7859e1f-merged.mount: Deactivated successfully.
Nov 23 07:42:10 np0005532585.localdomain podman[33805]: 2025-11-23 07:42:10.469152299 +0000 UTC m=+0.158150261 container remove 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True)
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: libpod-conmon-7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970.scope: Deactivated successfully.
Nov 23 07:42:10 np0005532585.localdomain podman[33826]: 
Nov 23 07:42:10 np0005532585.localdomain podman[33826]: 2025-11-23 07:42:10.684243802 +0000 UTC m=+0.085273980 container create 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536.scope.
Nov 23 07:42:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:42:10 np0005532585.localdomain podman[33826]: 2025-11-23 07:42:10.642817783 +0000 UTC m=+0.043847961 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 07:42:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 07:42:10 np0005532585.localdomain podman[33826]: 2025-11-23 07:42:10.799940579 +0000 UTC m=+0.200970757 container init 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 23 07:42:10 np0005532585.localdomain podman[33826]: 2025-11-23 07:42:10.813581949 +0000 UTC m=+0.214612117 container start 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.33.12, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55)
Nov 23 07:42:10 np0005532585.localdomain podman[33826]: 2025-11-23 07:42:10.814022134 +0000 UTC m=+0.215052302 container attach 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553)
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]: [
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:     {
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "available": false,
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "ceph_device": false,
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "lsm_data": {},
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "lvs": [],
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "path": "/dev/sr0",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "rejected_reasons": [
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "Has a FileSystem",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "Insufficient space (<5GB)"
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         ],
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         "sys_api": {
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "actuators": null,
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "device_nodes": "sr0",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "human_readable_size": "482.00 KB",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "id_bus": "ata",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "model": "QEMU DVD-ROM",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "nr_requests": "2",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "partitions": {},
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "path": "/dev/sr0",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "removable": "1",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "rev": "2.5+",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "ro": "0",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "rotational": "1",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "sas_address": "",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "sas_device_handle": "",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "scheduler_mode": "mq-deadline",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "sectors": 0,
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "sectorsize": "2048",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "size": 493568.0,
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "support_discard": "0",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "type": "disk",
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:             "vendor": "QEMU"
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:         }
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]:     }
Nov 23 07:42:11 np0005532585.localdomain crazy_chandrasekhar[33841]: ]
Nov 23 07:42:11 np0005532585.localdomain systemd[1]: libpod-0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536.scope: Deactivated successfully.
Nov 23 07:42:11 np0005532585.localdomain podman[33826]: 2025-11-23 07:42:11.58359845 +0000 UTC m=+0.984628678 container died 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, version=7)
Nov 23 07:42:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016-merged.mount: Deactivated successfully.
Nov 23 07:42:11 np0005532585.localdomain podman[35138]: 2025-11-23 07:42:11.661368977 +0000 UTC m=+0.071606770 container remove 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, version=7, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 23 07:42:11 np0005532585.localdomain systemd[1]: libpod-conmon-0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536.scope: Deactivated successfully.
Nov 23 07:42:11 np0005532585.localdomain sudo[33730]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:11 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 17 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=17) [2,4,3] r=2 lpr=17 pi=[14,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 07:42:13 np0005532585.localdomain sudo[35153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:42:13 np0005532585.localdomain sudo[35153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:13 np0005532585.localdomain sudo[35153]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:14 np0005532585.localdomain sshd[35168]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:42:20 np0005532585.localdomain sudo[35169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:42:20 np0005532585.localdomain sudo[35169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:20 np0005532585.localdomain sudo[35169]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:20 np0005532585.localdomain sudo[35184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 07:42:20 np0005532585.localdomain sudo[35184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:20 np0005532585.localdomain systemd[26178]: Starting Mark boot as successful...
Nov 23 07:42:20 np0005532585.localdomain systemd[26178]: Finished Mark boot as successful.
Nov 23 07:42:20 np0005532585.localdomain podman[35269]: 2025-11-23 07:42:20.878276651 +0000 UTC m=+0.080956885 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553)
Nov 23 07:42:21 np0005532585.localdomain podman[35269]: 2025-11-23 07:42:21.014336896 +0000 UTC m=+0.217017120 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True)
Nov 23 07:42:21 np0005532585.localdomain sudo[35184]: pam_unix(sudo:session): session closed for user root
Nov 23 07:42:21 np0005532585.localdomain sudo[35333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:42:21 np0005532585.localdomain sudo[35333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:42:21 np0005532585.localdomain sudo[35333]: pam_unix(sudo:session): session closed for user root
Nov 23 07:43:07 np0005532585.localdomain sshd[35349]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:43:08 np0005532585.localdomain sshd[35349]: Invalid user julius from 187.72.57.82 port 33993
Nov 23 07:43:08 np0005532585.localdomain sshd[35349]: Received disconnect from 187.72.57.82 port 33993:11: Bye Bye [preauth]
Nov 23 07:43:08 np0005532585.localdomain sshd[35349]: Disconnected from invalid user julius 187.72.57.82 port 33993 [preauth]
Nov 23 07:43:22 np0005532585.localdomain sudo[35351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:43:22 np0005532585.localdomain sudo[35351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:43:22 np0005532585.localdomain sudo[35351]: pam_unix(sudo:session): session closed for user root
Nov 23 07:43:22 np0005532585.localdomain sudo[35366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 07:43:22 np0005532585.localdomain sudo[35366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:43:22 np0005532585.localdomain sshd[35381]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:43:22 np0005532585.localdomain podman[35453]: 2025-11-23 07:43:22.862127591 +0000 UTC m=+0.092373554 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 07:43:22 np0005532585.localdomain podman[35453]: 2025-11-23 07:43:22.990397815 +0000 UTC m=+0.220643618 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, version=7, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 07:43:23 np0005532585.localdomain sudo[35366]: pam_unix(sudo:session): session closed for user root
Nov 23 07:43:23 np0005532585.localdomain sudo[35518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:43:23 np0005532585.localdomain sudo[35518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:43:23 np0005532585.localdomain sudo[35518]: pam_unix(sudo:session): session closed for user root
Nov 23 07:43:23 np0005532585.localdomain sudo[35533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:43:23 np0005532585.localdomain sudo[35533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:43:23 np0005532585.localdomain sshd[35381]: Invalid user user12 from 34.124.148.87 port 43996
Nov 23 07:43:24 np0005532585.localdomain sshd[35381]: Received disconnect from 34.124.148.87 port 43996:11: Bye Bye [preauth]
Nov 23 07:43:24 np0005532585.localdomain sshd[35381]: Disconnected from invalid user user12 34.124.148.87 port 43996 [preauth]
Nov 23 07:43:24 np0005532585.localdomain sudo[35533]: pam_unix(sudo:session): session closed for user root
Nov 23 07:43:24 np0005532585.localdomain sudo[35580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:43:24 np0005532585.localdomain sudo[35580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:43:24 np0005532585.localdomain sudo[35580]: pam_unix(sudo:session): session closed for user root
Nov 23 07:43:31 np0005532585.localdomain sshd[24673]: Received disconnect from 192.168.122.100 port 35880:11: disconnected by user
Nov 23 07:43:31 np0005532585.localdomain sshd[24673]: Disconnected from user zuul 192.168.122.100 port 35880
Nov 23 07:43:31 np0005532585.localdomain sshd[24670]: pam_unix(sshd:session): session closed for user zuul
Nov 23 07:43:31 np0005532585.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Nov 23 07:43:31 np0005532585.localdomain systemd[1]: session-13.scope: Consumed 21.505s CPU time.
Nov 23 07:43:31 np0005532585.localdomain systemd-logind[761]: Session 13 logged out. Waiting for processes to exit.
Nov 23 07:43:31 np0005532585.localdomain systemd-logind[761]: Removed session 13.
Nov 23 07:43:43 np0005532585.localdomain sshd[28976]: fatal: Timeout before authentication for 61.233.4.50 port 47627
Nov 23 07:43:48 np0005532585.localdomain sshd[35595]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:43:48 np0005532585.localdomain sshd[35595]: error: kex_exchange_identification: banner line contains invalid characters
Nov 23 07:43:48 np0005532585.localdomain sshd[35595]: banner exchange: Connection from 64.62.156.142 port 62992: invalid format
Nov 23 07:44:14 np0005532585.localdomain sshd[35168]: fatal: Timeout before authentication for 70.166.167.59 port 40074
Nov 23 07:44:24 np0005532585.localdomain sudo[35596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:44:24 np0005532585.localdomain sudo[35596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:44:24 np0005532585.localdomain sudo[35596]: pam_unix(sudo:session): session closed for user root
Nov 23 07:44:24 np0005532585.localdomain sudo[35611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:44:24 np0005532585.localdomain sudo[35611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:44:25 np0005532585.localdomain sudo[35611]: pam_unix(sudo:session): session closed for user root
Nov 23 07:44:25 np0005532585.localdomain sudo[35657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:44:25 np0005532585.localdomain sudo[35657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:44:26 np0005532585.localdomain sudo[35657]: pam_unix(sudo:session): session closed for user root
Nov 23 07:44:46 np0005532585.localdomain sshd[35672]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:44:47 np0005532585.localdomain sshd[35672]: Invalid user sasan from 187.72.57.82 port 47793
Nov 23 07:44:47 np0005532585.localdomain sshd[35672]: Received disconnect from 187.72.57.82 port 47793:11: Bye Bye [preauth]
Nov 23 07:44:47 np0005532585.localdomain sshd[35672]: Disconnected from invalid user sasan 187.72.57.82 port 47793 [preauth]
Nov 23 07:44:53 np0005532585.localdomain sshd[35674]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:44:57 np0005532585.localdomain sshd[35674]: Connection closed by authenticating user operator 103.93.201.42 port 38068 [preauth]
Nov 23 07:44:58 np0005532585.localdomain sshd[35676]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:44:59 np0005532585.localdomain sshd[35676]: Invalid user printer from 34.124.148.87 port 52254
Nov 23 07:44:59 np0005532585.localdomain sshd[35676]: Received disconnect from 34.124.148.87 port 52254:11: Bye Bye [preauth]
Nov 23 07:44:59 np0005532585.localdomain sshd[35676]: Disconnected from invalid user printer 34.124.148.87 port 52254 [preauth]
Nov 23 07:45:26 np0005532585.localdomain sudo[35678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:45:26 np0005532585.localdomain sudo[35678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:45:26 np0005532585.localdomain sudo[35678]: pam_unix(sudo:session): session closed for user root
Nov 23 07:45:26 np0005532585.localdomain sudo[35693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:45:26 np0005532585.localdomain sudo[35693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:45:26 np0005532585.localdomain sudo[35693]: pam_unix(sudo:session): session closed for user root
Nov 23 07:45:27 np0005532585.localdomain sudo[35739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:45:27 np0005532585.localdomain sudo[35739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:45:27 np0005532585.localdomain sudo[35739]: pam_unix(sudo:session): session closed for user root
Nov 23 07:45:36 np0005532585.localdomain systemd[26178]: Created slice User Background Tasks Slice.
Nov 23 07:45:36 np0005532585.localdomain systemd[26178]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 07:45:36 np0005532585.localdomain systemd[26178]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 07:46:21 np0005532585.localdomain sshd[35755]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:46:22 np0005532585.localdomain sshd[35755]: Invalid user aris from 187.72.57.82 port 33357
Nov 23 07:46:22 np0005532585.localdomain sshd[35755]: Received disconnect from 187.72.57.82 port 33357:11: Bye Bye [preauth]
Nov 23 07:46:22 np0005532585.localdomain sshd[35755]: Disconnected from invalid user aris 187.72.57.82 port 33357 [preauth]
Nov 23 07:46:27 np0005532585.localdomain sudo[35757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:46:27 np0005532585.localdomain sudo[35757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:46:27 np0005532585.localdomain sudo[35757]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:27 np0005532585.localdomain sudo[35772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:46:27 np0005532585.localdomain sudo[35772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:46:28 np0005532585.localdomain sudo[35772]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:28 np0005532585.localdomain sudo[35820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:46:28 np0005532585.localdomain sudo[35820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:46:28 np0005532585.localdomain sudo[35820]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:32 np0005532585.localdomain sshd[35835]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:46:33 np0005532585.localdomain sshd[35835]: Invalid user ubuntu from 34.124.148.87 port 38184
Nov 23 07:46:33 np0005532585.localdomain sshd[35835]: Received disconnect from 34.124.148.87 port 38184:11: Bye Bye [preauth]
Nov 23 07:46:33 np0005532585.localdomain sshd[35835]: Disconnected from invalid user ubuntu 34.124.148.87 port 38184 [preauth]
Nov 23 07:46:56 np0005532585.localdomain sshd[35837]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:46:56 np0005532585.localdomain sshd[35837]: Accepted publickey for zuul from 192.168.122.100 port 34248 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:46:56 np0005532585.localdomain systemd-logind[761]: New session 27 of user zuul.
Nov 23 07:46:56 np0005532585.localdomain systemd[1]: Started Session 27 of User zuul.
Nov 23 07:46:56 np0005532585.localdomain sshd[35837]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 07:46:56 np0005532585.localdomain sudo[35883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfxozofilkximbxveckderknneifxixc ; /usr/bin/python3
Nov 23 07:46:56 np0005532585.localdomain sudo[35883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:56 np0005532585.localdomain python3[35885]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 23 07:46:56 np0005532585.localdomain sudo[35883]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:57 np0005532585.localdomain sudo[35928]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwinaotgnttybdxbszilpiojtmdgauza ; /usr/bin/python3
Nov 23 07:46:57 np0005532585.localdomain sudo[35928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:57 np0005532585.localdomain python3[35930]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 07:46:57 np0005532585.localdomain sudo[35928]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:57 np0005532585.localdomain sudo[35948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugltluxjtkejwfcwwaaswwopkrepcmgw ; /usr/bin/python3
Nov 23 07:46:57 np0005532585.localdomain sudo[35948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:58 np0005532585.localdomain python3[35950]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 23 07:46:58 np0005532585.localdomain useradd[35952]: new group: name=tripleo-admin, GID=1003
Nov 23 07:46:58 np0005532585.localdomain useradd[35952]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Nov 23 07:46:58 np0005532585.localdomain sudo[35948]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:58 np0005532585.localdomain sudo[36004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmuvvqaaknxabefwlgalhcfoeykppijk ; /usr/bin/python3
Nov 23 07:46:58 np0005532585.localdomain sudo[36004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:58 np0005532585.localdomain python3[36006]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:46:58 np0005532585.localdomain sudo[36004]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:58 np0005532585.localdomain sudo[36047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vivlzkcpmwykmggnchdjfzabbdloyaqk ; /usr/bin/python3
Nov 23 07:46:58 np0005532585.localdomain sudo[36047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:58 np0005532585.localdomain python3[36049]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763884018.3521554-66724-13791909958476/source _original_basename=tmpq65hmubd follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:46:58 np0005532585.localdomain sudo[36047]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:59 np0005532585.localdomain sudo[36077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulwctqifwmqvpwdnaabvgvxvhnefpslm ; /usr/bin/python3
Nov 23 07:46:59 np0005532585.localdomain sudo[36077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:59 np0005532585.localdomain python3[36079]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:46:59 np0005532585.localdomain sudo[36077]: pam_unix(sudo:session): session closed for user root
Nov 23 07:46:59 np0005532585.localdomain sudo[36093]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcntpdqhgrwmgktoldyoztisiotheebc ; /usr/bin/python3
Nov 23 07:46:59 np0005532585.localdomain sudo[36093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:46:59 np0005532585.localdomain python3[36095]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:46:59 np0005532585.localdomain sudo[36093]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:00 np0005532585.localdomain sudo[36109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dspsooydrcrpttshfxvfuoyknfyigbsu ; /usr/bin/python3
Nov 23 07:47:00 np0005532585.localdomain sudo[36109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:47:00 np0005532585.localdomain python3[36111]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:47:00 np0005532585.localdomain sudo[36109]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:00 np0005532585.localdomain sudo[36125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukdcllsdtrpwxqovmwaqgknnthgruztm ; /usr/bin/python3
Nov 23 07:47:00 np0005532585.localdomain sudo[36125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 07:47:00 np0005532585.localdomain python3[36127]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:47:00 np0005532585.localdomain sudo[36125]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:01 np0005532585.localdomain anacron[19004]: Job `cron.monthly' started
Nov 23 07:47:01 np0005532585.localdomain anacron[19004]: Job `cron.monthly' terminated
Nov 23 07:47:01 np0005532585.localdomain anacron[19004]: Normal exit (3 jobs run)
Nov 23 07:47:01 np0005532585.localdomain python3[36141]: ansible-ping Invoked with data=pong
Nov 23 07:47:12 np0005532585.localdomain sshd[36144]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:47:12 np0005532585.localdomain sshd[36144]: Accepted publickey for tripleo-admin from 192.168.122.100 port 55476 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 07:47:12 np0005532585.localdomain systemd-logind[761]: New session 28 of user tripleo-admin.
Nov 23 07:47:12 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 23 07:47:12 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 23 07:47:12 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 23 07:47:12 np0005532585.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Queued start job for default target Main User Target.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Created slice User Application Slice.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Reached target Paths.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Reached target Timers.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Starting D-Bus User Message Bus Socket...
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Starting Create User's Volatile Files and Directories...
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Listening on D-Bus User Message Bus Socket.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Finished Create User's Volatile Files and Directories.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Reached target Sockets.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Reached target Basic System.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Reached target Main User Target.
Nov 23 07:47:12 np0005532585.localdomain systemd[36148]: Startup finished in 117ms.
Nov 23 07:47:12 np0005532585.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 23 07:47:12 np0005532585.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Nov 23 07:47:12 np0005532585.localdomain sshd[36144]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 23 07:47:13 np0005532585.localdomain sudo[36208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcufrmyxmjuafpxlczygonauxsnkaolx ; /usr/bin/python3
Nov 23 07:47:13 np0005532585.localdomain sudo[36208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:13 np0005532585.localdomain python3[36210]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 07:47:13 np0005532585.localdomain sudo[36208]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:18 np0005532585.localdomain sudo[36228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icqiuhjyqjeqrczkraxinscurvbrzvoa ; /usr/bin/python3
Nov 23 07:47:18 np0005532585.localdomain sudo[36228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:18 np0005532585.localdomain python3[36230]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Nov 23 07:47:18 np0005532585.localdomain sudo[36228]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:19 np0005532585.localdomain sudo[36244]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpvtuzvznpfpdleqmdkkfrcnumrqymca ; /usr/bin/python3
Nov 23 07:47:19 np0005532585.localdomain sudo[36244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:19 np0005532585.localdomain python3[36246]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 23 07:47:19 np0005532585.localdomain sudo[36244]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:19 np0005532585.localdomain sudo[36292]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sljvzpvntyosciuwvauqezpwisjvtgtr ; /usr/bin/python3
Nov 23 07:47:19 np0005532585.localdomain sudo[36292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:19 np0005532585.localdomain python3[36294]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.g4zi971qtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:47:19 np0005532585.localdomain sudo[36292]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:20 np0005532585.localdomain sudo[36322]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbmmhsxdgbsyhzoaacviryophmcewwok ; /usr/bin/python3
Nov 23 07:47:20 np0005532585.localdomain sudo[36322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:20 np0005532585.localdomain python3[36324]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.g4zi971qtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:47:20 np0005532585.localdomain sudo[36322]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:21 np0005532585.localdomain sudo[36338]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phrthpxrpzqrmaezkkkemgvzsyajmokt ; /usr/bin/python3
Nov 23 07:47:21 np0005532585.localdomain sudo[36338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:21 np0005532585.localdomain python3[36340]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.g4zi971qtmphosts insertbefore=BOF block=172.17.0.106 np0005532584.localdomain np0005532584
                                                         172.18.0.106 np0005532584.storage.localdomain np0005532584.storage
                                                         172.20.0.106 np0005532584.storagemgmt.localdomain np0005532584.storagemgmt
                                                         172.17.0.106 np0005532584.internalapi.localdomain np0005532584.internalapi
                                                         172.19.0.106 np0005532584.tenant.localdomain np0005532584.tenant
                                                         192.168.122.106 np0005532584.ctlplane.localdomain np0005532584.ctlplane
                                                         172.17.0.107 np0005532585.localdomain np0005532585
                                                         172.18.0.107 np0005532585.storage.localdomain np0005532585.storage
                                                         172.20.0.107 np0005532585.storagemgmt.localdomain np0005532585.storagemgmt
                                                         172.17.0.107 np0005532585.internalapi.localdomain np0005532585.internalapi
                                                         172.19.0.107 np0005532585.tenant.localdomain np0005532585.tenant
                                                         192.168.122.107 np0005532585.ctlplane.localdomain np0005532585.ctlplane
                                                         172.17.0.108 np0005532586.localdomain np0005532586
                                                         172.18.0.108 np0005532586.storage.localdomain np0005532586.storage
                                                         172.20.0.108 np0005532586.storagemgmt.localdomain np0005532586.storagemgmt
                                                         172.17.0.108 np0005532586.internalapi.localdomain np0005532586.internalapi
                                                         172.19.0.108 np0005532586.tenant.localdomain np0005532586.tenant
                                                         192.168.122.108 np0005532586.ctlplane.localdomain np0005532586.ctlplane
                                                         172.17.0.103 np0005532581.localdomain np0005532581
                                                         172.18.0.103 np0005532581.storage.localdomain np0005532581.storage
                                                         172.20.0.103 np0005532581.storagemgmt.localdomain np0005532581.storagemgmt
                                                         172.17.0.103 np0005532581.internalapi.localdomain np0005532581.internalapi
                                                         172.19.0.103 np0005532581.tenant.localdomain np0005532581.tenant
                                                         192.168.122.103 np0005532581.ctlplane.localdomain np0005532581.ctlplane
                                                         172.17.0.104 np0005532582.localdomain np0005532582
                                                         172.18.0.104 np0005532582.storage.localdomain np0005532582.storage
                                                         172.20.0.104 np0005532582.storagemgmt.localdomain np0005532582.storagemgmt
                                                         172.17.0.104 np0005532582.internalapi.localdomain np0005532582.internalapi
                                                         172.19.0.104 np0005532582.tenant.localdomain np0005532582.tenant
                                                         192.168.122.104 np0005532582.ctlplane.localdomain np0005532582.ctlplane
                                                         172.17.0.105 np0005532583.localdomain np0005532583
                                                         172.18.0.105 np0005532583.storage.localdomain np0005532583.storage
                                                         172.20.0.105 np0005532583.storagemgmt.localdomain np0005532583.storagemgmt
                                                         172.17.0.105 np0005532583.internalapi.localdomain np0005532583.internalapi
                                                         172.19.0.105 np0005532583.tenant.localdomain np0005532583.tenant
                                                         192.168.122.105 np0005532583.ctlplane.localdomain np0005532583.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.204  overcloud.storage.localdomain
                                                         172.20.0.141  overcloud.storagemgmt.localdomain
                                                         172.17.0.224  overcloud.internalapi.localdomain
                                                         172.21.0.154  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:47:21 np0005532585.localdomain sudo[36338]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:21 np0005532585.localdomain sudo[36354]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcrmjvzclmszqfgoiitztibmkmzogmlw ; /usr/bin/python3
Nov 23 07:47:21 np0005532585.localdomain sudo[36354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:22 np0005532585.localdomain python3[36356]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.g4zi971qtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:47:22 np0005532585.localdomain sudo[36354]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:22 np0005532585.localdomain sudo[36371]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlckpnrzgqpzwkocvocuucsqmamoyyap ; /usr/bin/python3
Nov 23 07:47:22 np0005532585.localdomain sudo[36371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:22 np0005532585.localdomain python3[36373]: ansible-file Invoked with path=/tmp/ansible.g4zi971qtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:47:22 np0005532585.localdomain sudo[36371]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:23 np0005532585.localdomain sudo[36387]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvpwgzxvehsbdwecjyjonzjdetwglrxd ; /usr/bin/python3
Nov 23 07:47:23 np0005532585.localdomain sudo[36387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:23 np0005532585.localdomain python3[36389]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:47:23 np0005532585.localdomain sudo[36387]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:24 np0005532585.localdomain sudo[36404]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzjgitlygifepghkywscgyiqmefsvrdf ; /usr/bin/python3
Nov 23 07:47:24 np0005532585.localdomain sudo[36404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:24 np0005532585.localdomain python3[36406]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:47:27 np0005532585.localdomain sudo[36404]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:28 np0005532585.localdomain sudo[36424]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elzlanxcfdjkucprvuvqrqqnyacbgfva ; /usr/bin/python3
Nov 23 07:47:28 np0005532585.localdomain sudo[36424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:28 np0005532585.localdomain python3[36426]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:47:28 np0005532585.localdomain sudo[36424]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:29 np0005532585.localdomain sudo[36428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:47:29 np0005532585.localdomain sudo[36428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:47:29 np0005532585.localdomain sudo[36428]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:29 np0005532585.localdomain sudo[36461]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvbsvpiuvilnklnpijuyzrwapxzypeeq ; /usr/bin/python3
Nov 23 07:47:29 np0005532585.localdomain sudo[36461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:47:29 np0005532585.localdomain sudo[36453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:47:29 np0005532585.localdomain sudo[36453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:47:29 np0005532585.localdomain python3[36472]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:47:29 np0005532585.localdomain sudo[36453]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:30 np0005532585.localdomain sudo[36507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:47:30 np0005532585.localdomain sudo[36507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:47:30 np0005532585.localdomain sudo[36507]: pam_unix(sudo:session): session closed for user root
Nov 23 07:47:44 np0005532585.localdomain groupadd[36689]: group added to /etc/group: name=puppet, GID=52
Nov 23 07:47:44 np0005532585.localdomain groupadd[36689]: group added to /etc/gshadow: name=puppet
Nov 23 07:47:44 np0005532585.localdomain groupadd[36689]: new group: name=puppet, GID=52
Nov 23 07:47:44 np0005532585.localdomain useradd[36696]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Nov 23 07:47:57 np0005532585.localdomain sshd[37265]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:47:58 np0005532585.localdomain sshd[37265]: Invalid user gitlab-runner from 187.72.57.82 port 47153
Nov 23 07:47:59 np0005532585.localdomain sshd[37265]: Received disconnect from 187.72.57.82 port 47153:11: Bye Bye [preauth]
Nov 23 07:47:59 np0005532585.localdomain sshd[37265]: Disconnected from invalid user gitlab-runner 187.72.57.82 port 47153 [preauth]
Nov 23 07:48:04 np0005532585.localdomain sshd[37299]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:48:06 np0005532585.localdomain sshd[37299]: Received disconnect from 34.124.148.87 port 59118:11: Bye Bye [preauth]
Nov 23 07:48:06 np0005532585.localdomain sshd[37299]: Disconnected from authenticating user root 34.124.148.87 port 59118 [preauth]
Nov 23 07:48:08 np0005532585.localdomain sshd[37322]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:48:10 np0005532585.localdomain sshd[37322]: Invalid user 1 from 80.94.95.115 port 35360
Nov 23 07:48:10 np0005532585.localdomain sshd[37322]: Connection closed by invalid user 1 80.94.95.115 port 35360 [preauth]
Nov 23 07:48:30 np0005532585.localdomain sudo[37460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:48:30 np0005532585.localdomain sudo[37460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:48:30 np0005532585.localdomain sudo[37460]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:30 np0005532585.localdomain sudo[37475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:48:30 np0005532585.localdomain sudo[37475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:48:31 np0005532585.localdomain sudo[37475]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:31 np0005532585.localdomain sudo[37521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:48:31 np0005532585.localdomain sudo[37521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:48:31 np0005532585.localdomain sudo[37521]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  Converting 2699 SID table entries...
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:48:40 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:48:40 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 23 07:48:40 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:48:40 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 07:48:40 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:48:40 np0005532585.localdomain systemd-rc-local-generator[37668]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:48:40 np0005532585.localdomain systemd-sysv-generator[37671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:48:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:48:41 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 07:48:41 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 07:48:41 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 07:48:41 np0005532585.localdomain systemd[1]: run-r6b924f3f86914144a6454e657bd0b87d.service: Deactivated successfully.
Nov 23 07:48:42 np0005532585.localdomain sudo[36461]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:42 np0005532585.localdomain sudo[38108]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onnpgwnfvwcysaqkzmwkikzrwzlobggs ; /usr/bin/python3
Nov 23 07:48:42 np0005532585.localdomain sudo[38108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:42 np0005532585.localdomain python3[38110]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:43 np0005532585.localdomain sudo[38108]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:43 np0005532585.localdomain sudo[38247]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iggzgqjmsfgmqgekonsxcwksbcufsgus ; /usr/bin/python3
Nov 23 07:48:43 np0005532585.localdomain sudo[38247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:44 np0005532585.localdomain python3[38249]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:48:44 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:48:44 np0005532585.localdomain systemd-rc-local-generator[38275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:48:44 np0005532585.localdomain systemd-sysv-generator[38280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:48:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:48:44 np0005532585.localdomain sudo[38247]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:45 np0005532585.localdomain sudo[38301]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujmxcxsxhddxxnlkiuaadfdtltspdtlv ; /usr/bin/python3
Nov 23 07:48:45 np0005532585.localdomain sudo[38301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:45 np0005532585.localdomain python3[38303]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:48:45 np0005532585.localdomain sudo[38301]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:46 np0005532585.localdomain sudo[38317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmesfnzfakkrrselsneeceycqvcvyssf ; /usr/bin/python3
Nov 23 07:48:46 np0005532585.localdomain sudo[38317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:46 np0005532585.localdomain python3[38319]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:46 np0005532585.localdomain sudo[38317]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:46 np0005532585.localdomain sudo[38334]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuxxgdnbfzpdomysgnsnjpiqpealtvmo ; /usr/bin/python3
Nov 23 07:48:46 np0005532585.localdomain sudo[38334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:47 np0005532585.localdomain python3[38336]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 07:48:47 np0005532585.localdomain sudo[38334]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:47 np0005532585.localdomain sudo[38352]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfaenztypmhepiejbdftlyreomwjnqqg ; /usr/bin/python3
Nov 23 07:48:47 np0005532585.localdomain sudo[38352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:47 np0005532585.localdomain python3[38354]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:48:47 np0005532585.localdomain sudo[38352]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:47 np0005532585.localdomain sudo[38370]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blvbuzvbwbvwrncabmzwhyptzktkfxlp ; /usr/bin/python3
Nov 23 07:48:47 np0005532585.localdomain sudo[38370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:48 np0005532585.localdomain python3[38372]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:48:48 np0005532585.localdomain sudo[38370]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:48 np0005532585.localdomain sudo[38388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chzgeowwvcjhklkrxkoowotzdiabreis ; /usr/bin/python3
Nov 23 07:48:48 np0005532585.localdomain sudo[38388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:48 np0005532585.localdomain python3[38390]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:48:48 np0005532585.localdomain systemd[1]: Reloading Network Manager...
Nov 23 07:48:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763884128.7652] audit: op="reload" arg="0" pid=38393 uid=0 result="success"
Nov 23 07:48:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763884128.7661] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Nov 23 07:48:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763884128.7662] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 23 07:48:48 np0005532585.localdomain systemd[1]: Reloaded Network Manager.
Nov 23 07:48:48 np0005532585.localdomain sudo[38388]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:49 np0005532585.localdomain sudo[38407]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqyztinstwgrvngvhxwlglaojstuwzry ; /usr/bin/python3
Nov 23 07:48:49 np0005532585.localdomain sudo[38407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:50 np0005532585.localdomain python3[38409]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:50 np0005532585.localdomain sudo[38407]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:50 np0005532585.localdomain sudo[38424]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adwxhddmfdjptpgagyjpskqfvrvrecqw ; /usr/bin/python3
Nov 23 07:48:50 np0005532585.localdomain sudo[38424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:50 np0005532585.localdomain python3[38426]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:48:50 np0005532585.localdomain sudo[38424]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:50 np0005532585.localdomain sudo[38442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofqdnslyjuyimdemqspwdwtqmewcddcs ; /usr/bin/python3
Nov 23 07:48:50 np0005532585.localdomain sudo[38442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:50 np0005532585.localdomain python3[38444]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:48:51 np0005532585.localdomain sudo[38442]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:51 np0005532585.localdomain sudo[38458]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isbbaticqkvcpxdctmxsuzkhkhnvmocp ; /usr/bin/python3
Nov 23 07:48:51 np0005532585.localdomain sudo[38458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:51 np0005532585.localdomain python3[38460]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:48:51 np0005532585.localdomain sudo[38458]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:52 np0005532585.localdomain sudo[38474]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocsujzsucksoflgbkowgolvlnmqlhzec ; /usr/bin/python3
Nov 23 07:48:52 np0005532585.localdomain sudo[38474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:52 np0005532585.localdomain python3[38476]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 23 07:48:52 np0005532585.localdomain sudo[38474]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:52 np0005532585.localdomain sudo[38490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgjfvmdijygblaizsmgcbydmyqxbeuoh ; /usr/bin/python3
Nov 23 07:48:52 np0005532585.localdomain sudo[38490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:52 np0005532585.localdomain python3[38492]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:48:52 np0005532585.localdomain sudo[38490]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:53 np0005532585.localdomain sudo[38506]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpfshyyqzvgkzqpvgjumjqclhlogsvmr ; /usr/bin/python3
Nov 23 07:48:53 np0005532585.localdomain sudo[38506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:53 np0005532585.localdomain python3[38508]: ansible-blockinfile Invoked with path=/tmp/ansible.wmck7r26 block=[192.168.122.106]*,[np0005532584.ctlplane.localdomain]*,[172.17.0.106]*,[np0005532584.internalapi.localdomain]*,[172.18.0.106]*,[np0005532584.storage.localdomain]*,[172.20.0.106]*,[np0005532584.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005532584.tenant.localdomain]*,[np0005532584.localdomain]*,[np0005532584]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3OrbPXlomvlluk5pGQwXwJu+cR1IMLHg5EnGcI5epB1SB6q/EzlEo5+bOYmmvILsoesUzBIBq21mRhn1Wi2yjlys0pArFDqiLkUBvTW9ro6MKci9Smc12m7AkLus6UO6h3pzqcOdRZQ3KOQDL/83yYJVBCJyqlISXWzzHJpGRVnZHeT4CgKZ1nG5UEvOrtPXRAVWkz3v5TghJrYXvWaPQPmWcEy1rfhCjkCfQY++JB/Dlgammmd1+ZldadeXQi1b2X02a6GFyW0pUMFLjAP7Wr+KcRa5FIPmGwsPuc1NhveAH6zyLrabrh7jPR5O0tBjz9KcNYXbQmJetGt9ZWzFsl0qzXrvI38q5RlGptbqg0iSez61VBAUtnfs33hnYc3dvzJKXReR76PoU3yu/tLrhdK6szqIVsMdw2LGEro7l3KKMKXHSpi8n77fH8ICiU3F5Oif+nvS/e7xr4LccSEnFEHA9PdNxOWxJYLcxTQCt3BkNFrWw4oB1LiDsn98HlS8=
                                                         [192.168.122.107]*,[np0005532585.ctlplane.localdomain]*,[172.17.0.107]*,[np0005532585.internalapi.localdomain]*,[172.18.0.107]*,[np0005532585.storage.localdomain]*,[172.20.0.107]*,[np0005532585.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005532585.tenant.localdomain]*,[np0005532585.localdomain]*,[np0005532585]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCU6ocW8HWtJJyWPSFUqcN5z70XYnNrE5KeWh/VJ4bDkpVePpxxcdD8r8cKL121q0MKPRgia3jLqnKz+o4MH3AqTAWCZamBc1+ePq9OvZDenK69byea8TM176uYzfePjNlud4LSZ6lfkgneO5jeNE6/RcHgBc8Me+2mlzpavioA814r6Ci6hFaEIOS1Zd2b/yKzI4QRl6xg/aJKvlIe9w3G3BvKOG5pixPx2ng4wYc0OMtJb9ItJgZLY92GGuvVRwn9e0D4lab84+x/Nn3XatQdqU69ev7da/bQCUeBivyEZo03olh56YxCKvNfG3ZYwwhMTn9Hg/EdnwrGHYHj0ZgfSR1+Dzvnk0WW/MRs0276Ojj5O0hhnlaAh5n97W6fgHldGKvdEafYeD602C1Zkd+ISqF13W56MWhtUhiUsdUHShnpM/EBOITg6mTDFP1i/qMS0PjRaCzBpdqpJIoKzQpsi4Z3QTHTZ7uK/lqOEaE/wqXHuYlMKcTuOuX33gIp28k=
                                                         [192.168.122.108]*,[np0005532586.ctlplane.localdomain]*,[172.17.0.108]*,[np0005532586.internalapi.localdomain]*,[172.18.0.108]*,[np0005532586.storage.localdomain]*,[172.20.0.108]*,[np0005532586.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005532586.tenant.localdomain]*,[np0005532586.localdomain]*,[np0005532586]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD6U4JggC29IKqxQ7GjhK23AehQb1S2zLryOxLwLEs9rP0qOZpJ9wR1VsBNLXDCmoRVTsH2+3V00hmkvlanKUuzgmLO61hdur+5NQD0xHnY7lOLpOoyR7hJiMuHj/nRgBLWY2OB8Gim121dgfuc2zRF92igDYe65Uf0et83vWlgRmc7KlziaJ91iVcBUmhGYf3Ij7QxfhQH5TTnGoQizdiBpuP+yVuU2AepbvQ8ZFvzioCwzWAVu/xfdRFp9QyLT4JP1jM6dadTjD5RUAjRL6qR1tLXVq/rvqtXSL8ruBSYm3NCOys9RtdrNolZ7frd+zmvF+VzMNLtlRxiuy1ReR+ZO3felB+4TwfEfLZ+DqE1s3+ksCQH/sVCrxzFsRz5lamWG3p78ZBWTiQ/7WdJS1dQOHz+pKNSSW/NYMIqitxsCsEWPJLq/EWoHVxvjREucCb5YvWHPKOv5RLlbm5lSHFLuFVV8O3AAzD/3JsjTbKGOjJhmtxPCgEy7RPqtIUX90s=
                                                         [192.168.122.103]*,[np0005532581.ctlplane.localdomain]*,[172.17.0.103]*,[np0005532581.internalapi.localdomain]*,[172.18.0.103]*,[np0005532581.storage.localdomain]*,[172.20.0.103]*,[np0005532581.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005532581.tenant.localdomain]*,[np0005532581.localdomain]*,[np0005532581]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRibSMIP5+E9lJWuaKDEuCaJoGhGPTqff+o8SP2Twk+NhPOa5FC7WQhHPLXVhKAtlCX60ckYE53Q/H/RVRZ55JdWQLSdY/1tQCD6c0Ry6N+UD+mxo9iN9cHk6vd6J5kJu+v/gBEmFY1A9pjzsD1CTR8gZJHZFqbUTzXrKkoUjK3Kqa8UtvzyhgYQtYIaUwaf1z7CMNQ3A4EaGVKyRsVwb11jlaT9fjB43E3tp9p5EG6PPJEGux/Xea6iHnhSwZHpkD/ylneDOkBbGvYKhL33bpXMcbuHy32jAFr+2Q07sKvgy/b5/f/nTgNCyxEIpoXUbEhX+Vlh+gycU7KJw6FRyR3dQFjooV97NQ/oov2VP9DnTObziZA8lhaJ20ChTfDVUyvFCFi3dKgBUPCeNWCGI69eNHu3dQcwCNJ3kANqhHdkYpBd00PVBritJfxfzH1DCLo0I9CSi1buWYhein9VHZWtzePv/+ucWERRIo+J04QPkV+6P6vgOTRl5U75RctJU=
                                                         [192.168.122.104]*,[np0005532582.ctlplane.localdomain]*,[172.17.0.104]*,[np0005532582.internalapi.localdomain]*,[172.18.0.104]*,[np0005532582.storage.localdomain]*,[172.20.0.104]*,[np0005532582.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005532582.tenant.localdomain]*,[np0005532582.localdomain]*,[np0005532582]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0v47OVdr7YS/5xSUmMc7u26O7OwPomkdDR6s8rrcencbx7seRSeU00QGeRQcJJ023bD3xk26W8iiJTRUDkYSy//cSfHODdDy+CNEfDUTkGzIjiApoLi2b+S4J6wcAldMsj02MZmx67vUHyM5Qwok+22XqopryL8BiGPJbnoUcZy773f5OKPPMNuj3Fyb7jd5mrC7awK4NniZHyHPYBQeBa234HL42fRjcOqCcxuauy5cbz9PeBv5/kg+nYc8cY5qCyLqNhzMVRUa/PcepMBcfThk17LtPGzCYS7IR2cGdUDP6Pe0QD34Hu6+mpwKwYx73v5uHcmy9CeZ8fK83/F84Lr6jxsiwoU2e+hUfzVRq8gnkjk6kuL86eSM2POSGgBYYgCb+Ma6lOkF1MA+rLAh0gAsUhBgVlz6HtaMoDvLOi/NrQeoQyNE1Pv4vPAndmGGc8A7JCtmCMk9VvMy0Ht4IOvtDJFfx1lg7NuMIKqePYTEk56p8wTUNM+BmdJEhFPU=
                                                         [192.168.122.105]*,[np0005532583.ctlplane.localdomain]*,[172.17.0.105]*,[np0005532583.internalapi.localdomain]*,[172.18.0.105]*,[np0005532583.storage.localdomain]*,[172.20.0.105]*,[np0005532583.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005532583.tenant.localdomain]*,[np0005532583.localdomain]*,[np0005532583]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkB1Cq8AQaEBYTlv5Hzs024jg//D6wieNnvsI5WcYj7wckm9vKTJQfUD6yZBMmyPw6+vVzsM16bj2hagkDR5wkO7uSIaMqWrcoQ1h9HkJQLK8QB0iuzUvQzdr22kUgkLII8thNHK4VxF4VhAKNmzqCofZ4ZSaLUMwauFCFUjx1VJISEZdgYRZ4+++wAN5bdK+WrwSOAHJYJWQX2pRRsPiunSdY1BOUKB3sp7IBcQ3MDJgnKlkR7tiGSYB2W8JsLvIsIb0I2EaqmPUTIzKUuxSJnWEls/WyDT9MNkjhobVeAyFZ5TEik4OvobUhVGJ8CsU7O101KQNQ3IywPM+V0UpjA1yK49z5Qs0LjApmqORsTcjOojYaKGr9n64dVjXdFOMwajB9UmMEFtlIngm6kx7mJQGXqYxVAscW34JY832iKOEzQWrUSdo6mVJ7TXhYYcbdFp+G/128SfhNrbHwKinHeE9Nqu48BR7bmRZXO7ef+UMY1dG3AIvFt4JwFvLihZc=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:48:53 np0005532585.localdomain sudo[38506]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:53 np0005532585.localdomain sudo[38522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvcmztloynwxtodbvgpgplyhpkkzcspg ; /usr/bin/python3
Nov 23 07:48:53 np0005532585.localdomain sudo[38522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:54 np0005532585.localdomain python3[38524]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wmck7r26' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:54 np0005532585.localdomain sudo[38522]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:54 np0005532585.localdomain sudo[38540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udonlghsvgkspfbpucjaasgbohqqnlwj ; /usr/bin/python3
Nov 23 07:48:54 np0005532585.localdomain sudo[38540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:54 np0005532585.localdomain python3[38542]: ansible-file Invoked with path=/tmp/ansible.wmck7r26 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:48:54 np0005532585.localdomain sudo[38540]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:55 np0005532585.localdomain sudo[38556]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vahjomufqbilehbteffgroyuivlhlscd ; /usr/bin/python3
Nov 23 07:48:55 np0005532585.localdomain sudo[38556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:55 np0005532585.localdomain python3[38558]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:48:55 np0005532585.localdomain sudo[38556]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:55 np0005532585.localdomain sudo[38572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmctvuptrhrhlkgzlxwqyhwolpcttgbz ; /usr/bin/python3
Nov 23 07:48:55 np0005532585.localdomain sudo[38572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:55 np0005532585.localdomain python3[38574]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:55 np0005532585.localdomain sudo[38572]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:55 np0005532585.localdomain sudo[38590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgubjpvdconhlieecseudwfhrebrlnnc ; /usr/bin/python3
Nov 23 07:48:55 np0005532585.localdomain sudo[38590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:55 np0005532585.localdomain python3[38592]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:55 np0005532585.localdomain sudo[38590]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:56 np0005532585.localdomain sudo[38609]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyhguoptqvrqqzwanxjuulbznjdrxcpl ; /usr/bin/python3
Nov 23 07:48:56 np0005532585.localdomain sudo[38609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:56 np0005532585.localdomain python3[38611]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Nov 23 07:48:56 np0005532585.localdomain sudo[38609]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:56 np0005532585.localdomain sudo[38625]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhvscptxpkcfepwoiuhncwhswdoypshq ; /usr/bin/python3
Nov 23 07:48:56 np0005532585.localdomain sudo[38625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:56 np0005532585.localdomain sudo[38625]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:57 np0005532585.localdomain sudo[38673]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ituplgztaxfuptzppavcdvgqaushdxxn ; /usr/bin/python3
Nov 23 07:48:57 np0005532585.localdomain sudo[38673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:57 np0005532585.localdomain sudo[38673]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:57 np0005532585.localdomain sudo[38716]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yncabadxnvogsphqciebnosadleenzlp ; /usr/bin/python3
Nov 23 07:48:57 np0005532585.localdomain sudo[38716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:57 np0005532585.localdomain sudo[38716]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:58 np0005532585.localdomain sudo[38746]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anbrtczseretbyhimofvtqwjuyahbsns ; /usr/bin/python3
Nov 23 07:48:58 np0005532585.localdomain sudo[38746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:59 np0005532585.localdomain python3[38748]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:48:59 np0005532585.localdomain sudo[38746]: pam_unix(sudo:session): session closed for user root
Nov 23 07:48:59 np0005532585.localdomain sudo[38763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppotyzoerzfbcshentkbppqagacqbczu ; /usr/bin/python3
Nov 23 07:48:59 np0005532585.localdomain sudo[38763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:48:59 np0005532585.localdomain python3[38765]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:49:02 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:49:02 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:49:02 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:49:02 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 07:49:02 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:49:02 np0005532585.localdomain systemd-rc-local-generator[38836]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:49:02 np0005532585.localdomain systemd-sysv-generator[38839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:49:02 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:49:02 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 07:49:02 np0005532585.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: tuned.service: Consumed 1.621s CPU time.
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 07:49:03 np0005532585.localdomain systemd[1]: run-r66937b8c3a3b4851987bc97de40f15d9.service: Deactivated successfully.
Nov 23 07:49:04 np0005532585.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 07:49:04 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:49:04 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 07:49:04 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 07:49:04 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 07:49:04 np0005532585.localdomain systemd[1]: run-r242fc32500e148adbd6fc9f0ea9df66d.service: Deactivated successfully.
Nov 23 07:49:05 np0005532585.localdomain sudo[38763]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:05 np0005532585.localdomain sudo[39200]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fetzjpqgeczaxwssqchwtppyelrqtnbu ; /usr/bin/python3
Nov 23 07:49:05 np0005532585.localdomain sudo[39200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:05 np0005532585.localdomain python3[39202]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:49:05 np0005532585.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 23 07:49:05 np0005532585.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 23 07:49:05 np0005532585.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 23 07:49:05 np0005532585.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 07:49:07 np0005532585.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 07:49:07 np0005532585.localdomain sudo[39200]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:07 np0005532585.localdomain sudo[39395]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwyevnkgjryyjunutrxahnoafpsuklbq ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 23 07:49:07 np0005532585.localdomain sudo[39395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:07 np0005532585.localdomain python3[39397]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:07 np0005532585.localdomain sudo[39395]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:08 np0005532585.localdomain sudo[39412]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmiywkyfgpxijzpzxlzvcudvabywhgtz ; /usr/bin/python3
Nov 23 07:49:08 np0005532585.localdomain sudo[39412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:08 np0005532585.localdomain python3[39414]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 23 07:49:08 np0005532585.localdomain sudo[39412]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:08 np0005532585.localdomain sudo[39428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebcqslrhgtdctbzgqwulzxckbjbgawax ; /usr/bin/python3
Nov 23 07:49:08 np0005532585.localdomain sudo[39428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:08 np0005532585.localdomain python3[39430]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:49:08 np0005532585.localdomain sudo[39428]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:09 np0005532585.localdomain sudo[39444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmbtdghqhlgefqcsvrnwquhawbhbdodw ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 23 07:49:09 np0005532585.localdomain sudo[39444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:09 np0005532585.localdomain python3[39446]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:10 np0005532585.localdomain sudo[39444]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:11 np0005532585.localdomain sudo[39464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snteujxmitntxxurmvbfqdqxhovhzaoq ; /usr/bin/python3
Nov 23 07:49:11 np0005532585.localdomain sudo[39464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:11 np0005532585.localdomain python3[39466]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:11 np0005532585.localdomain sudo[39464]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:11 np0005532585.localdomain sudo[39481]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eohpoccvtxrqmxlqpsgepikjsyvlibnk ; /usr/bin/python3
Nov 23 07:49:11 np0005532585.localdomain sudo[39481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:11 np0005532585.localdomain python3[39483]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:49:11 np0005532585.localdomain sudo[39481]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:14 np0005532585.localdomain sudo[39497]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbkewcmpxpgdeyfgatqmgyxrnxakynde ; /usr/bin/python3
Nov 23 07:49:14 np0005532585.localdomain sudo[39497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:14 np0005532585.localdomain python3[39499]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:14 np0005532585.localdomain sudo[39497]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:18 np0005532585.localdomain sudo[39513]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwlxdawvyhdkcjoeluqapcpyafqjonsx ; /usr/bin/python3
Nov 23 07:49:18 np0005532585.localdomain sudo[39513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:18 np0005532585.localdomain python3[39515]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:18 np0005532585.localdomain sudo[39513]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:19 np0005532585.localdomain sudo[39561]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ownwluzchnrlsokkjdkpmwdigwbusvsf ; /usr/bin/python3
Nov 23 07:49:19 np0005532585.localdomain sudo[39561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:19 np0005532585.localdomain python3[39563]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:19 np0005532585.localdomain sudo[39561]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:19 np0005532585.localdomain sudo[39606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiqjpourdtffpajmafjgqecaxwadbnvy ; /usr/bin/python3
Nov 23 07:49:19 np0005532585.localdomain sudo[39606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:19 np0005532585.localdomain python3[39608]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884159.114268-71272-49181084642236/source _original_basename=tmp7opzimul follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:19 np0005532585.localdomain sudo[39606]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:20 np0005532585.localdomain sudo[39636]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egzkgxqikquiyipiqehxuqhejzgyqfzm ; /usr/bin/python3
Nov 23 07:49:20 np0005532585.localdomain sudo[39636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:20 np0005532585.localdomain python3[39638]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:20 np0005532585.localdomain sudo[39636]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:20 np0005532585.localdomain sudo[39684]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncjtmahrlsaukpyfrrezbonxfplrguom ; /usr/bin/python3
Nov 23 07:49:20 np0005532585.localdomain sudo[39684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:21 np0005532585.localdomain python3[39686]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:21 np0005532585.localdomain sudo[39684]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:21 np0005532585.localdomain sudo[39727]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izvaopofhwtnvwzxcmmcexcacwkeyswi ; /usr/bin/python3
Nov 23 07:49:21 np0005532585.localdomain sudo[39727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:21 np0005532585.localdomain python3[39729]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884160.7282596-71412-171025363337915/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=72c5ef7909b5cdbbb2310fa1b5c8d166a17f7155 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:21 np0005532585.localdomain sudo[39727]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:21 np0005532585.localdomain sudo[39789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fndxkvygdzcdcfwuufnxyxkchuqdhuzl ; /usr/bin/python3
Nov 23 07:49:21 np0005532585.localdomain sudo[39789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:21 np0005532585.localdomain python3[39791]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:21 np0005532585.localdomain sudo[39789]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:22 np0005532585.localdomain sudo[39832]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkqpefraalmdzxrtbefvwwgnebruwjfu ; /usr/bin/python3
Nov 23 07:49:22 np0005532585.localdomain sudo[39832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:22 np0005532585.localdomain python3[39834]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884161.588198-71471-110575407979223/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=6552073e0e4bb04b7faeda3f8c2098edf889171a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:22 np0005532585.localdomain sudo[39832]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:22 np0005532585.localdomain sudo[39894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcryjaxycjbfuirxjshwmcydkzsllnet ; /usr/bin/python3
Nov 23 07:49:22 np0005532585.localdomain sudo[39894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:22 np0005532585.localdomain python3[39896]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:22 np0005532585.localdomain sudo[39894]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:23 np0005532585.localdomain sudo[39937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztjlopwfhmomuwlvredfouhubbbqbomu ; /usr/bin/python3
Nov 23 07:49:23 np0005532585.localdomain sudo[39937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:23 np0005532585.localdomain python3[39939]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884162.4989095-71471-97080044522266/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=1bc51567bc68ec6d87ea2fcfee756b886ebb9f92 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:23 np0005532585.localdomain sudo[39937]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:23 np0005532585.localdomain sudo[39999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghohtxnwoilzvqsucddpymbyvbitwzxx ; /usr/bin/python3
Nov 23 07:49:23 np0005532585.localdomain sudo[39999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:23 np0005532585.localdomain python3[40001]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:23 np0005532585.localdomain sudo[39999]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:24 np0005532585.localdomain sudo[40042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzhovcmtqbokyedhnidbkkpiiimwnddi ; /usr/bin/python3
Nov 23 07:49:24 np0005532585.localdomain sudo[40042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:24 np0005532585.localdomain python3[40044]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884163.3904185-71471-175239504976019/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=175c760950d63a47f443f25b58088dba962f090b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:24 np0005532585.localdomain sudo[40042]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:24 np0005532585.localdomain sudo[40104]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmyojqyreqhhmmsmjgrlkgzpynpygofs ; /usr/bin/python3
Nov 23 07:49:24 np0005532585.localdomain sudo[40104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:24 np0005532585.localdomain python3[40106]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:24 np0005532585.localdomain sudo[40104]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:24 np0005532585.localdomain sudo[40147]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huwqrycwftduulvivdtajvleenmalfai ; /usr/bin/python3
Nov 23 07:49:24 np0005532585.localdomain sudo[40147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:25 np0005532585.localdomain python3[40149]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884164.3886087-71471-256168401867334/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:25 np0005532585.localdomain sudo[40147]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:25 np0005532585.localdomain sudo[40209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhczslijjnaisitulpgblqlzxlvnoouz ; /usr/bin/python3
Nov 23 07:49:25 np0005532585.localdomain sudo[40209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:25 np0005532585.localdomain python3[40211]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:25 np0005532585.localdomain sudo[40209]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:25 np0005532585.localdomain sudo[40252]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lebqmobjlfjmojggacbbhcbfzeoiohnj ; /usr/bin/python3
Nov 23 07:49:25 np0005532585.localdomain sudo[40252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:25 np0005532585.localdomain python3[40254]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884165.2767217-71471-183434796375795/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=56e83bd8f0316b152a1db4641581399e13c698c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:26 np0005532585.localdomain sudo[40252]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:26 np0005532585.localdomain sudo[40314]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bodamdsyrhxruhdxliwwlexchjtrtobz ; /usr/bin/python3
Nov 23 07:49:26 np0005532585.localdomain sudo[40314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:26 np0005532585.localdomain python3[40316]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:26 np0005532585.localdomain sudo[40314]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:26 np0005532585.localdomain sudo[40357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkemxsncpnxwxtgxlrxyhwehevgsyklv ; /usr/bin/python3
Nov 23 07:49:26 np0005532585.localdomain sudo[40357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:26 np0005532585.localdomain python3[40359]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884166.159478-71471-228978687949099/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:26 np0005532585.localdomain sudo[40357]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:27 np0005532585.localdomain sudo[40419]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pafzwzamzeuuvrwprnirvgemgqpfotfp ; /usr/bin/python3
Nov 23 07:49:27 np0005532585.localdomain sudo[40419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:27 np0005532585.localdomain python3[40421]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:27 np0005532585.localdomain sudo[40419]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:27 np0005532585.localdomain sudo[40462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpeoaukwahhxntkytblvzdppgoqbxnst ; /usr/bin/python3
Nov 23 07:49:27 np0005532585.localdomain sudo[40462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:27 np0005532585.localdomain python3[40464]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884167.0312788-71471-165341677504853/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=66f0a2c6a0832caadadc4d66bd975147c152464b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:27 np0005532585.localdomain sudo[40462]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:28 np0005532585.localdomain sudo[40524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gttakuredjudlhiznbkzuhouqbhzmgzq ; /usr/bin/python3
Nov 23 07:49:28 np0005532585.localdomain sudo[40524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:28 np0005532585.localdomain python3[40526]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:28 np0005532585.localdomain sudo[40524]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:28 np0005532585.localdomain sudo[40567]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjzsqzmuhqovcvcfftxsysmuoryvrawe ; /usr/bin/python3
Nov 23 07:49:28 np0005532585.localdomain sudo[40567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:28 np0005532585.localdomain python3[40569]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884167.902713-71471-161681992095249/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:28 np0005532585.localdomain sudo[40567]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:28 np0005532585.localdomain sudo[40629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgivybbfvrchkbhcavwacypvtgmvprst ; /usr/bin/python3
Nov 23 07:49:28 np0005532585.localdomain sudo[40629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:29 np0005532585.localdomain python3[40631]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:29 np0005532585.localdomain sudo[40629]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:29 np0005532585.localdomain sudo[40672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rektsysbdgiofcntazqznlmvykbaphff ; /usr/bin/python3
Nov 23 07:49:29 np0005532585.localdomain sudo[40672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:29 np0005532585.localdomain python3[40674]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884168.7701542-71471-54531224539471/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:29 np0005532585.localdomain sudo[40672]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:29 np0005532585.localdomain sudo[40734]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnjzqwxqznyklfbmoghmzumkdklysdnj ; /usr/bin/python3
Nov 23 07:49:29 np0005532585.localdomain sudo[40734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:29 np0005532585.localdomain python3[40736]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:29 np0005532585.localdomain sudo[40734]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:30 np0005532585.localdomain sudo[40777]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbpimgrxhatcbgpsyuluzwajpupaiwet ; /usr/bin/python3
Nov 23 07:49:30 np0005532585.localdomain sudo[40777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:30 np0005532585.localdomain python3[40779]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884169.5790863-71471-203678503565073/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=3376f53556298731e6a35da8f5186b37e7a2bd16 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:30 np0005532585.localdomain sudo[40777]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:31 np0005532585.localdomain sudo[40807]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ionqlyyvxmexsgbsmjvvwpiahqqglpvu ; /usr/bin/python3
Nov 23 07:49:31 np0005532585.localdomain sudo[40807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:31 np0005532585.localdomain python3[40809]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:49:31 np0005532585.localdomain sudo[40807]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:31 np0005532585.localdomain sudo[40855]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcbnueqvvsxtxzsdhydqnoofwahokkuq ; /usr/bin/python3
Nov 23 07:49:31 np0005532585.localdomain sudo[40855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:31 np0005532585.localdomain python3[40857]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:49:31 np0005532585.localdomain sudo[40855]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:31 np0005532585.localdomain sudo[40858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:49:31 np0005532585.localdomain sudo[40858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:49:31 np0005532585.localdomain sudo[40858]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:31 np0005532585.localdomain sudo[40886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:49:31 np0005532585.localdomain sudo[40886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:49:32 np0005532585.localdomain sudo[40928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdlylasfpteknjauuhzuppfqepueoqpj ; /usr/bin/python3
Nov 23 07:49:32 np0005532585.localdomain sudo[40928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:32 np0005532585.localdomain python3[40930]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884171.5423822-72256-205073522694703/source _original_basename=tmpyycxxdqj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:49:32 np0005532585.localdomain sudo[40928]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:32 np0005532585.localdomain sudo[40886]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:33 np0005532585.localdomain sudo[40978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:49:33 np0005532585.localdomain sudo[40978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:49:33 np0005532585.localdomain sudo[40978]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:36 np0005532585.localdomain sudo[41006]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upmqdpdkofqntsykodpmtadlfsksmzsd ; /usr/bin/python3
Nov 23 07:49:36 np0005532585.localdomain sudo[41006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:36 np0005532585.localdomain systemd[36148]: Starting Mark boot as successful...
Nov 23 07:49:36 np0005532585.localdomain systemd[36148]: Finished Mark boot as successful.
Nov 23 07:49:37 np0005532585.localdomain python3[41008]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 07:49:37 np0005532585.localdomain sudo[41006]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:37 np0005532585.localdomain sudo[41068]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmztqmhwesmxybbofheodxlurbdbslcc ; /usr/bin/python3
Nov 23 07:49:37 np0005532585.localdomain sudo[41068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:37 np0005532585.localdomain python3[41070]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:41 np0005532585.localdomain sudo[41068]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:41 np0005532585.localdomain sudo[41085]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgybxnzvprfeuhdjsqywxkjgitrtyqtw ; /usr/bin/python3
Nov 23 07:49:41 np0005532585.localdomain sudo[41085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:41 np0005532585.localdomain python3[41087]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:46 np0005532585.localdomain sudo[41085]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:47 np0005532585.localdomain sudo[41102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueepsctpemvyyobryzxalqcmyyjwbvaq ; /usr/bin/python3
Nov 23 07:49:47 np0005532585.localdomain sudo[41102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:47 np0005532585.localdomain python3[41104]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:47 np0005532585.localdomain sudo[41102]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:47 np0005532585.localdomain sudo[41125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vifvwamhdcrxeoeduevbmcipejjklimx ; /usr/bin/python3
Nov 23 07:49:47 np0005532585.localdomain sudo[41125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:47 np0005532585.localdomain python3[41127]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:51 np0005532585.localdomain sudo[41125]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:52 np0005532585.localdomain sudo[41142]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubjzyvijuvxrvoxdggdaudtnjnswfktp ; /usr/bin/python3
Nov 23 07:49:52 np0005532585.localdomain sudo[41142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:52 np0005532585.localdomain python3[41144]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:52 np0005532585.localdomain sudo[41142]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:52 np0005532585.localdomain sudo[41165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlfqmlvualghviqhrwecwnhzrmjyirnk ; /usr/bin/python3
Nov 23 07:49:52 np0005532585.localdomain sudo[41165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:52 np0005532585.localdomain python3[41167]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:49:58 np0005532585.localdomain sudo[41165]: pam_unix(sudo:session): session closed for user root
Nov 23 07:49:58 np0005532585.localdomain sudo[41182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mltbzfrapgevjxcchdhsrwvtwgqkwebv ; /usr/bin/python3
Nov 23 07:49:58 np0005532585.localdomain sudo[41182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:49:58 np0005532585.localdomain python3[41184]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:02 np0005532585.localdomain sshd[41186]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:50:02 np0005532585.localdomain sudo[41182]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:02 np0005532585.localdomain sudo[41201]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmaunntfwstbucpbowijygyorthukbwx ; /usr/bin/python3
Nov 23 07:50:02 np0005532585.localdomain sudo[41201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:02 np0005532585.localdomain python3[41203]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:02 np0005532585.localdomain sudo[41201]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:03 np0005532585.localdomain sudo[41224]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzzkojtcwvugdtuyfaruldkgfbwhikpw ; /usr/bin/python3
Nov 23 07:50:03 np0005532585.localdomain sudo[41224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:03 np0005532585.localdomain python3[41226]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:03 np0005532585.localdomain sshd[41186]: Invalid user ftpuser from 34.124.148.87 port 37128
Nov 23 07:50:03 np0005532585.localdomain sshd[41186]: Received disconnect from 34.124.148.87 port 37128:11: Bye Bye [preauth]
Nov 23 07:50:03 np0005532585.localdomain sshd[41186]: Disconnected from invalid user ftpuser 34.124.148.87 port 37128 [preauth]
Nov 23 07:50:07 np0005532585.localdomain sudo[41224]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:07 np0005532585.localdomain sudo[41241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keadkrhopnhecsebjdpooglhxvrfphge ; /usr/bin/python3
Nov 23 07:50:07 np0005532585.localdomain sudo[41241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:07 np0005532585.localdomain python3[41243]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:11 np0005532585.localdomain sudo[41241]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:12 np0005532585.localdomain sudo[41258]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwzejfwsehnxzkqqxgnbxfehlapbtvmf ; /usr/bin/python3
Nov 23 07:50:12 np0005532585.localdomain sudo[41258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:12 np0005532585.localdomain python3[41260]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:12 np0005532585.localdomain sudo[41258]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:12 np0005532585.localdomain sudo[41281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbiceaajwiuxndgxzfsjngwotslvqxny ; /usr/bin/python3
Nov 23 07:50:12 np0005532585.localdomain sudo[41281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:12 np0005532585.localdomain python3[41283]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:16 np0005532585.localdomain sudo[41281]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:16 np0005532585.localdomain sudo[41298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkexhcqsbnxytpdunjpnzdsujggxhorp ; /usr/bin/python3
Nov 23 07:50:16 np0005532585.localdomain sudo[41298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:17 np0005532585.localdomain python3[41300]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:21 np0005532585.localdomain sudo[41298]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:21 np0005532585.localdomain sudo[41315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idpwiapzdylnzvnzgrivpeicyzidnkml ; /usr/bin/python3
Nov 23 07:50:21 np0005532585.localdomain sudo[41315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:21 np0005532585.localdomain python3[41317]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:21 np0005532585.localdomain sudo[41315]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:21 np0005532585.localdomain sudo[41338]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnhiudyimjfdakglpjoigqhjvnsisvhz ; /usr/bin/python3
Nov 23 07:50:21 np0005532585.localdomain sudo[41338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:22 np0005532585.localdomain python3[41340]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:26 np0005532585.localdomain sudo[41338]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:26 np0005532585.localdomain sudo[41355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abvsyjmmssaqgngxmzjgkrsnlgizgyjj ; /usr/bin/python3
Nov 23 07:50:26 np0005532585.localdomain sudo[41355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:26 np0005532585.localdomain python3[41357]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:30 np0005532585.localdomain sudo[41355]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:31 np0005532585.localdomain sudo[41372]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knvlaffcsarjqmrdpskdgrrdcvywcsyo ; /usr/bin/python3
Nov 23 07:50:31 np0005532585.localdomain sudo[41372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:31 np0005532585.localdomain python3[41374]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:31 np0005532585.localdomain sudo[41372]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:32 np0005532585.localdomain sudo[41420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duhjxpstffecncylpazpfmfuxzdbqwyy ; /usr/bin/python3
Nov 23 07:50:32 np0005532585.localdomain sudo[41420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:32 np0005532585.localdomain python3[41422]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:32 np0005532585.localdomain sudo[41420]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:32 np0005532585.localdomain sudo[41438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdczyubtivpircqfdgwtrvvspqkxjxjn ; /usr/bin/python3
Nov 23 07:50:32 np0005532585.localdomain sudo[41438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:32 np0005532585.localdomain python3[41440]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp_zu7kv7y recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:32 np0005532585.localdomain sudo[41438]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:32 np0005532585.localdomain sudo[41468]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aplkbovxxdsbsfevvdiwhogqcuirhwua ; /usr/bin/python3
Nov 23 07:50:32 np0005532585.localdomain sudo[41468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:32 np0005532585.localdomain python3[41470]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:32 np0005532585.localdomain sudo[41468]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:33 np0005532585.localdomain sudo[41504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:50:33 np0005532585.localdomain sudo[41529]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqdkqldtmgzidoqlluezxzsnwvtmkxpy ; /usr/bin/python3
Nov 23 07:50:33 np0005532585.localdomain sudo[41529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:33 np0005532585.localdomain sudo[41504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:50:33 np0005532585.localdomain sudo[41504]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:33 np0005532585.localdomain sudo[41534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:50:33 np0005532585.localdomain sudo[41534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:50:33 np0005532585.localdomain python3[41532]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:33 np0005532585.localdomain sudo[41529]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:33 np0005532585.localdomain sudo[41564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qshqgawclzzmyhlbzgxehacngwldkpbw ; /usr/bin/python3
Nov 23 07:50:33 np0005532585.localdomain sudo[41564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:33 np0005532585.localdomain python3[41566]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:33 np0005532585.localdomain sudo[41564]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:33 np0005532585.localdomain sudo[41534]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:34 np0005532585.localdomain sudo[41658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmnfjwcwttnxnkhkohibkojisvadxotv ; /usr/bin/python3
Nov 23 07:50:34 np0005532585.localdomain sudo[41658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:34 np0005532585.localdomain python3[41660]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:34 np0005532585.localdomain sudo[41658]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:34 np0005532585.localdomain sudo[41676]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zobsiueekvaijffowlapefegbopfkcyn ; /usr/bin/python3
Nov 23 07:50:34 np0005532585.localdomain sudo[41676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:34 np0005532585.localdomain python3[41678]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:34 np0005532585.localdomain sudo[41676]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:34 np0005532585.localdomain sudo[41738]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfrdswstehqmomnqgniwsdjzdtqbmimd ; /usr/bin/python3
Nov 23 07:50:34 np0005532585.localdomain sudo[41738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:35 np0005532585.localdomain python3[41740]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:35 np0005532585.localdomain sudo[41738]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:35 np0005532585.localdomain sudo[41756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biqfnmsjlkatsvhxfltmxafzuqpjgqky ; /usr/bin/python3
Nov 23 07:50:35 np0005532585.localdomain sudo[41756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:35 np0005532585.localdomain python3[41758]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:35 np0005532585.localdomain sudo[41756]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:35 np0005532585.localdomain sudo[41818]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdzbrvhveufvoklnyjpvjfjrvzdwzqtl ; /usr/bin/python3
Nov 23 07:50:35 np0005532585.localdomain sudo[41818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:35 np0005532585.localdomain python3[41820]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:35 np0005532585.localdomain sudo[41818]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:35 np0005532585.localdomain sudo[41836]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbetkrqyxrfkgvccqkvawdfzflolbrco ; /usr/bin/python3
Nov 23 07:50:35 np0005532585.localdomain sudo[41836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:36 np0005532585.localdomain python3[41838]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:36 np0005532585.localdomain sudo[41836]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:36 np0005532585.localdomain sudo[41853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:50:36 np0005532585.localdomain sudo[41853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:50:36 np0005532585.localdomain sudo[41853]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:36 np0005532585.localdomain sudo[41913]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdpnktaxsjpkudgsohezuvkkfprnjezq ; /usr/bin/python3
Nov 23 07:50:36 np0005532585.localdomain sudo[41913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:36 np0005532585.localdomain python3[41915]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:36 np0005532585.localdomain sudo[41913]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:36 np0005532585.localdomain sudo[41931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npqbgvyxivonetjixarejqofevqwzqpz ; /usr/bin/python3
Nov 23 07:50:36 np0005532585.localdomain sudo[41931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:36 np0005532585.localdomain python3[41933]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:36 np0005532585.localdomain sudo[41931]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:37 np0005532585.localdomain sudo[41993]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msepbswdvjtvsnuvpxoggzjqxjpizhoz ; /usr/bin/python3
Nov 23 07:50:37 np0005532585.localdomain sudo[41993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:37 np0005532585.localdomain python3[41995]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:37 np0005532585.localdomain sudo[41993]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:37 np0005532585.localdomain sudo[42011]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygdmoqbfkctapcqwyfflkjhvyddgboln ; /usr/bin/python3
Nov 23 07:50:37 np0005532585.localdomain sudo[42011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:37 np0005532585.localdomain python3[42013]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:37 np0005532585.localdomain sudo[42011]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:37 np0005532585.localdomain sudo[42073]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svkfzobblywnpeovgyeqgtouljfnwuqy ; /usr/bin/python3
Nov 23 07:50:37 np0005532585.localdomain sudo[42073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:38 np0005532585.localdomain python3[42075]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:38 np0005532585.localdomain sudo[42073]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:38 np0005532585.localdomain sudo[42091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbscoliffdyawipeummpxjenwrhnxtzf ; /usr/bin/python3
Nov 23 07:50:38 np0005532585.localdomain sudo[42091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:38 np0005532585.localdomain python3[42093]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:38 np0005532585.localdomain sudo[42091]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:38 np0005532585.localdomain sudo[42153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hugimdaekrhlreiduxtoqbmqebczcomx ; /usr/bin/python3
Nov 23 07:50:38 np0005532585.localdomain sudo[42153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:38 np0005532585.localdomain python3[42155]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:38 np0005532585.localdomain sudo[42153]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:38 np0005532585.localdomain sudo[42171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzthnotbxcivuqniyualszyvwsrfvnqh ; /usr/bin/python3
Nov 23 07:50:38 np0005532585.localdomain sudo[42171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:39 np0005532585.localdomain python3[42173]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:39 np0005532585.localdomain sudo[42171]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:39 np0005532585.localdomain sudo[42233]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kraqrkehirznbtkqpfluieuycnugrzja ; /usr/bin/python3
Nov 23 07:50:39 np0005532585.localdomain sudo[42233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:39 np0005532585.localdomain python3[42235]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:39 np0005532585.localdomain sudo[42233]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:39 np0005532585.localdomain sudo[42251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfibcnbccapnnykkmrqeiovafshzaoev ; /usr/bin/python3
Nov 23 07:50:39 np0005532585.localdomain sudo[42251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:39 np0005532585.localdomain python3[42253]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:39 np0005532585.localdomain sudo[42251]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:40 np0005532585.localdomain sudo[42313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qndjubbfmildiaisqvpzedynigxmseuv ; /usr/bin/python3
Nov 23 07:50:40 np0005532585.localdomain sudo[42313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:40 np0005532585.localdomain python3[42315]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:40 np0005532585.localdomain sudo[42313]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:40 np0005532585.localdomain sudo[42331]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnnhunlyqkqipnwvyykfaymttmruyuxh ; /usr/bin/python3
Nov 23 07:50:40 np0005532585.localdomain sudo[42331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:40 np0005532585.localdomain python3[42333]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:40 np0005532585.localdomain sudo[42331]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:40 np0005532585.localdomain sudo[42393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpaqdghjlukcbanpxaebjkvphmuubrag ; /usr/bin/python3
Nov 23 07:50:40 np0005532585.localdomain sudo[42393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:40 np0005532585.localdomain python3[42395]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:40 np0005532585.localdomain sudo[42393]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:41 np0005532585.localdomain sudo[42411]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czzynieclyxxorchblldqoohwxzsucbe ; /usr/bin/python3
Nov 23 07:50:41 np0005532585.localdomain sudo[42411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:41 np0005532585.localdomain python3[42413]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:41 np0005532585.localdomain sudo[42411]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:41 np0005532585.localdomain sudo[42441]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrvzswfzkibxvgaxqmujrcrcjhwaulgc ; /usr/bin/python3
Nov 23 07:50:41 np0005532585.localdomain sudo[42441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:41 np0005532585.localdomain python3[42443]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:50:41 np0005532585.localdomain sudo[42441]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:42 np0005532585.localdomain sudo[42489]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqfahrxiafulnsuvwdabdbqrseadsqyv ; /usr/bin/python3
Nov 23 07:50:42 np0005532585.localdomain sudo[42489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:42 np0005532585.localdomain python3[42491]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:42 np0005532585.localdomain sudo[42489]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:42 np0005532585.localdomain sudo[42507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edruvnfyjwcoosyiwurottfvwpcytwed ; /usr/bin/python3
Nov 23 07:50:42 np0005532585.localdomain sudo[42507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:42 np0005532585.localdomain python3[42509]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmp2iguvxzb recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:42 np0005532585.localdomain sudo[42507]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:45 np0005532585.localdomain sudo[42537]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erazfesspvbbkyuuouzxpcpxfzeweyqv ; /usr/bin/python3
Nov 23 07:50:45 np0005532585.localdomain sudo[42537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:45 np0005532585.localdomain python3[42539]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:50:48 np0005532585.localdomain sudo[42537]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:50 np0005532585.localdomain sudo[42554]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgwfhscrzvhvfzcqhcabroljeqrfejdy ; /usr/bin/python3
Nov 23 07:50:50 np0005532585.localdomain sudo[42554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:50 np0005532585.localdomain python3[42556]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:50:51 np0005532585.localdomain sudo[42554]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:52 np0005532585.localdomain sudo[42572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slzmlpesbdwgtszkblsbobdtqtcwfulm ; /usr/bin/python3
Nov 23 07:50:52 np0005532585.localdomain sudo[42572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:52 np0005532585.localdomain python3[42574]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:50:52 np0005532585.localdomain sudo[42572]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:52 np0005532585.localdomain sudo[42590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqkncvhvzzhledsfztontntqunudgwov ; /usr/bin/python3
Nov 23 07:50:52 np0005532585.localdomain sudo[42590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:52 np0005532585.localdomain python3[42592]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:50:53 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:50:53 np0005532585.localdomain systemd-rc-local-generator[42620]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:50:53 np0005532585.localdomain systemd-sysv-generator[42625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:50:53 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:50:54 np0005532585.localdomain systemd[1]: Starting Netfilter Tables...
Nov 23 07:50:54 np0005532585.localdomain systemd[1]: Finished Netfilter Tables.
Nov 23 07:50:54 np0005532585.localdomain sudo[42590]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:54 np0005532585.localdomain sudo[42680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeqquyluommslzwrpgwfpmuskrpfqhrd ; /usr/bin/python3
Nov 23 07:50:54 np0005532585.localdomain sudo[42680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:54 np0005532585.localdomain python3[42682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:54 np0005532585.localdomain sudo[42680]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:55 np0005532585.localdomain sudo[42723]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsnlazmurnlnlhhqtckltkvmwagcbvyx ; /usr/bin/python3
Nov 23 07:50:55 np0005532585.localdomain sudo[42723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:55 np0005532585.localdomain python3[42725]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884254.5923202-75169-249923330290417/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:55 np0005532585.localdomain sudo[42723]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:55 np0005532585.localdomain sudo[42753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biemiqxznlshdtojixafzavjaxuzhwdh ; /usr/bin/python3
Nov 23 07:50:55 np0005532585.localdomain sudo[42753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:55 np0005532585.localdomain python3[42755]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:55 np0005532585.localdomain sudo[42753]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:56 np0005532585.localdomain sudo[42771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kerekoebgwzsbvvwewsiumswxaeoulit ; /usr/bin/python3
Nov 23 07:50:56 np0005532585.localdomain sudo[42771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:56 np0005532585.localdomain python3[42773]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:50:56 np0005532585.localdomain sudo[42771]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:56 np0005532585.localdomain sudo[42820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkczdtlelqvswotgjizvbnfwuvianeic ; /usr/bin/python3
Nov 23 07:50:56 np0005532585.localdomain sudo[42820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:56 np0005532585.localdomain python3[42822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:56 np0005532585.localdomain sudo[42820]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:56 np0005532585.localdomain sudo[42863]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-junalglqxcfouhfifwccwjchfnmmxmnq ; /usr/bin/python3
Nov 23 07:50:56 np0005532585.localdomain sudo[42863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:57 np0005532585.localdomain python3[42865]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884256.2745795-75274-236542045680377/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:57 np0005532585.localdomain sudo[42863]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:57 np0005532585.localdomain sudo[42925]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlchpzawrotneludrcwbftgifgfyotxa ; /usr/bin/python3
Nov 23 07:50:57 np0005532585.localdomain sudo[42925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:57 np0005532585.localdomain python3[42927]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:57 np0005532585.localdomain sudo[42925]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:57 np0005532585.localdomain sudo[42968]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfqopudafdzikrjlzhucgibupsixlqkg ; /usr/bin/python3
Nov 23 07:50:57 np0005532585.localdomain sudo[42968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:58 np0005532585.localdomain python3[42970]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884257.2028265-75398-127407320322808/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:58 np0005532585.localdomain sudo[42968]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:58 np0005532585.localdomain sudo[43030]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuuzccabezmwvekpvumxjzrvtqdjaiza ; /usr/bin/python3
Nov 23 07:50:58 np0005532585.localdomain sudo[43030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:58 np0005532585.localdomain python3[43032]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:58 np0005532585.localdomain sudo[43030]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:58 np0005532585.localdomain sudo[43073]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arttpctutsiqzdizadjeagndscdgynfs ; /usr/bin/python3
Nov 23 07:50:58 np0005532585.localdomain sudo[43073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:58 np0005532585.localdomain python3[43075]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884258.217897-75452-28217145799119/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:58 np0005532585.localdomain sudo[43073]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:59 np0005532585.localdomain sudo[43135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibjclehegknqxvvxwfwtszrxwzbmgjna ; /usr/bin/python3
Nov 23 07:50:59 np0005532585.localdomain sudo[43135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:59 np0005532585.localdomain python3[43137]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:50:59 np0005532585.localdomain sudo[43135]: pam_unix(sudo:session): session closed for user root
Nov 23 07:50:59 np0005532585.localdomain sudo[43178]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnpehinlwmajpmhlmlytqtrxquhppzdd ; /usr/bin/python3
Nov 23 07:50:59 np0005532585.localdomain sudo[43178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:50:59 np0005532585.localdomain python3[43180]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884259.1261356-75514-78615101295627/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:50:59 np0005532585.localdomain sudo[43178]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:00 np0005532585.localdomain sudo[43240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otpxpvjuvabynrrialjkqltzxnwkdpgm ; /usr/bin/python3
Nov 23 07:51:00 np0005532585.localdomain sudo[43240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:00 np0005532585.localdomain python3[43242]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:00 np0005532585.localdomain sudo[43240]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:00 np0005532585.localdomain sudo[43283]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znldxepgrlvnheizltidpprmwkflnwpg ; /usr/bin/python3
Nov 23 07:51:00 np0005532585.localdomain sudo[43283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:01 np0005532585.localdomain python3[43285]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884260.0139656-75551-95252839979549/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:01 np0005532585.localdomain sudo[43283]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:01 np0005532585.localdomain sudo[43313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkckrdorrsyiuxsaabwohszvlcmddtcl ; /usr/bin/python3
Nov 23 07:51:01 np0005532585.localdomain sudo[43313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:01 np0005532585.localdomain python3[43315]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:01 np0005532585.localdomain sudo[43313]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:01 np0005532585.localdomain sudo[43378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdnyogofypvkldvqoctkveldwiyfusur ; /usr/bin/python3
Nov 23 07:51:02 np0005532585.localdomain sudo[43378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:02 np0005532585.localdomain python3[43380]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:02 np0005532585.localdomain sudo[43378]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:02 np0005532585.localdomain sudo[43395]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsczcddvvtpedezhpkavtoullnnjbuxt ; /usr/bin/python3
Nov 23 07:51:02 np0005532585.localdomain sudo[43395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:02 np0005532585.localdomain python3[43397]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:02 np0005532585.localdomain sudo[43395]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:02 np0005532585.localdomain sudo[43412]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhgwuauzaqeddzhmrmogdkcmzgpzwjep ; /usr/bin/python3
Nov 23 07:51:02 np0005532585.localdomain sudo[43412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:03 np0005532585.localdomain python3[43414]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:03 np0005532585.localdomain sudo[43412]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:03 np0005532585.localdomain sudo[43431]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnntkbgehmwlqffjwrnjynltrxwcdmzx ; /usr/bin/python3
Nov 23 07:51:03 np0005532585.localdomain sudo[43431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:03 np0005532585.localdomain python3[43433]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:03 np0005532585.localdomain sudo[43431]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:03 np0005532585.localdomain sudo[43447]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqilrswwtsnavvuwaikoxseahridgamr ; /usr/bin/python3
Nov 23 07:51:03 np0005532585.localdomain sudo[43447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:03 np0005532585.localdomain python3[43449]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:03 np0005532585.localdomain sudo[43447]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:04 np0005532585.localdomain sudo[43463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwehtsagnylaxwmzyjaajlrqsutsrfwb ; /usr/bin/python3
Nov 23 07:51:04 np0005532585.localdomain sudo[43463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:04 np0005532585.localdomain python3[43465]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:04 np0005532585.localdomain sudo[43463]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:04 np0005532585.localdomain sudo[43479]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgxnjnsexcmprjwdfbutvuydofuhdzjl ; /usr/bin/python3
Nov 23 07:51:04 np0005532585.localdomain sudo[43479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:04 np0005532585.localdomain python3[43481]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 07:51:05 np0005532585.localdomain sudo[43479]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:05 np0005532585.localdomain sudo[43503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eshhctaqnxkhehexafqaepxadwlenalv ; /usr/bin/python3
Nov 23 07:51:05 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Nov 23 07:51:05 np0005532585.localdomain sudo[43503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:05 np0005532585.localdomain python3[43505]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:51:06 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:51:06 np0005532585.localdomain sudo[43503]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:06 np0005532585.localdomain sudo[43524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwwjjrvtgaxwftjceemoligbpvzslhjv ; /usr/bin/python3
Nov 23 07:51:06 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 23 07:51:06 np0005532585.localdomain sudo[43524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:07 np0005532585.localdomain python3[43526]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:51:07 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:51:08 np0005532585.localdomain sudo[43524]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:08 np0005532585.localdomain sudo[43545]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txtmbqzicblzjzbtnmwfnlfinqipxjzq ; /usr/bin/python3
Nov 23 07:51:08 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 23 07:51:08 np0005532585.localdomain sudo[43545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:08 np0005532585.localdomain python3[43547]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:51:09 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:51:09 np0005532585.localdomain sudo[43545]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:09 np0005532585.localdomain sudo[43566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvkcezilphjkunmgevlusdhklkakmzll ; /usr/bin/python3
Nov 23 07:51:09 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 23 07:51:09 np0005532585.localdomain sudo[43566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:09 np0005532585.localdomain python3[43568]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:09 np0005532585.localdomain sudo[43566]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:09 np0005532585.localdomain sudo[43582]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyntfluspcpzszxjssbcceojlcivezzs ; /usr/bin/python3
Nov 23 07:51:09 np0005532585.localdomain sudo[43582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:09 np0005532585.localdomain python3[43584]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:09 np0005532585.localdomain sudo[43582]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:10 np0005532585.localdomain sudo[43598]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kieddeetpjykebnejddjulfnddymcldk ; /usr/bin/python3
Nov 23 07:51:10 np0005532585.localdomain sudo[43598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:10 np0005532585.localdomain python3[43600]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:10 np0005532585.localdomain sudo[43598]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:10 np0005532585.localdomain sudo[43614]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwnnpvummqwrurvhbhiwcwbvhefgljdc ; /usr/bin/python3
Nov 23 07:51:10 np0005532585.localdomain sudo[43614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:10 np0005532585.localdomain python3[43616]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:51:10 np0005532585.localdomain sudo[43614]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:11 np0005532585.localdomain sudo[43630]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apgcuziwnpgejaagayjjdtvocqfeidrm ; /usr/bin/python3
Nov 23 07:51:11 np0005532585.localdomain sudo[43630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:11 np0005532585.localdomain python3[43632]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:11 np0005532585.localdomain sudo[43630]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:11 np0005532585.localdomain sudo[43647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdnwndknawyhqwcatgpdqjlgjwfmobau ; /usr/bin/python3
Nov 23 07:51:11 np0005532585.localdomain sudo[43647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:12 np0005532585.localdomain python3[43649]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:51:15 np0005532585.localdomain sudo[43647]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:15 np0005532585.localdomain sudo[43664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idaskyisncirafnvicccbjzrjauihvnn ; /usr/bin/python3
Nov 23 07:51:15 np0005532585.localdomain sudo[43664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:15 np0005532585.localdomain python3[43666]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:15 np0005532585.localdomain sudo[43664]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:15 np0005532585.localdomain sudo[43712]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jazhnipuxhrfsipstadnwmhurkkjdxyj ; /usr/bin/python3
Nov 23 07:51:15 np0005532585.localdomain sudo[43712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:16 np0005532585.localdomain python3[43714]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:16 np0005532585.localdomain sudo[43712]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:16 np0005532585.localdomain sudo[43755]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djxfpyxzsgrihpkyxlgvosqrosbhbmww ; /usr/bin/python3
Nov 23 07:51:16 np0005532585.localdomain sudo[43755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:17 np0005532585.localdomain python3[43757]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884275.7861643-76415-129523325808288/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:17 np0005532585.localdomain sudo[43755]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:17 np0005532585.localdomain sudo[43785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bohnceryrrffppvtylmswochtaqwkkfb ; /usr/bin/python3
Nov 23 07:51:17 np0005532585.localdomain sudo[43785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:17 np0005532585.localdomain python3[43787]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:51:17 np0005532585.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 07:51:17 np0005532585.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 23 07:51:17 np0005532585.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 23 07:51:17 np0005532585.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 23 07:51:17 np0005532585.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 23 07:51:17 np0005532585.localdomain kernel: Bridge firewalling registered
Nov 23 07:51:17 np0005532585.localdomain systemd-modules-load[43790]: Inserted module 'br_netfilter'
Nov 23 07:51:17 np0005532585.localdomain systemd-modules-load[43790]: Module 'msr' is built in
Nov 23 07:51:17 np0005532585.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 23 07:51:17 np0005532585.localdomain sudo[43785]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:17 np0005532585.localdomain sudo[43839]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alkflycbbdetejrfiozgzdgbmguplota ; /usr/bin/python3
Nov 23 07:51:17 np0005532585.localdomain sudo[43839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:18 np0005532585.localdomain python3[43841]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:18 np0005532585.localdomain sudo[43839]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:18 np0005532585.localdomain sudo[43882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzgemzplagzhdcsldkkxyvvtbwfnlcan ; /usr/bin/python3
Nov 23 07:51:18 np0005532585.localdomain sudo[43882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:18 np0005532585.localdomain python3[43884]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884277.8322845-76454-243313197577992/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:18 np0005532585.localdomain sudo[43882]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:18 np0005532585.localdomain sudo[43912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxxvfvamxtdillxpwprpoilbovnczoet ; /usr/bin/python3
Nov 23 07:51:18 np0005532585.localdomain sudo[43912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:18 np0005532585.localdomain python3[43914]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:18 np0005532585.localdomain sudo[43912]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:19 np0005532585.localdomain sudo[43929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiwinvkbxxadobknmbxodpfwxwxlhcis ; /usr/bin/python3
Nov 23 07:51:19 np0005532585.localdomain sudo[43929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:19 np0005532585.localdomain python3[43931]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:19 np0005532585.localdomain sudo[43929]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:19 np0005532585.localdomain sudo[43947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxjwlfptqmotogrqrdwfprvahaowmmwn ; /usr/bin/python3
Nov 23 07:51:19 np0005532585.localdomain sudo[43947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:19 np0005532585.localdomain python3[43949]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:19 np0005532585.localdomain sudo[43947]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:19 np0005532585.localdomain sudo[43965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-othvlszeasgdqkgwdytjhifwessmnifk ; /usr/bin/python3
Nov 23 07:51:19 np0005532585.localdomain sudo[43965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:19 np0005532585.localdomain python3[43967]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:19 np0005532585.localdomain sudo[43965]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:19 np0005532585.localdomain sudo[43982]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmpyggdpfvsphfckgywxuluswnujgtdf ; /usr/bin/python3
Nov 23 07:51:19 np0005532585.localdomain sudo[43982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:20 np0005532585.localdomain python3[43984]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:21 np0005532585.localdomain sudo[43982]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:21 np0005532585.localdomain sudo[43999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qutiukhtxvxllvudjegxmzuvnnpyjggp ; /usr/bin/python3
Nov 23 07:51:21 np0005532585.localdomain sudo[43999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:21 np0005532585.localdomain python3[44001]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:21 np0005532585.localdomain sudo[43999]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:21 np0005532585.localdomain sudo[44016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvyudzkpkxwqoxqqqvxamhnlkvuhsntn ; /usr/bin/python3
Nov 23 07:51:21 np0005532585.localdomain sudo[44016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:21 np0005532585.localdomain python3[44018]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:21 np0005532585.localdomain sudo[44016]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:21 np0005532585.localdomain sudo[44034]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbmzhindbawrvpsljzdtouqwaasdfdfc ; /usr/bin/python3
Nov 23 07:51:21 np0005532585.localdomain sudo[44034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:22 np0005532585.localdomain python3[44036]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:22 np0005532585.localdomain sudo[44034]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:22 np0005532585.localdomain sudo[44052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwjcwlrhlcgpfefcbjqcprkxzmxhgqbu ; /usr/bin/python3
Nov 23 07:51:22 np0005532585.localdomain sudo[44052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:22 np0005532585.localdomain python3[44054]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:22 np0005532585.localdomain sudo[44052]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:22 np0005532585.localdomain sudo[44070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxkeajisfvmxpjetnsimrywlkrhmdmjx ; /usr/bin/python3
Nov 23 07:51:22 np0005532585.localdomain sudo[44070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:22 np0005532585.localdomain python3[44072]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:22 np0005532585.localdomain sudo[44070]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:22 np0005532585.localdomain sudo[44088]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrejwhugzdfjdyrrrokhbgxiziknkdnu ; /usr/bin/python3
Nov 23 07:51:22 np0005532585.localdomain sudo[44088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:22 np0005532585.localdomain python3[44090]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:23 np0005532585.localdomain sudo[44088]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:23 np0005532585.localdomain sudo[44106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmzdmbkcremaypbhdccyvfesduiwfgqp ; /usr/bin/python3
Nov 23 07:51:23 np0005532585.localdomain sudo[44106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:23 np0005532585.localdomain python3[44108]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:23 np0005532585.localdomain sudo[44106]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:23 np0005532585.localdomain sudo[44124]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjpzhhpfpbiavelfqjvpsnvzvolidxiv ; /usr/bin/python3
Nov 23 07:51:23 np0005532585.localdomain sudo[44124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:23 np0005532585.localdomain python3[44126]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:23 np0005532585.localdomain sudo[44124]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:23 np0005532585.localdomain sudo[44142]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-carzacrdrkvhppryofpahbnzchzeknfi ; /usr/bin/python3
Nov 23 07:51:23 np0005532585.localdomain sudo[44142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:23 np0005532585.localdomain python3[44144]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:23 np0005532585.localdomain sudo[44142]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:24 np0005532585.localdomain sudo[44159]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-foikaijoknufmegggnhxsixnzarxjjxy ; /usr/bin/python3
Nov 23 07:51:24 np0005532585.localdomain sudo[44159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:24 np0005532585.localdomain python3[44161]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:24 np0005532585.localdomain sudo[44159]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:24 np0005532585.localdomain sudo[44176]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocvcneegcnpdsyygwewdmhveyprzbqby ; /usr/bin/python3
Nov 23 07:51:24 np0005532585.localdomain sudo[44176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:24 np0005532585.localdomain python3[44178]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:24 np0005532585.localdomain sudo[44176]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:24 np0005532585.localdomain sudo[44193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkwesvugexfukofdmrtbefihoumbezwe ; /usr/bin/python3
Nov 23 07:51:24 np0005532585.localdomain sudo[44193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:24 np0005532585.localdomain python3[44195]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:24 np0005532585.localdomain sudo[44193]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:24 np0005532585.localdomain sudo[44210]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwmgpbinbtmyvjpvonrsmndzubrmgtgb ; /usr/bin/python3
Nov 23 07:51:24 np0005532585.localdomain sudo[44210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:25 np0005532585.localdomain python3[44212]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 23 07:51:25 np0005532585.localdomain sudo[44210]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:25 np0005532585.localdomain sudo[44228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfnmqufswanaiidktlxhyfjwomutohnl ; /usr/bin/python3
Nov 23 07:51:25 np0005532585.localdomain sudo[44228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:25 np0005532585.localdomain python3[44230]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 07:51:25 np0005532585.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 23 07:51:25 np0005532585.localdomain systemd[1]: Stopped Apply Kernel Variables.
Nov 23 07:51:25 np0005532585.localdomain systemd[1]: Stopping Apply Kernel Variables...
Nov 23 07:51:25 np0005532585.localdomain systemd[1]: Starting Apply Kernel Variables...
Nov 23 07:51:25 np0005532585.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 23 07:51:25 np0005532585.localdomain systemd[1]: Finished Apply Kernel Variables.
Nov 23 07:51:25 np0005532585.localdomain sudo[44228]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:25 np0005532585.localdomain sudo[44248]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqbrphwuulfkztenxepapjbtfkhhvkpw ; /usr/bin/python3
Nov 23 07:51:25 np0005532585.localdomain sudo[44248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:26 np0005532585.localdomain python3[44250]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:26 np0005532585.localdomain sudo[44248]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:26 np0005532585.localdomain sudo[44264]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acqqzmxfevqrhnvlyiqaawotrjzznhml ; /usr/bin/python3
Nov 23 07:51:26 np0005532585.localdomain sudo[44264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:26 np0005532585.localdomain python3[44266]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:26 np0005532585.localdomain sudo[44264]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:26 np0005532585.localdomain sudo[44280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vztuxhettrszlijczyfytyetxssjxzgv ; /usr/bin/python3
Nov 23 07:51:26 np0005532585.localdomain sudo[44280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:26 np0005532585.localdomain python3[44282]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:26 np0005532585.localdomain sudo[44280]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:26 np0005532585.localdomain sudo[44296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyzgwydxeuaaxfxtllmnljknziwyuyps ; /usr/bin/python3
Nov 23 07:51:26 np0005532585.localdomain sudo[44296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:27 np0005532585.localdomain python3[44298]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:51:27 np0005532585.localdomain sudo[44296]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:27 np0005532585.localdomain sudo[44312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxmlwrakbuuytlhjtdkxbrttgaglsapk ; /usr/bin/python3
Nov 23 07:51:27 np0005532585.localdomain sudo[44312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:27 np0005532585.localdomain python3[44314]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:27 np0005532585.localdomain sudo[44312]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:27 np0005532585.localdomain sudo[44328]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsvpmgykwibfuhtmqurznqbzflzkpfbt ; /usr/bin/python3
Nov 23 07:51:27 np0005532585.localdomain sudo[44328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:27 np0005532585.localdomain python3[44330]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:27 np0005532585.localdomain sudo[44328]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:27 np0005532585.localdomain sudo[44344]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfxzzzkgvuqyfujexzhhjnnlyynouagk ; /usr/bin/python3
Nov 23 07:51:27 np0005532585.localdomain sudo[44344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:28 np0005532585.localdomain python3[44346]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:28 np0005532585.localdomain sudo[44344]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:28 np0005532585.localdomain sudo[44360]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xezdjyyraewkomvicauxoteajstyksst ; /usr/bin/python3
Nov 23 07:51:28 np0005532585.localdomain sudo[44360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:28 np0005532585.localdomain python3[44362]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:28 np0005532585.localdomain sudo[44360]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:28 np0005532585.localdomain sudo[44376]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbtzdmsowgkbusexpnaisjeftkaivdtc ; /usr/bin/python3
Nov 23 07:51:28 np0005532585.localdomain sudo[44376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:28 np0005532585.localdomain python3[44378]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:28 np0005532585.localdomain sudo[44376]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:28 np0005532585.localdomain sudo[44424]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwvcnqavqgwwaxxqlkorloyyjmsrjaqu ; /usr/bin/python3
Nov 23 07:51:28 np0005532585.localdomain sudo[44424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:29 np0005532585.localdomain python3[44426]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:29 np0005532585.localdomain sudo[44424]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:29 np0005532585.localdomain sudo[44467]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpautnassyhxqvswfpgxxpjijrmjlrfk ; /usr/bin/python3
Nov 23 07:51:29 np0005532585.localdomain sudo[44467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:29 np0005532585.localdomain python3[44469]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884288.820977-76946-155105315225353/source _original_basename=tmpvxoavkrd follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:29 np0005532585.localdomain sudo[44467]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:29 np0005532585.localdomain sudo[44497]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssiqdndglkmyrldrnbpqgalvjmdykcnz ; /usr/bin/python3
Nov 23 07:51:29 np0005532585.localdomain sudo[44497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:29 np0005532585.localdomain python3[44499]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:29 np0005532585.localdomain sudo[44497]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:30 np0005532585.localdomain sudo[44514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceeogevsxrmgekfygrcrvllsfifgnewx ; /usr/bin/python3
Nov 23 07:51:30 np0005532585.localdomain sudo[44514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:31 np0005532585.localdomain python3[44516]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:31 np0005532585.localdomain sudo[44514]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:31 np0005532585.localdomain sudo[44562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsjnlopqrvuwihguiiqetelakrutqxsx ; /usr/bin/python3
Nov 23 07:51:31 np0005532585.localdomain sudo[44562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:31 np0005532585.localdomain python3[44564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:31 np0005532585.localdomain sudo[44562]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:31 np0005532585.localdomain sudo[44605]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfaqnlozuryuxqacvkaueiaacwpepeii ; /usr/bin/python3
Nov 23 07:51:31 np0005532585.localdomain sudo[44605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:31 np0005532585.localdomain python3[44607]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884291.2656968-77142-104627080339130/source _original_basename=tmpqckmq4c3 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:31 np0005532585.localdomain sudo[44605]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:32 np0005532585.localdomain sudo[44635]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gltueectcufyvayfpcamdpnrnrjgudgu ; /usr/bin/python3
Nov 23 07:51:32 np0005532585.localdomain sudo[44635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:32 np0005532585.localdomain python3[44637]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:32 np0005532585.localdomain sudo[44635]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:32 np0005532585.localdomain sudo[44651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpsqterbbdgjpnstnfzqniijzgnonfia ; /usr/bin/python3
Nov 23 07:51:32 np0005532585.localdomain sudo[44651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:32 np0005532585.localdomain python3[44653]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:32 np0005532585.localdomain sudo[44651]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:32 np0005532585.localdomain sudo[44667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlbdzmmsiczkmtihjaelftqyqoylsfcf ; /usr/bin/python3
Nov 23 07:51:32 np0005532585.localdomain sudo[44667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:33 np0005532585.localdomain python3[44669]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:33 np0005532585.localdomain sudo[44667]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:33 np0005532585.localdomain sudo[44683]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwritcrwypvuoifmnnvnjeffxxnqijyv ; /usr/bin/python3
Nov 23 07:51:33 np0005532585.localdomain sudo[44683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:33 np0005532585.localdomain python3[44685]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:33 np0005532585.localdomain sudo[44683]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:33 np0005532585.localdomain sudo[44699]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofjrmbxhjianmdfejpigvkrhlvthtjsu ; /usr/bin/python3
Nov 23 07:51:33 np0005532585.localdomain sudo[44699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:33 np0005532585.localdomain python3[44701]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:33 np0005532585.localdomain sudo[44699]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:33 np0005532585.localdomain sudo[44715]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiojfesnminrjifkttljxdkfephlhdfn ; /usr/bin/python3
Nov 23 07:51:33 np0005532585.localdomain sudo[44715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:33 np0005532585.localdomain python3[44717]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:33 np0005532585.localdomain sudo[44715]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:34 np0005532585.localdomain sudo[44731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmteqasjjzdmyklpqvdbdheonowfkzrm ; /usr/bin/python3
Nov 23 07:51:34 np0005532585.localdomain sudo[44731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:34 np0005532585.localdomain python3[44733]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:34 np0005532585.localdomain sudo[44731]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:34 np0005532585.localdomain sudo[44747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trdzahuvwauyooggdlksvrmwszjtxcvj ; /usr/bin/python3
Nov 23 07:51:34 np0005532585.localdomain sudo[44747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:34 np0005532585.localdomain python3[44749]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:34 np0005532585.localdomain sudo[44747]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:34 np0005532585.localdomain sudo[44763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zubjdnowuacsmpgtwkzqnhfyqalvbywv ; /usr/bin/python3
Nov 23 07:51:34 np0005532585.localdomain sudo[44763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:34 np0005532585.localdomain python3[44765]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:34 np0005532585.localdomain sudo[44763]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:34 np0005532585.localdomain sudo[44779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjqxkezymtsngtgkhdlgtcecfhwtsdpp ; /usr/bin/python3
Nov 23 07:51:34 np0005532585.localdomain sudo[44779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:35 np0005532585.localdomain python3[44781]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Nov 23 07:51:35 np0005532585.localdomain groupadd[44782]: group added to /etc/group: name=qemu, GID=107
Nov 23 07:51:35 np0005532585.localdomain groupadd[44782]: group added to /etc/gshadow: name=qemu
Nov 23 07:51:35 np0005532585.localdomain groupadd[44782]: new group: name=qemu, GID=107
Nov 23 07:51:35 np0005532585.localdomain sudo[44779]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:35 np0005532585.localdomain sudo[44801]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siccenhaalcjkwvqpwngqdzsnewjcwus ; /usr/bin/python3
Nov 23 07:51:35 np0005532585.localdomain sudo[44801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:35 np0005532585.localdomain python3[44803]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 23 07:51:35 np0005532585.localdomain useradd[44805]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Nov 23 07:51:35 np0005532585.localdomain sudo[44801]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:35 np0005532585.localdomain sudo[44825]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvqqghzxhttyznsvfdxjerirvtmyblue ; /usr/bin/python3
Nov 23 07:51:35 np0005532585.localdomain sudo[44825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:36 np0005532585.localdomain python3[44827]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Nov 23 07:51:36 np0005532585.localdomain sudo[44825]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:36 np0005532585.localdomain sudo[44841]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhvgophwxkxbheyugbvgyqzimjzphcmj ; /usr/bin/python3
Nov 23 07:51:36 np0005532585.localdomain sudo[44841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:36 np0005532585.localdomain python3[44843]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:36 np0005532585.localdomain sudo[44844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:51:36 np0005532585.localdomain sudo[44844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:51:36 np0005532585.localdomain sudo[44844]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:36 np0005532585.localdomain sudo[44841]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:36 np0005532585.localdomain sudo[44860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 07:51:36 np0005532585.localdomain sudo[44860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:51:36 np0005532585.localdomain sudo[44920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isjiowlygbakhmfrqsjdvpdzabalnyle ; /usr/bin/python3
Nov 23 07:51:36 np0005532585.localdomain sudo[44920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:36 np0005532585.localdomain sudo[44860]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:36 np0005532585.localdomain python3[44929]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:36 np0005532585.localdomain sudo[44920]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:36 np0005532585.localdomain sudo[44943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:51:36 np0005532585.localdomain sudo[44943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:51:36 np0005532585.localdomain sudo[44943]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:37 np0005532585.localdomain sudo[44985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:51:37 np0005532585.localdomain sudo[44985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:51:37 np0005532585.localdomain sudo[45012]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnzxiooawctobaxwospufyiuezllfaeb ; /usr/bin/python3
Nov 23 07:51:37 np0005532585.localdomain sudo[45012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:37 np0005532585.localdomain python3[45015]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884296.56821-77440-253364882993894/source _original_basename=tmpf9uzpquc follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:37 np0005532585.localdomain sudo[45012]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:37 np0005532585.localdomain sudo[45057]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljeajsjgcflohmapntwyrvlphbhryvtz ; /usr/bin/python3
Nov 23 07:51:37 np0005532585.localdomain sudo[45057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:37 np0005532585.localdomain sudo[44985]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:37 np0005532585.localdomain python3[45062]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 23 07:51:38 np0005532585.localdomain sudo[45082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:51:38 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 23 07:51:38 np0005532585.localdomain sudo[45082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:51:38 np0005532585.localdomain sudo[45082]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:38 np0005532585.localdomain sudo[45057]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:38 np0005532585.localdomain sudo[45110]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjpcrzblpydtzlpfnadijedrmbntiuci ; /usr/bin/python3
Nov 23 07:51:38 np0005532585.localdomain sudo[45110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:38 np0005532585.localdomain python3[45112]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:38 np0005532585.localdomain sudo[45110]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:38 np0005532585.localdomain sudo[45126]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffwwwnhduznnvjqrecvnsqtoqpeofkrb ; /usr/bin/python3
Nov 23 07:51:38 np0005532585.localdomain sudo[45126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:38 np0005532585.localdomain python3[45128]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:38 np0005532585.localdomain sudo[45126]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:39 np0005532585.localdomain sudo[45142]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbnzhbgmbpgkiobggblnoqyydidaquah ; /usr/bin/python3
Nov 23 07:51:39 np0005532585.localdomain sudo[45142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:39 np0005532585.localdomain python3[45144]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Nov 23 07:51:40 np0005532585.localdomain sudo[45142]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:40 np0005532585.localdomain sudo[45163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euvcrvkoibsxnfgwwfoeqqsrrgrxlxtz ; /usr/bin/python3
Nov 23 07:51:40 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 23 07:51:40 np0005532585.localdomain sudo[45163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:40 np0005532585.localdomain python3[45165]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:51:41 np0005532585.localdomain sshd[45167]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:51:43 np0005532585.localdomain sudo[45163]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:43 np0005532585.localdomain sudo[45182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkcqzrynedyzbvhwiscxydjkllcsuuwh ; /usr/bin/python3
Nov 23 07:51:43 np0005532585.localdomain sudo[45182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:44 np0005532585.localdomain python3[45184]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 07:51:44 np0005532585.localdomain sudo[45182]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:44 np0005532585.localdomain sshd[45167]: Invalid user geoserver from 34.124.148.87 port 47288
Nov 23 07:51:44 np0005532585.localdomain sshd[45167]: Received disconnect from 34.124.148.87 port 47288:11: Bye Bye [preauth]
Nov 23 07:51:44 np0005532585.localdomain sshd[45167]: Disconnected from invalid user geoserver 34.124.148.87 port 47288 [preauth]
Nov 23 07:51:44 np0005532585.localdomain sudo[45243]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntcsxkuzjzonfqevgvbpcsyhozpzdnvx ; /usr/bin/python3
Nov 23 07:51:44 np0005532585.localdomain sudo[45243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:44 np0005532585.localdomain python3[45245]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:44 np0005532585.localdomain sudo[45243]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:44 np0005532585.localdomain sudo[45259]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iufygzqtnyridndrpubbytpectsaijge ; /usr/bin/python3
Nov 23 07:51:44 np0005532585.localdomain sudo[45259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:45 np0005532585.localdomain python3[45261]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:45 np0005532585.localdomain sudo[45259]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:45 np0005532585.localdomain sudo[45320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljydqravkddnaipeuxvjqjoiaiymtnak ; /usr/bin/python3
Nov 23 07:51:45 np0005532585.localdomain sudo[45320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:45 np0005532585.localdomain python3[45322]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:45 np0005532585.localdomain sudo[45320]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:45 np0005532585.localdomain sudo[45363]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bakaftevitdtifhayelcuegqrfimexrs ; /usr/bin/python3
Nov 23 07:51:45 np0005532585.localdomain sudo[45363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:46 np0005532585.localdomain python3[45365]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884305.2939782-77877-149074396637874/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=abd00f17d9c4e4815bd9c520c8599e87c7741b66 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:46 np0005532585.localdomain sudo[45363]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:46 np0005532585.localdomain sudo[45425]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhyyecopgkzvdbfceytojkgoenccyytf ; /usr/bin/python3
Nov 23 07:51:46 np0005532585.localdomain sudo[45425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:46 np0005532585.localdomain python3[45427]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:46 np0005532585.localdomain sudo[45425]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:46 np0005532585.localdomain sudo[45470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwnuraalvretouycoycxeitnktttyuhi ; /usr/bin/python3
Nov 23 07:51:46 np0005532585.localdomain sudo[45470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:46 np0005532585.localdomain python3[45472]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884306.2207127-77927-223311503495547/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:46 np0005532585.localdomain sudo[45470]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:47 np0005532585.localdomain sudo[45500]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puvlybkqzkrlshwkhkluexdaexwagles ; /usr/bin/python3
Nov 23 07:51:47 np0005532585.localdomain sudo[45500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:47 np0005532585.localdomain python3[45502]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:47 np0005532585.localdomain sudo[45500]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:47 np0005532585.localdomain sudo[45516]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxjjbgqtozxaaunvvbjzvmhtnrvdbkmx ; /usr/bin/python3
Nov 23 07:51:47 np0005532585.localdomain sudo[45516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:47 np0005532585.localdomain python3[45518]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:47 np0005532585.localdomain sudo[45516]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:47 np0005532585.localdomain sudo[45532]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlqntlknexccxsynpiziothscsnghqhr ; /usr/bin/python3
Nov 23 07:51:47 np0005532585.localdomain sudo[45532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:47 np0005532585.localdomain python3[45534]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:47 np0005532585.localdomain sudo[45532]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:48 np0005532585.localdomain sudo[45548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfazeluaocmxyguhiytcvwxvuudpguiy ; /usr/bin/python3
Nov 23 07:51:48 np0005532585.localdomain sudo[45548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:48 np0005532585.localdomain python3[45550]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:48 np0005532585.localdomain sudo[45548]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:48 np0005532585.localdomain sudo[45596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsywcjfsnszgfqapnmvrobyrystlfpgc ; /usr/bin/python3
Nov 23 07:51:48 np0005532585.localdomain sudo[45596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:48 np0005532585.localdomain python3[45598]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:48 np0005532585.localdomain sudo[45596]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:49 np0005532585.localdomain sudo[45639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nchrjvxhuxgispcwqdivhuvzayvndtvj ; /usr/bin/python3
Nov 23 07:51:49 np0005532585.localdomain sudo[45639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:49 np0005532585.localdomain python3[45641]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884308.6139896-78040-161620703582448/source _original_basename=tmpvditmsu0 follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:49 np0005532585.localdomain sudo[45639]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:49 np0005532585.localdomain sudo[45669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgsbltpaltbyoonejefqelqqkafpauwm ; /usr/bin/python3
Nov 23 07:51:49 np0005532585.localdomain sudo[45669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:49 np0005532585.localdomain python3[45671]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:49 np0005532585.localdomain sudo[45669]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:49 np0005532585.localdomain sudo[45685]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjxxxxhooxnyadahxargtrtbcjwxbout ; /usr/bin/python3
Nov 23 07:51:49 np0005532585.localdomain sudo[45685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:50 np0005532585.localdomain python3[45687]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:51:50 np0005532585.localdomain sudo[45685]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:50 np0005532585.localdomain sudo[45701]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeqqmgpannakvvnpzdfijdpaylqbcdxo ; /usr/bin/python3
Nov 23 07:51:50 np0005532585.localdomain sudo[45701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:50 np0005532585.localdomain python3[45703]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:51:53 np0005532585.localdomain sudo[45701]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:54 np0005532585.localdomain sudo[45750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhjugcdkjekbasdfnprcvdsandqpzyai ; /usr/bin/python3
Nov 23 07:51:54 np0005532585.localdomain sudo[45750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:54 np0005532585.localdomain python3[45752]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:51:54 np0005532585.localdomain sudo[45750]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:54 np0005532585.localdomain sudo[45795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeuduzqkamekgbzvqtvkulqbrvrcdlkm ; /usr/bin/python3
Nov 23 07:51:54 np0005532585.localdomain sudo[45795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:54 np0005532585.localdomain python3[45797]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884313.8971488-78327-127124705590039/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:51:54 np0005532585.localdomain sudo[45795]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:54 np0005532585.localdomain sudo[45826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aouuqumwgzyloqdxbhlgqldkrnvaokdr ; /usr/bin/python3
Nov 23 07:51:54 np0005532585.localdomain sudo[45826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:55 np0005532585.localdomain python3[45828]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:51:55 np0005532585.localdomain sshd[1134]: Received signal 15; terminating.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: sshd.service: Consumed 4.842s CPU time, read 2.5M from disk, written 216.0K to disk.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 23 07:51:55 np0005532585.localdomain sshd[45832]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:51:55 np0005532585.localdomain sshd[45832]: Server listening on 0.0.0.0 port 22.
Nov 23 07:51:55 np0005532585.localdomain sshd[45832]: Server listening on :: port 22.
Nov 23 07:51:55 np0005532585.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 23 07:51:55 np0005532585.localdomain sudo[45826]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:55 np0005532585.localdomain sudo[45846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iagvclcbrtpxfwjjtjhstyphjfqzzlbn ; /usr/bin/python3
Nov 23 07:51:55 np0005532585.localdomain sudo[45846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:55 np0005532585.localdomain python3[45848]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:55 np0005532585.localdomain sudo[45846]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:56 np0005532585.localdomain sudo[45864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dutgnhjemhfbrfayxsfczzdorpdkwiow ; /usr/bin/python3
Nov 23 07:51:56 np0005532585.localdomain sudo[45864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:56 np0005532585.localdomain python3[45866]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:51:56 np0005532585.localdomain sudo[45864]: pam_unix(sudo:session): session closed for user root
Nov 23 07:51:56 np0005532585.localdomain sudo[45882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyweoewxjveegqeyprwwqnpnhxzjytfi ; /usr/bin/python3
Nov 23 07:51:56 np0005532585.localdomain sudo[45882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:51:57 np0005532585.localdomain python3[45884]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:51:59 np0005532585.localdomain sudo[45882]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:00 np0005532585.localdomain sudo[45931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxfplbtgprgmjnkrdaafxwommgxvxhll ; /usr/bin/python3
Nov 23 07:52:00 np0005532585.localdomain sudo[45931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:00 np0005532585.localdomain python3[45933]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:52:00 np0005532585.localdomain sudo[45931]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:00 np0005532585.localdomain sudo[45949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwosoytbgkchhqrdxwyirtimxhjvlsxw ; /usr/bin/python3
Nov 23 07:52:00 np0005532585.localdomain sudo[45949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:00 np0005532585.localdomain python3[45951]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:00 np0005532585.localdomain sudo[45949]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 07:52:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3253 writes, 16K keys, 3253 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3253 writes, 142 syncs, 22.91 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3253 writes, 16K keys, 3253 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s
                                                          Interval WAL: 3253 writes, 142 syncs, 22.91 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 07:52:01 np0005532585.localdomain sudo[45979]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klfljxvhdnvqzbochozgwuhegyfvfjqe ; /usr/bin/python3
Nov 23 07:52:01 np0005532585.localdomain sudo[45979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:01 np0005532585.localdomain python3[45981]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:52:01 np0005532585.localdomain sudo[45979]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:02 np0005532585.localdomain sudo[46029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbhzprkefyhezsmtrsrzjkzdfxkohhav ; /usr/bin/python3
Nov 23 07:52:02 np0005532585.localdomain sudo[46029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:02 np0005532585.localdomain python3[46031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:52:02 np0005532585.localdomain sudo[46029]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:02 np0005532585.localdomain sudo[46047]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isyixzguytwuibdreqgnbhchyficktfo ; /usr/bin/python3
Nov 23 07:52:02 np0005532585.localdomain sudo[46047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:02 np0005532585.localdomain python3[46049]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:02 np0005532585.localdomain sudo[46047]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:02 np0005532585.localdomain sudo[46077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgbfgjvzekotdzpebyhbmttgyoevxvxf ; /usr/bin/python3
Nov 23 07:52:02 np0005532585.localdomain sudo[46077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:02 np0005532585.localdomain python3[46079]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:52:02 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:52:03 np0005532585.localdomain systemd-rc-local-generator[46105]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:52:03 np0005532585.localdomain systemd-sysv-generator[46109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:52:03 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:52:03 np0005532585.localdomain systemd[1]: Starting chronyd online sources service...
Nov 23 07:52:03 np0005532585.localdomain chronyc[46119]: 200 OK
Nov 23 07:52:03 np0005532585.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Nov 23 07:52:03 np0005532585.localdomain systemd[1]: Finished chronyd online sources service.
Nov 23 07:52:03 np0005532585.localdomain sudo[46077]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:03 np0005532585.localdomain sudo[46133]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlmzxihxrraiulxsmflrbbqurnobmsfd ; /usr/bin/python3
Nov 23 07:52:03 np0005532585.localdomain sudo[46133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:03 np0005532585.localdomain python3[46135]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:03 np0005532585.localdomain chronyd[25967]: System clock was stepped by 0.000009 seconds
Nov 23 07:52:03 np0005532585.localdomain sudo[46133]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:03 np0005532585.localdomain sudo[46150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhkigdoiqmganhpkisodrzktomjfzmir ; /usr/bin/python3
Nov 23 07:52:03 np0005532585.localdomain sudo[46150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:04 np0005532585.localdomain python3[46152]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:04 np0005532585.localdomain sudo[46150]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:04 np0005532585.localdomain sudo[46167]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqwjzeyhygtbilhwtwomsxnnfxwlooah ; /usr/bin/python3
Nov 23 07:52:04 np0005532585.localdomain sudo[46167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:04 np0005532585.localdomain python3[46169]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:04 np0005532585.localdomain chronyd[25967]: System clock was stepped by -0.000000 seconds
Nov 23 07:52:04 np0005532585.localdomain sudo[46167]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:04 np0005532585.localdomain sudo[46184]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpcezsrhpyrlsrnpdvgcbjivjlectdtz ; /usr/bin/python3
Nov 23 07:52:04 np0005532585.localdomain sudo[46184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:04 np0005532585.localdomain python3[46186]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 07:52:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s
                                                          Interval WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 07:52:04 np0005532585.localdomain sudo[46184]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:05 np0005532585.localdomain sudo[46201]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdsopsjaiekibjdljcytlqrmjuewapau ; /usr/bin/python3
Nov 23 07:52:05 np0005532585.localdomain sudo[46201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:05 np0005532585.localdomain python3[46203]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 23 07:52:05 np0005532585.localdomain systemd[1]: Starting Time & Date Service...
Nov 23 07:52:05 np0005532585.localdomain systemd[1]: Started Time & Date Service.
Nov 23 07:52:05 np0005532585.localdomain sudo[46201]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:06 np0005532585.localdomain sudo[46221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snznatzumtnbduenoojdxwlvotznzaqa ; /usr/bin/python3
Nov 23 07:52:06 np0005532585.localdomain sudo[46221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:06 np0005532585.localdomain python3[46223]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:06 np0005532585.localdomain sudo[46221]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:07 np0005532585.localdomain sudo[46238]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkqnbcotldcdnzlwuizlccmtfsdkokxa ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 23 07:52:07 np0005532585.localdomain sudo[46238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:07 np0005532585.localdomain python3[46240]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:07 np0005532585.localdomain sudo[46238]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:07 np0005532585.localdomain sudo[46255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdxdnvvfpictedjxhwkximzsrswmwdfl ; /usr/bin/python3
Nov 23 07:52:07 np0005532585.localdomain sudo[46255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:07 np0005532585.localdomain python3[46257]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 23 07:52:07 np0005532585.localdomain sudo[46255]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:07 np0005532585.localdomain sudo[46271]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veegixbbpevgxliotyxhlcykdijcqcmd ; /usr/bin/python3
Nov 23 07:52:07 np0005532585.localdomain sudo[46271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:08 np0005532585.localdomain python3[46273]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:52:08 np0005532585.localdomain sudo[46271]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:08 np0005532585.localdomain sudo[46287]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpqfqckbprmdymwutqzbgcqstzwozqhn ; /usr/bin/python3
Nov 23 07:52:08 np0005532585.localdomain sudo[46287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:08 np0005532585.localdomain python3[46289]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:52:08 np0005532585.localdomain sudo[46287]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:08 np0005532585.localdomain sudo[46303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xumcfvniqzpdzimkkblirljodnfpvvam ; /usr/bin/python3
Nov 23 07:52:08 np0005532585.localdomain sudo[46303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:09 np0005532585.localdomain python3[46305]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:52:09 np0005532585.localdomain sudo[46303]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:09 np0005532585.localdomain sudo[46351]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgqpleihqanaqnudcfeopmtkaentnhkt ; /usr/bin/python3
Nov 23 07:52:09 np0005532585.localdomain sudo[46351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:09 np0005532585.localdomain python3[46353]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:52:09 np0005532585.localdomain sudo[46351]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:09 np0005532585.localdomain sudo[46394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcjxwegirrclwcudjtmbmapqurwpbjcr ; /usr/bin/python3
Nov 23 07:52:09 np0005532585.localdomain sudo[46394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:09 np0005532585.localdomain python3[46396]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884329.3164322-79237-87657879580995/source _original_basename=tmpkz1ja64k follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:10 np0005532585.localdomain sudo[46394]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:10 np0005532585.localdomain sudo[46456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlshuvwqwxzyuljuxisezryqkhvghbiq ; /usr/bin/python3
Nov 23 07:52:10 np0005532585.localdomain sudo[46456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:10 np0005532585.localdomain python3[46458]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:52:10 np0005532585.localdomain sudo[46456]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:10 np0005532585.localdomain sudo[46499]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bddfnneriqwypjyfrbbvkewnhdshjfth ; /usr/bin/python3
Nov 23 07:52:10 np0005532585.localdomain sudo[46499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:10 np0005532585.localdomain python3[46501]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884330.2286217-79444-156898966445303/source _original_basename=tmp0_s1gr6x follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:10 np0005532585.localdomain sudo[46499]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:11 np0005532585.localdomain sudo[46529]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqvtsbrzippnfixnioymeonpdjgssato ; /usr/bin/python3
Nov 23 07:52:11 np0005532585.localdomain sudo[46529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:11 np0005532585.localdomain python3[46531]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 07:52:11 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:52:11 np0005532585.localdomain systemd-rc-local-generator[46557]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:52:11 np0005532585.localdomain systemd-sysv-generator[46561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:52:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:52:11 np0005532585.localdomain sudo[46529]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:12 np0005532585.localdomain sudo[46583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivckcyzcavboxipkwimduveregmfnfaf ; /usr/bin/python3
Nov 23 07:52:12 np0005532585.localdomain sudo[46583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:12 np0005532585.localdomain python3[46585]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:52:12 np0005532585.localdomain sudo[46583]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:12 np0005532585.localdomain sudo[46599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjjsapdniutyovtfcjpfuqcbyaqsyusy ; /usr/bin/python3
Nov 23 07:52:12 np0005532585.localdomain sudo[46599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:12 np0005532585.localdomain python3[46601]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:12 np0005532585.localdomain sudo[46599]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:12 np0005532585.localdomain sudo[46616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzwdeqwwatcpkustgzqrbhzisveqneki ; /usr/bin/python3
Nov 23 07:52:12 np0005532585.localdomain sudo[46616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:12 np0005532585.localdomain python3[46618]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:12 np0005532585.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Nov 23 07:52:12 np0005532585.localdomain systemd[36148]: Created slice User Background Tasks Slice.
Nov 23 07:52:12 np0005532585.localdomain systemd[36148]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 07:52:12 np0005532585.localdomain systemd[36148]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 07:52:12 np0005532585.localdomain sudo[46616]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:13 np0005532585.localdomain sudo[46634]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mivflzcakcnpofqwipujipxxzhbmgovs ; /usr/bin/python3
Nov 23 07:52:13 np0005532585.localdomain sudo[46634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:13 np0005532585.localdomain python3[46636]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:52:13 np0005532585.localdomain sudo[46634]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:13 np0005532585.localdomain sudo[46650]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcplnynfmvdxjzmjulmosscbfslyttta ; /usr/bin/python3
Nov 23 07:52:13 np0005532585.localdomain sudo[46650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:13 np0005532585.localdomain python3[46652]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:13 np0005532585.localdomain sudo[46650]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:13 np0005532585.localdomain sudo[46698]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-husrfxztkpktozicwoblreboefipkrkj ; /usr/bin/python3
Nov 23 07:52:13 np0005532585.localdomain sudo[46698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:14 np0005532585.localdomain python3[46700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:52:14 np0005532585.localdomain sudo[46698]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:14 np0005532585.localdomain sudo[46741]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpkxesqrlzuriedusrzwwlvmabsyfhne ; /usr/bin/python3
Nov 23 07:52:14 np0005532585.localdomain sudo[46741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:14 np0005532585.localdomain python3[46743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884333.8525093-79628-143651833038167/source _original_basename=tmpz0knaay8 follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:14 np0005532585.localdomain sudo[46741]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:35 np0005532585.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 07:52:37 np0005532585.localdomain sudo[46773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvviibucqnuwopnntkfqomfsrgdnmnzr ; /usr/bin/python3
Nov 23 07:52:37 np0005532585.localdomain sudo[46773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:37 np0005532585.localdomain python3[46775]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:52:37 np0005532585.localdomain sudo[46773]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:37 np0005532585.localdomain sudo[46789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gklhfxwajrfexohymmejhibyybxvdttd ; /usr/bin/python3
Nov 23 07:52:37 np0005532585.localdomain sudo[46789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:38 np0005532585.localdomain python3[46791]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Nov 23 07:52:38 np0005532585.localdomain sudo[46789]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:38 np0005532585.localdomain sudo[46805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgaifpjrcfqejotixjxzwjmpesebdagd ; /usr/bin/python3
Nov 23 07:52:38 np0005532585.localdomain sudo[46805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:38 np0005532585.localdomain sudo[46808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:52:38 np0005532585.localdomain sudo[46808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:52:38 np0005532585.localdomain sudo[46808]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:38 np0005532585.localdomain python3[46807]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:52:38 np0005532585.localdomain sudo[46805]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:38 np0005532585.localdomain sudo[46823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:52:38 np0005532585.localdomain sudo[46823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:52:38 np0005532585.localdomain sudo[46851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgklibzyriexuyutccxfldxfzppqopqb ; /usr/bin/python3
Nov 23 07:52:38 np0005532585.localdomain sudo[46851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:38 np0005532585.localdomain python3[46853]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:38 np0005532585.localdomain sudo[46851]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:38 np0005532585.localdomain sudo[46898]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obyudpxicldpbczrmnycxsbazpqariiq ; /usr/bin/python3
Nov 23 07:52:38 np0005532585.localdomain sudo[46898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:38 np0005532585.localdomain sudo[46823]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:39 np0005532585.localdomain python3[46900]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:39 np0005532585.localdomain sudo[46898]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:39 np0005532585.localdomain sudo[46914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtgdjczbvjbansbctgqidxabnuyuqmvc ; /usr/bin/python3
Nov 23 07:52:39 np0005532585.localdomain sudo[46914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:39 np0005532585.localdomain python3[46916]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  Converting 2706 SID table entries...
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:52:40 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:52:40 np0005532585.localdomain sudo[46914]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:40 np0005532585.localdomain sudo[46935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbqqzqcijpphdvogrmjwpacuzvavzsai ; /usr/bin/python3
Nov 23 07:52:40 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 23 07:52:40 np0005532585.localdomain sudo[46935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:40 np0005532585.localdomain sudo[46938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:52:40 np0005532585.localdomain sudo[46938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:52:40 np0005532585.localdomain sudo[46938]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:40 np0005532585.localdomain python3[46937]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:52:40 np0005532585.localdomain sudo[46935]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:41 np0005532585.localdomain sudo[46966]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqwwvoupojuoafsvxezsemknhstyiixj ; /usr/bin/python3
Nov 23 07:52:41 np0005532585.localdomain sudo[46966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:41 np0005532585.localdomain sudo[46966]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:41 np0005532585.localdomain sudo[47014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uynsbppsnuyrhiftalgeelzhbadoblnt ; /usr/bin/python3
Nov 23 07:52:41 np0005532585.localdomain sudo[47014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:41 np0005532585.localdomain sudo[47014]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:41 np0005532585.localdomain sudo[47057]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnxeqhyjhdreaiwfyubojkfnkkhlnied ; /usr/bin/python3
Nov 23 07:52:41 np0005532585.localdomain sudo[47057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:42 np0005532585.localdomain sudo[47057]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:42 np0005532585.localdomain sudo[47087]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tchzakxbewbxcxylnpofmrpxvrhuxeql ; /usr/bin/python3
Nov 23 07:52:42 np0005532585.localdomain sudo[47087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:42 np0005532585.localdomain python3[47089]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Nov 23 07:52:42 np0005532585.localdomain sudo[47087]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:42 np0005532585.localdomain rsyslogd[760]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Nov 23 07:52:43 np0005532585.localdomain sudo[47103]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eixxtwrzwknyxwditemgdpbhbpebqwtn ; /usr/bin/python3
Nov 23 07:52:43 np0005532585.localdomain sudo[47103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:43 np0005532585.localdomain python3[47105]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:52:43 np0005532585.localdomain sudo[47103]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:43 np0005532585.localdomain sudo[47119]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nghbkldtvqxrnhnmzfegwyldlrndczon ; /usr/bin/python3
Nov 23 07:52:43 np0005532585.localdomain sudo[47119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:43 np0005532585.localdomain python3[47121]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:52:43 np0005532585.localdomain sudo[47119]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:43 np0005532585.localdomain sudo[47135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syjkhgeoodelmdvkwuouegodhicsxzdc ; /usr/bin/python3
Nov 23 07:52:43 np0005532585.localdomain sudo[47135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:44 np0005532585.localdomain python3[47137]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Nov 23 07:52:44 np0005532585.localdomain sudo[47135]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:49 np0005532585.localdomain sudo[47183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfsadeyefuarxlgxuktqndbrctdfyapz ; /usr/bin/python3
Nov 23 07:52:49 np0005532585.localdomain sudo[47183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:49 np0005532585.localdomain python3[47185]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:52:49 np0005532585.localdomain sudo[47183]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:49 np0005532585.localdomain sudo[47226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atxmlcuifptfnikcobtiaibugnnddvxa ; /usr/bin/python3
Nov 23 07:52:49 np0005532585.localdomain sudo[47226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:49 np0005532585.localdomain python3[47228]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884369.037533-81103-108734803260782/source _original_basename=tmp9zzq8u8g follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:52:49 np0005532585.localdomain sudo[47226]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:50 np0005532585.localdomain sudo[47256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thdnfkoeyteuubxfprmuoocgjsmbuvjz ; /usr/bin/python3
Nov 23 07:52:50 np0005532585.localdomain sudo[47256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:50 np0005532585.localdomain python3[47258]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:52:50 np0005532585.localdomain sudo[47256]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:51 np0005532585.localdomain sudo[47306]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcszitggbseqqgixqufgfaoaloucqzsv ; /usr/bin/python3
Nov 23 07:52:51 np0005532585.localdomain sudo[47306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:51 np0005532585.localdomain sudo[47306]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:51 np0005532585.localdomain sudo[47349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfgpxnkssngkbevanekemtxkbdkzwzmj ; /usr/bin/python3
Nov 23 07:52:51 np0005532585.localdomain sudo[47349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:51 np0005532585.localdomain sudo[47349]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:52 np0005532585.localdomain sudo[47379]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xobckskyrlbblmzsxmwyxonqvgvvljfp ; /usr/bin/python3
Nov 23 07:52:52 np0005532585.localdomain sudo[47379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:52 np0005532585.localdomain python3[47381]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:52:52 np0005532585.localdomain sudo[47379]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:53 np0005532585.localdomain sudo[47427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dniqiavbmrbvcaqukhdhysezawfzlfeb ; /usr/bin/python3
Nov 23 07:52:53 np0005532585.localdomain sudo[47427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:53 np0005532585.localdomain sudo[47427]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:53 np0005532585.localdomain sudo[47470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrqlixffgbbxkwzaiueotmjqryhuivvm ; /usr/bin/python3
Nov 23 07:52:53 np0005532585.localdomain sudo[47470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:53 np0005532585.localdomain sudo[47470]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:53 np0005532585.localdomain sudo[47500]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsgwjbjpowkizwffqhaeahxgwxskhzuo ; /usr/bin/python3
Nov 23 07:52:53 np0005532585.localdomain sudo[47500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:54 np0005532585.localdomain python3[47502]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 07:52:54 np0005532585.localdomain sudo[47500]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:56 np0005532585.localdomain sudo[47516]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auofdtzlclmrrowjkwucwopcftkkdhsb ; /usr/bin/python3
Nov 23 07:52:56 np0005532585.localdomain sudo[47516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:56 np0005532585.localdomain python3[47518]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:52:56 np0005532585.localdomain sudo[47516]: pam_unix(sudo:session): session closed for user root
Nov 23 07:52:57 np0005532585.localdomain sudo[47533]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsstuqsdiunzqdfpxghbxtjgtokixwxz ; /usr/bin/python3
Nov 23 07:52:57 np0005532585.localdomain sudo[47533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:52:57 np0005532585.localdomain python3[47535]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 07:53:01 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:53:01 np0005532585.localdomain dbus-broker-launch[18442]: Noticed file-system modification, trigger reload.
Nov 23 07:53:01 np0005532585.localdomain dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 23 07:53:01 np0005532585.localdomain dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 23 07:53:01 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:53:01 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:53:01 np0005532585.localdomain systemd[1]: Reexecuting.
Nov 23 07:53:01 np0005532585.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 07:53:01 np0005532585.localdomain systemd[1]: Detected virtualization kvm.
Nov 23 07:53:01 np0005532585.localdomain systemd[1]: Detected architecture x86-64.
Nov 23 07:53:01 np0005532585.localdomain systemd-sysv-generator[47593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:53:01 np0005532585.localdomain systemd-rc-local-generator[47589]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:53:01 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  Converting 2706 SID table entries...
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 07:53:09 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 07:53:09 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:53:09 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 23 07:53:09 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:53:11 np0005532585.localdomain systemd-sysv-generator[47692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:53:11 np0005532585.localdomain systemd-rc-local-generator[47689]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 07:53:11 np0005532585.localdomain systemd-journald[618]: Journal stopped
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Stopping Journal Service...
Nov 23 07:53:11 np0005532585.localdomain systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Stopped Journal Service.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: systemd-journald.service: Consumed 1.962s CPU time.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Starting Journal Service...
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: systemd-udevd.service: Consumed 2.852s CPU time.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 07:53:11 np0005532585.localdomain systemd-journald[48157]: Journal started
Nov 23 07:53:11 np0005532585.localdomain systemd-journald[48157]: Runtime Journal (/run/log/journal/6e0090cd4cf296f54418e234b90f721c) is 12.3M, max 314.7M, 302.4M free.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Started Journal Service.
Nov 23 07:53:11 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 23 07:53:11 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 07:53:11 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 07:53:11 np0005532585.localdomain systemd-udevd[48163]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 07:53:11 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:53:11 np0005532585.localdomain systemd-rc-local-generator[48749]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:53:11 np0005532585.localdomain systemd-sysv-generator[48753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:53:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:53:12 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 07:53:12 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 07:53:12 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 07:53:12 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.109s CPU time.
Nov 23 07:53:12 np0005532585.localdomain systemd[1]: run-r3fdf4ffa0e604537bfc819f4c898c12f.service: Deactivated successfully.
Nov 23 07:53:12 np0005532585.localdomain systemd[1]: run-r655657c2d4a944c198df5810dd39d490.service: Deactivated successfully.
Nov 23 07:53:13 np0005532585.localdomain sudo[47533]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:13 np0005532585.localdomain sudo[49022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmfiddhsxmrwmrdqkdikmobvsuusmxio ; /usr/bin/python3
Nov 23 07:53:13 np0005532585.localdomain sudo[49022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:13 np0005532585.localdomain python3[49024]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Nov 23 07:53:13 np0005532585.localdomain sudo[49022]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:14 np0005532585.localdomain sudo[49041]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeiglobrlampuppbeiiozxvecopzeypo ; /usr/bin/python3
Nov 23 07:53:14 np0005532585.localdomain sudo[49041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:14 np0005532585.localdomain python3[49043]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 07:53:14 np0005532585.localdomain sudo[49041]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:15 np0005532585.localdomain sudo[49059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbyvfhcnbtlltcfhrrxwynskzezlevzl ; /usr/bin/python3
Nov 23 07:53:15 np0005532585.localdomain sudo[49059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:15 np0005532585.localdomain python3[49061]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:53:15 np0005532585.localdomain python3[49061]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Nov 23 07:53:15 np0005532585.localdomain python3[49061]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Nov 23 07:53:22 np0005532585.localdomain podman[49075]: 2025-11-23 07:53:15.775573476 +0000 UTC m=+0.029358083 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 07:53:22 np0005532585.localdomain python3[49061]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Nov 23 07:53:22 np0005532585.localdomain sudo[49059]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:22 np0005532585.localdomain sudo[49172]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjcylcyxijpncssgjbgjddqusrsprtqo ; /usr/bin/python3
Nov 23 07:53:22 np0005532585.localdomain sudo[49172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:23 np0005532585.localdomain python3[49174]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:53:23 np0005532585.localdomain python3[49174]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Nov 23 07:53:23 np0005532585.localdomain python3[49174]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Nov 23 07:53:29 np0005532585.localdomain podman[49186]: 2025-11-23 07:53:23.128911969 +0000 UTC m=+0.046555098 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 07:53:29 np0005532585.localdomain python3[49174]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Nov 23 07:53:29 np0005532585.localdomain sudo[49172]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:29 np0005532585.localdomain sudo[49286]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoajaenjjfusldzpytnpcagdrmdvgbzc ; /usr/bin/python3
Nov 23 07:53:29 np0005532585.localdomain sudo[49286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:30 np0005532585.localdomain python3[49288]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:53:30 np0005532585.localdomain python3[49288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Nov 23 07:53:30 np0005532585.localdomain python3[49288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Nov 23 07:53:40 np0005532585.localdomain sudo[49534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:53:40 np0005532585.localdomain sudo[49534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:53:40 np0005532585.localdomain sudo[49534]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:40 np0005532585.localdomain sudo[49549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 07:53:40 np0005532585.localdomain sudo[49549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:53:45 np0005532585.localdomain podman[49300]: 2025-11-23 07:53:30.293209837 +0000 UTC m=+0.047090522 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 07:53:45 np0005532585.localdomain python3[49288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Nov 23 07:53:45 np0005532585.localdomain sudo[49286]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:46 np0005532585.localdomain sudo[49656]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfrpcqvxexaoyozwnupunobneqdnwjrb ; /usr/bin/python3
Nov 23 07:53:46 np0005532585.localdomain sudo[49656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:46 np0005532585.localdomain systemd[1]: tmp-crun.Prr2vM.mount: Deactivated successfully.
Nov 23 07:53:46 np0005532585.localdomain python3[49667]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:53:46 np0005532585.localdomain python3[49667]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Nov 23 07:53:46 np0005532585.localdomain podman[49660]: 2025-11-23 07:53:46.301955832 +0000 UTC m=+0.193746378 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, release=553, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 07:53:46 np0005532585.localdomain python3[49667]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Nov 23 07:53:46 np0005532585.localdomain podman[49660]: 2025-11-23 07:53:46.404162044 +0000 UTC m=+0.295952600 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container)
Nov 23 07:53:46 np0005532585.localdomain sudo[49549]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:47 np0005532585.localdomain sudo[49751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:53:47 np0005532585.localdomain sudo[49751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:53:47 np0005532585.localdomain sudo[49751]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:47 np0005532585.localdomain sudo[49783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:53:47 np0005532585.localdomain sudo[49783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:53:48 np0005532585.localdomain sudo[49783]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:49 np0005532585.localdomain sudo[49839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:53:49 np0005532585.localdomain sudo[49839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:53:49 np0005532585.localdomain sudo[49839]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:59 np0005532585.localdomain podman[49692]: 2025-11-23 07:53:46.368989487 +0000 UTC m=+0.023558761 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 07:53:59 np0005532585.localdomain python3[49667]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Nov 23 07:53:59 np0005532585.localdomain sudo[49656]: pam_unix(sudo:session): session closed for user root
Nov 23 07:53:59 np0005532585.localdomain sudo[49931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmycebrpcveyifzczxonwrphyplrskxj ; /usr/bin/python3
Nov 23 07:53:59 np0005532585.localdomain sudo[49931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:53:59 np0005532585.localdomain python3[49933]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:53:59 np0005532585.localdomain python3[49933]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Nov 23 07:53:59 np0005532585.localdomain python3[49933]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Nov 23 07:54:07 np0005532585.localdomain podman[49945]: 2025-11-23 07:53:59.476287292 +0000 UTC m=+0.033104150 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 23 07:54:07 np0005532585.localdomain python3[49933]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Nov 23 07:54:07 np0005532585.localdomain sudo[49931]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:07 np0005532585.localdomain sudo[50103]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezpvlgmihxzjbwjoaqxzgwfmjnxbrxnb ; /usr/bin/python3
Nov 23 07:54:07 np0005532585.localdomain sudo[50103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:07 np0005532585.localdomain python3[50105]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:54:07 np0005532585.localdomain python3[50105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Nov 23 07:54:07 np0005532585.localdomain python3[50105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Nov 23 07:54:12 np0005532585.localdomain podman[50118]: 2025-11-23 07:54:07.879180255 +0000 UTC m=+0.050725529 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 07:54:12 np0005532585.localdomain python3[50105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Nov 23 07:54:12 np0005532585.localdomain sudo[50103]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:12 np0005532585.localdomain sudo[50193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wiomggujyfsnuhachjsuqaxyzhepaele ; /usr/bin/python3
Nov 23 07:54:12 np0005532585.localdomain sudo[50193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:12 np0005532585.localdomain python3[50195]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:54:12 np0005532585.localdomain python3[50195]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Nov 23 07:54:12 np0005532585.localdomain python3[50195]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Nov 23 07:54:14 np0005532585.localdomain podman[50209]: 2025-11-23 07:54:12.536076503 +0000 UTC m=+0.041477251 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 23 07:54:14 np0005532585.localdomain python3[50195]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Nov 23 07:54:14 np0005532585.localdomain sudo[50193]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:14 np0005532585.localdomain sudo[50286]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvcxhvrodfvexdmlvonjhpbhyindprpm ; /usr/bin/python3
Nov 23 07:54:14 np0005532585.localdomain sudo[50286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:15 np0005532585.localdomain python3[50288]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:54:15 np0005532585.localdomain python3[50288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Nov 23 07:54:15 np0005532585.localdomain python3[50288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Nov 23 07:54:17 np0005532585.localdomain podman[50300]: 2025-11-23 07:54:15.275129277 +0000 UTC m=+0.044555270 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 23 07:54:17 np0005532585.localdomain python3[50288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Nov 23 07:54:17 np0005532585.localdomain sudo[50286]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:18 np0005532585.localdomain sudo[50374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwqalbbqntifdwvkpwuchrwbcsnwtald ; /usr/bin/python3
Nov 23 07:54:18 np0005532585.localdomain sudo[50374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:18 np0005532585.localdomain python3[50376]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:54:18 np0005532585.localdomain python3[50376]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Nov 23 07:54:18 np0005532585.localdomain python3[50376]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Nov 23 07:54:20 np0005532585.localdomain podman[50388]: 2025-11-23 07:54:18.363131752 +0000 UTC m=+0.041716238 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 23 07:54:20 np0005532585.localdomain python3[50376]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Nov 23 07:54:20 np0005532585.localdomain sudo[50374]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:21 np0005532585.localdomain sudo[50465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lztomwlohfpfiwdfxrzsuimewburoahm ; /usr/bin/python3
Nov 23 07:54:21 np0005532585.localdomain sudo[50465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:21 np0005532585.localdomain python3[50467]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:54:21 np0005532585.localdomain python3[50467]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Nov 23 07:54:21 np0005532585.localdomain python3[50467]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Nov 23 07:54:25 np0005532585.localdomain podman[50480]: 2025-11-23 07:54:21.411231988 +0000 UTC m=+0.044006212 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 23 07:54:25 np0005532585.localdomain python3[50467]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Nov 23 07:54:25 np0005532585.localdomain sudo[50465]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:25 np0005532585.localdomain sudo[50568]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukvulmmgakkvmrafelboslwoxbfqqukq ; /usr/bin/python3
Nov 23 07:54:25 np0005532585.localdomain sudo[50568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:25 np0005532585.localdomain python3[50570]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 07:54:25 np0005532585.localdomain python3[50570]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Nov 23 07:54:25 np0005532585.localdomain python3[50570]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Nov 23 07:54:27 np0005532585.localdomain podman[50583]: 2025-11-23 07:54:25.473862262 +0000 UTC m=+0.041855693 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 23 07:54:27 np0005532585.localdomain python3[50570]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Nov 23 07:54:27 np0005532585.localdomain sudo[50568]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:27 np0005532585.localdomain sudo[50659]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxtdgovbtqxpomdakaosdtghimjsqief ; /usr/bin/python3
Nov 23 07:54:27 np0005532585.localdomain sudo[50659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:28 np0005532585.localdomain python3[50661]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:54:28 np0005532585.localdomain sudo[50659]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:28 np0005532585.localdomain sudo[50709]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlrnuhhcifmuyvxqxhpxouudccfxfmvw ; /usr/bin/python3
Nov 23 07:54:28 np0005532585.localdomain sudo[50709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:28 np0005532585.localdomain sudo[50709]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:29 np0005532585.localdomain sudo[50727]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lergraagyktbxeeiedstigcweycvyymh ; /usr/bin/python3
Nov 23 07:54:29 np0005532585.localdomain sudo[50727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:29 np0005532585.localdomain sudo[50727]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:29 np0005532585.localdomain sudo[50831]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjxzguaajzwpnsbyikbcxsvvxlhblgvz ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884469.4670262-83773-211607189789818/async_wrapper.py 702276533013 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884469.4670262-83773-211607189789818/AnsiballZ_command.py _
Nov 23 07:54:29 np0005532585.localdomain sudo[50831]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 07:54:30 np0005532585.localdomain ansible-async_wrapper.py[50833]: Invoked with 702276533013 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884469.4670262-83773-211607189789818/AnsiballZ_command.py _
Nov 23 07:54:30 np0005532585.localdomain ansible-async_wrapper.py[50836]: Starting module and watcher
Nov 23 07:54:30 np0005532585.localdomain ansible-async_wrapper.py[50836]: Start watching 50837 (3600)
Nov 23 07:54:30 np0005532585.localdomain ansible-async_wrapper.py[50837]: Start module (50837)
Nov 23 07:54:30 np0005532585.localdomain ansible-async_wrapper.py[50833]: Return async_wrapper task started.
Nov 23 07:54:30 np0005532585.localdomain sudo[50831]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:30 np0005532585.localdomain sudo[50855]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuntdlxrikqbpexokunzlvztqxxvisjj ; /usr/bin/python3
Nov 23 07:54:30 np0005532585.localdomain sudo[50855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:30 np0005532585.localdomain python3[50857]: ansible-ansible.legacy.async_status Invoked with jid=702276533013.50833 mode=status _async_dir=/tmp/.ansible_async
Nov 23 07:54:30 np0005532585.localdomain sudo[50855]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]:    (file & line not available)
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]:    (file & line not available)
Nov 23 07:54:33 np0005532585.localdomain puppet-user[50841]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.17 seconds
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Notice: Applied catalog in 0.05 seconds
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Application:
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:    Initial environment: production
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:    Converged environment: production
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:          Run mode: user
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Changes:
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:             Total: 3
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Events:
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:           Success: 3
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:             Total: 3
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Resources:
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:           Changed: 3
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:       Out of sync: 3
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:             Total: 10
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Time:
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:          Schedule: 0.00
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:              File: 0.00
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:            Augeas: 0.02
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:              Exec: 0.02
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:    Transaction evaluation: 0.05
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:    Catalog application: 0.05
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:    Config retrieval: 0.20
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:          Last run: 1763884474
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:        Filebucket: 0.00
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:             Total: 0.05
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]: Version:
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:            Config: 1763884473
Nov 23 07:54:34 np0005532585.localdomain puppet-user[50841]:            Puppet: 7.10.0
Nov 23 07:54:34 np0005532585.localdomain ansible-async_wrapper.py[50837]: Module complete (50837)
Nov 23 07:54:35 np0005532585.localdomain ansible-async_wrapper.py[50836]: Done in kid B.
Nov 23 07:54:40 np0005532585.localdomain sudo[50983]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlqkejxurormpjgyhacdovldxvsmecgz ; /usr/bin/python3
Nov 23 07:54:40 np0005532585.localdomain sudo[50983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:40 np0005532585.localdomain python3[50985]: ansible-ansible.legacy.async_status Invoked with jid=702276533013.50833 mode=status _async_dir=/tmp/.ansible_async
Nov 23 07:54:40 np0005532585.localdomain sudo[50983]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:41 np0005532585.localdomain sudo[50999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efhhyhbkdwimlzgiecsnbqmmyqushzgy ; /usr/bin/python3
Nov 23 07:54:41 np0005532585.localdomain sudo[50999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:41 np0005532585.localdomain python3[51001]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:54:41 np0005532585.localdomain sudo[50999]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:41 np0005532585.localdomain sudo[51015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfxznhjjghglucybilxumgyhbsoasmzx ; /usr/bin/python3
Nov 23 07:54:41 np0005532585.localdomain sudo[51015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:41 np0005532585.localdomain python3[51017]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:54:41 np0005532585.localdomain sudo[51015]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:42 np0005532585.localdomain sudo[51063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwfumdhsihyldkvsvjqmhafxeuuupzzj ; /usr/bin/python3
Nov 23 07:54:42 np0005532585.localdomain sudo[51063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:42 np0005532585.localdomain python3[51065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:54:42 np0005532585.localdomain sudo[51063]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:42 np0005532585.localdomain sudo[51106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmgwjaespwceqrrvyitywgliuqiubzzt ; /usr/bin/python3
Nov 23 07:54:42 np0005532585.localdomain sudo[51106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:42 np0005532585.localdomain python3[51108]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884481.9806762-84065-154904079588825/source _original_basename=tmpbxs3tk95 follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 07:54:42 np0005532585.localdomain sudo[51106]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:42 np0005532585.localdomain sudo[51136]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orzupeeimpwcuevejgzvpddlhbxwxnss ; /usr/bin/python3
Nov 23 07:54:42 np0005532585.localdomain sudo[51136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:43 np0005532585.localdomain python3[51138]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:54:43 np0005532585.localdomain sudo[51136]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:43 np0005532585.localdomain sudo[51152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffjltywfzrlalvpajakhhwwifqtcvnfz ; /usr/bin/python3
Nov 23 07:54:43 np0005532585.localdomain sudo[51152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:43 np0005532585.localdomain sudo[51152]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:44 np0005532585.localdomain sudo[51240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzztgikobdtneusanltbatvucrtyhrdt ; /usr/bin/python3
Nov 23 07:54:44 np0005532585.localdomain sudo[51240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:44 np0005532585.localdomain python3[51242]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 07:54:44 np0005532585.localdomain sudo[51240]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:44 np0005532585.localdomain sudo[51259]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlubmkujlmewgtycbxkfgvulzgxtjtbe ; /usr/bin/python3
Nov 23 07:54:44 np0005532585.localdomain sudo[51259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:44 np0005532585.localdomain python3[51261]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 07:54:44 np0005532585.localdomain sudo[51259]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:45 np0005532585.localdomain sudo[51275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuuwmmdipwejoelmiihmbinxohtodiye ; /usr/bin/python3
Nov 23 07:54:45 np0005532585.localdomain sudo[51275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:45 np0005532585.localdomain python3[51277]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005532585 step=1 update_config_hash_only=False
Nov 23 07:54:45 np0005532585.localdomain sudo[51275]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:45 np0005532585.localdomain sudo[51291]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujegkoegmxigodrrjtychasswwbzkuxe ; /usr/bin/python3
Nov 23 07:54:45 np0005532585.localdomain sudo[51291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:45 np0005532585.localdomain python3[51293]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:54:45 np0005532585.localdomain sudo[51291]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:46 np0005532585.localdomain sudo[51307]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tomggokfgbmisamaxlttygofhpvnwazz ; /usr/bin/python3
Nov 23 07:54:46 np0005532585.localdomain sudo[51307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:46 np0005532585.localdomain python3[51309]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 07:54:46 np0005532585.localdomain sudo[51307]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:47 np0005532585.localdomain sudo[51323]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwqbnefbyoeaeghnyezincsshnitcpbm ; /usr/bin/python3
Nov 23 07:54:47 np0005532585.localdomain sudo[51323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:47 np0005532585.localdomain python3[51325]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 07:54:47 np0005532585.localdomain sudo[51323]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:48 np0005532585.localdomain sudo[51365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgzsrcskrmmdeiudmokryedzxnsmzqvi ; /usr/bin/python3
Nov 23 07:54:48 np0005532585.localdomain sudo[51365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:54:48 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 07:54:48 np0005532585.localdomain podman[51559]: 2025-11-23 07:54:48.649483492 +0000 UTC m=+0.058567890 container create f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=container-puppet-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 07:54:48 np0005532585.localdomain podman[51567]: 2025-11-23 07:54:48.671050564 +0000 UTC m=+0.071105092 container create 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_puppet_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, container_name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 23 07:54:48 np0005532585.localdomain systemd[1]: Started libpod-conmon-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope.
Nov 23 07:54:48 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:48 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:48 np0005532585.localdomain podman[51559]: 2025-11-23 07:54:48.702498863 +0000 UTC m=+0.111583261 container init f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd)
Nov 23 07:54:48 np0005532585.localdomain podman[51581]: 2025-11-23 07:54:48.708045331 +0000 UTC m=+0.095029970 container create 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, config_id=tripleo_puppet_step1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Nov 23 07:54:48 np0005532585.localdomain systemd[1]: Started libpod-conmon-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope.
Nov 23 07:54:48 np0005532585.localdomain podman[51559]: 2025-11-23 07:54:48.618533889 +0000 UTC m=+0.027618317 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 07:54:48 np0005532585.localdomain podman[51561]: 2025-11-23 07:54:48.72363081 +0000 UTC m=+0.125971142 container create 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, tcib_managed=true, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude 
tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Nov 23 07:54:48 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:48 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:48 np0005532585.localdomain podman[51561]: 2025-11-23 07:54:48.632968972 +0000 UTC m=+0.035309304 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 07:54:48 np0005532585.localdomain systemd[1]: Started libpod-conmon-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope.
Nov 23 07:54:48 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:48 np0005532585.localdomain podman[51567]: 2025-11-23 07:54:48.641826206 +0000 UTC m=+0.041880754 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 07:54:48 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:48 np0005532585.localdomain podman[51582]: 2025-11-23 07:54:48.645555776 +0000 UTC m=+0.034992664 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 23 07:54:48 np0005532585.localdomain podman[51581]: 2025-11-23 07:54:48.650074241 +0000 UTC m=+0.037058910 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 23 07:54:49 np0005532585.localdomain sudo[51646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:54:49 np0005532585.localdomain sudo[51646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:54:49 np0005532585.localdomain sudo[51646]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:49 np0005532585.localdomain sudo[51661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:54:49 np0005532585.localdomain sudo[51661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:54:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope.
Nov 23 07:54:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:49 np0005532585.localdomain podman[51582]: 2025-11-23 07:54:49.86885633 +0000 UTC m=+1.258293248 container create 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, architecture=x86_64, 
managed_by=tripleo_ansible, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, container_name=container-puppet-iscsid)
Nov 23 07:54:49 np0005532585.localdomain podman[51567]: 2025-11-23 07:54:49.901112285 +0000 UTC m=+1.301166823 container init 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, container_name=container-puppet-metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, config_id=tripleo_puppet_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 07:54:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope.
Nov 23 07:54:49 np0005532585.localdomain podman[51581]: 2025-11-23 07:54:49.922077657 +0000 UTC m=+1.309062366 container init 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20251118.1, name=rhosp17/openstack-cron, config_id=tripleo_puppet_step1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.12, container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=)
Nov 23 07:54:49 np0005532585.localdomain podman[51581]: 2025-11-23 07:54:49.936237602 +0000 UTC m=+1.323222241 container start 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_puppet_step1, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-crond, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z)
Nov 23 07:54:49 np0005532585.localdomain podman[51581]: 2025-11-23 07:54:49.936457828 +0000 UTC m=+1.323442557 container attach 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, container_name=container-puppet-crond, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_puppet_step1, tcib_managed=true, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12)
Nov 23 07:54:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:49 np0005532585.localdomain podman[51582]: 2025-11-23 07:54:49.956956377 +0000 UTC m=+1.346393265 container init 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_puppet_step1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=)
Nov 23 07:54:49 np0005532585.localdomain podman[51567]: 2025-11-23 07:54:49.963459585 +0000 UTC m=+1.363514153 container start 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, container_name=container-puppet-metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 07:54:49 np0005532585.localdomain podman[51567]: 2025-11-23 07:54:49.963797726 +0000 UTC m=+1.363852294 container attach 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, container_name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1)
Nov 23 07:54:49 np0005532585.localdomain podman[51582]: 2025-11-23 07:54:49.968088474 +0000 UTC m=+1.357525352 container start 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_id=tripleo_puppet_step1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 07:54:49 np0005532585.localdomain podman[51582]: 2025-11-23 07:54:49.96826935 +0000 UTC m=+1.357706268 container attach 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 07:54:50 np0005532585.localdomain podman[51561]: 2025-11-23 07:54:50.003975295 +0000 UTC m=+1.406315637 container init 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude 
tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, version=17.1.12, container_name=container-puppet-nova_libvirt, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt)
Nov 23 07:54:50 np0005532585.localdomain podman[51561]: 2025-11-23 07:54:50.010830235 +0000 UTC m=+1.413170567 container start 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, maintainer=OpenStack TripleO Team, container_name=container-puppet-nova_libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron 
plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 07:54:50 np0005532585.localdomain podman[51561]: 2025-11-23 07:54:50.013017615 +0000 UTC m=+1.415357957 container attach 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 07:54:50 np0005532585.localdomain podman[51559]: 2025-11-23 07:54:50.022486529 +0000 UTC m=+1.431570937 container start f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
config_id=tripleo_puppet_step1, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, container_name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:51:28Z)
Nov 23 07:54:50 np0005532585.localdomain podman[51559]: 2025-11-23 07:54:50.022910452 +0000 UTC m=+1.431994870 container attach f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, container_name=container-puppet-collectd, config_id=tripleo_puppet_step1, vcs-type=git, name=rhosp17/openstack-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 23 07:54:50 np0005532585.localdomain sudo[51661]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:50 np0005532585.localdomain podman[51471]: 2025-11-23 07:54:48.535512185 +0000 UTC m=+0.034105895 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 23 07:54:50 np0005532585.localdomain podman[51833]: 2025-11-23 07:54:50.733796308 +0000 UTC m=+0.025984784 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 23 07:54:50 np0005532585.localdomain podman[51833]: 2025-11-23 07:54:50.760472955 +0000 UTC m=+0.052661401 container create 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, build-date=2025-11-19T00:11:59Z, description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 
'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 23 07:54:50 np0005532585.localdomain systemd[1]: Started libpod-conmon-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope.
Nov 23 07:54:50 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4af87402ffde9ee67f7ec4afd129c011169c73f7a6b2401b07ae2b65c68e82e0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:50 np0005532585.localdomain podman[51833]: 2025-11-23 07:54:50.802418019 +0000 UTC m=+0.094606505 container init 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=container-puppet-ceilometer, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Nov 23 07:54:50 np0005532585.localdomain podman[51833]: 2025-11-23 07:54:50.812953597 +0000 UTC m=+0.105142093 container start 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, container_name=container-puppet-ceilometer, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4)
Nov 23 07:54:50 np0005532585.localdomain podman[51833]: 2025-11-23 07:54:50.813172744 +0000 UTC m=+0.105361230 container attach 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:11:59Z, name=rhosp17/openstack-ceilometer-central, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-central-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51721]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51721]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51721]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51721]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain ovs-vsctl[52057]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51721]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51721]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.08 seconds
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: Accepting previously invalid value for target type 'Integer'
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Nov 23 07:54:51 np0005532585.localdomain crontab[52183]: (root) LIST (root)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.13 seconds
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51787]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51787]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:51 np0005532585.localdomain crontab[52187]: (root) REPLACE (root)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51787]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51787]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Notice: Applied catalog in 0.05 seconds
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Application:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    Initial environment: production
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    Converged environment: production
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:          Run mode: user
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Changes:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:             Total: 2
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Events:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:           Success: 2
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:             Total: 2
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Resources:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:           Changed: 2
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:       Out of sync: 2
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:           Skipped: 7
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:             Total: 9
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Time:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:              Cron: 0.01
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:              File: 0.02
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    Transaction evaluation: 0.05
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    Catalog application: 0.05
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:    Config retrieval: 0.10
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:          Last run: 1763884491
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:             Total: 0.05
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]: Version:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:            Config: 1763884491
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51761]:            Puppet: 7.10.0
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Nov 23 07:54:51 np0005532585.localdomain sudo[52184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:54:51 np0005532585.localdomain sudo[52184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51787]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51787]:    (file & line not available)
Nov 23 07:54:51 np0005532585.localdomain sudo[52184]: pam_unix(sudo:session): session closed for user root
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}6cc35a9378c83bbb3e141511ca4580116e7dbe45274752dd8576577f368bbe29'
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Notice: Applied catalog in 0.03 seconds
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Application:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    Initial environment: production
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    Converged environment: production
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:          Run mode: user
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Changes:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:             Total: 7
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Events:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:           Success: 7
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:             Total: 7
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Resources:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:           Skipped: 13
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:           Changed: 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:       Out of sync: 5
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:             Total: 20
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Time:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:              File: 0.02
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    Transaction evaluation: 0.03
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    Catalog application: 0.03
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:    Config retrieval: 0.16
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:          Last run: 1763884491
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:             Total: 0.03
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]: Version:
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:            Config: 1763884491
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51752]:            Puppet: 7.10.0
Nov 23 07:54:51 np0005532585.localdomain puppet-user[51770]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.10 seconds
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.37 seconds
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: in a future release. Use nova::cinder::os_region_name instead
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: in a future release. Use nova::cinder::catalog_info instead
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope: Consumed 2.053s CPU time.
Nov 23 07:54:52 np0005532585.localdomain podman[51581]: 2025-11-23 07:54:52.140854487 +0000 UTC m=+3.527839136 container died 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, container_name=container-puppet-crond, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: tmp-crun.pLZQoj.mount: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope: Consumed 2.128s CPU time.
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b-merged.mount: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Nov 23 07:54:52 np0005532585.localdomain podman[52267]: 2025-11-23 07:54:52.260274909 +0000 UTC m=+0.111551610 container cleanup 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, 
url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_puppet_step1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-conmon-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Nov 23 07:54:52 np0005532585.localdomain podman[51567]: 2025-11-23 07:54:52.270814357 +0000 UTC m=+3.670868895 container died 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, distribution-scope=public)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}0d4e701b7b2398bbf396579a0713d46d3c496c79edc52f2e260456f359c9a46c'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Nov 23 07:54:52 np0005532585.localdomain podman[52293]: 2025-11-23 07:54:52.315292504 +0000 UTC m=+0.088281053 container cleanup 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:46Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-conmon-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Nov 23 07:54:52 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Notice: Applied catalog in 0.24 seconds
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Application:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:    Initial environment: production
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:    Converged environment: production
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:          Run mode: user
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Changes:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:             Total: 43
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Events:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:           Success: 43
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:             Total: 43
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Resources:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:           Skipped: 14
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:           Changed: 38
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:       Out of sync: 38
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:             Total: 82
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Time:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:       Concat file: 0.00
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:              File: 0.10
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:    Transaction evaluation: 0.24
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:    Catalog application: 0.24
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:    Config retrieval: 0.45
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:          Last run: 1763884492
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:    Concat fragment: 0.00
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:             Total: 0.24
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]: Version:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:            Config: 1763884491
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51721]:            Puppet: 7.10.0
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Notice: Applied catalog in 0.48 seconds
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Application:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:    Initial environment: production
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:    Converged environment: production
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:          Run mode: user
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Changes:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:             Total: 4
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Events:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:           Success: 4
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:             Total: 4
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Resources:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:           Changed: 4
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:       Out of sync: 4
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:           Skipped: 8
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:             Total: 13
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Time:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:              File: 0.00
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:              Exec: 0.05
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:    Config retrieval: 0.13
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:            Augeas: 0.42
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:    Transaction evaluation: 0.47
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:    Catalog application: 0.48
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:          Last run: 1763884492
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:             Total: 0.48
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]: Version:
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:            Config: 1763884491
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51770]:            Puppet: 7.10.0
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51787]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87-merged.mount: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain podman[52438]: 2025-11-23 07:54:52.696582826 +0000 UTC m=+0.075484903 container create 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, release=1761123044, config_id=tripleo_puppet_step1, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope: Consumed 2.695s CPU time.
Nov 23 07:54:52 np0005532585.localdomain podman[51559]: 2025-11-23 07:54:52.720806153 +0000 UTC m=+4.129890581 container died f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: Started libpod-conmon-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope: Consumed 2.638s CPU time.
Nov 23 07:54:52 np0005532585.localdomain podman[52452]: 2025-11-23 07:54:52.73691419 +0000 UTC m=+0.102080906 container create a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, 
build-date=2025-11-18T23:34:05Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=container-puppet-ovn_controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true)
Nov 23 07:54:52 np0005532585.localdomain podman[51582]: 2025-11-23 07:54:52.739873414 +0000 UTC m=+4.129310302 container died 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, 
com.redhat.component=openstack-iscsid-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=container-puppet-iscsid, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:52 np0005532585.localdomain podman[52438]: 2025-11-23 07:54:52.648429551 +0000 UTC m=+0.027331658 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 23 07:54:52 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:52 np0005532585.localdomain podman[52438]: 2025-11-23 07:54:52.7575047 +0000 UTC m=+0.136406767 container init 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, container_name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, config_id=tripleo_puppet_step1, tcib_managed=true, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team)
Nov 23 07:54:52 np0005532585.localdomain podman[52438]: 2025-11-23 07:54:52.767090328 +0000 UTC m=+0.145992395 container start 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 07:54:52 np0005532585.localdomain podman[52438]: 2025-11-23 07:54:52.767290884 +0000 UTC m=+0.146192951 container attach 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog)
Nov 23 07:54:52 np0005532585.localdomain podman[52452]: 2025-11-23 07:54:52.689604992 +0000 UTC m=+0.054771758 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51866]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51866]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51866]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51866]:    (file & line not available)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: Started libpod-conmon-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope.
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:52 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:52 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51866]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:52 np0005532585.localdomain puppet-user[51866]:    (file & line not available)
Nov 23 07:54:52 np0005532585.localdomain podman[52452]: 2025-11-23 07:54:52.873843252 +0000 UTC m=+0.239009968 container init a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 23 07:54:52 np0005532585.localdomain podman[52530]: 2025-11-23 07:54:52.877987185 +0000 UTC m=+0.128451721 container cleanup 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, container_name=container-puppet-iscsid, release=1761123044)
Nov 23 07:54:52 np0005532585.localdomain podman[52452]: 2025-11-23 07:54:52.882515181 +0000 UTC m=+0.247681887 container start a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, distribution-scope=public, container_name=container-puppet-ovn_controller, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 07:54:52 np0005532585.localdomain podman[52452]: 2025-11-23 07:54:52.882732627 +0000 UTC m=+0.247899373 container attach a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-conmon-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 23 07:54:52 np0005532585.localdomain podman[52511]: 2025-11-23 07:54:52.915794398 +0000 UTC m=+0.183221269 container cleanup f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-collectd, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64)
Nov 23 07:54:52 np0005532585.localdomain systemd[1]: libpod-conmon-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope: Deactivated successfully.
Nov 23 07:54:52 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 1.33 seconds
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.42 seconds
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}3fd4b82820ca431560a9101649124ba519ce5d6bf5755c5a232928b76e10eb6c'
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Warning: Empty environment setting 'TLS_PASSWORD'
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}bf4205704c2ce3336692c7289c134cb4f34ad9637d3b2e0917c09fb097bf6f77'
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246-merged.mount: Deactivated successfully.
Nov 23 07:54:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310-merged.mount: Deactivated successfully.
Nov 23 07:54:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Notice: Applied catalog in 0.43 seconds
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Application:
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:    Initial environment: production
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:    Converged environment: production
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:          Run mode: user
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Changes:
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:             Total: 31
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Events:
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:           Success: 31
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:             Total: 31
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Resources:
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:           Skipped: 22
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:           Changed: 31
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:       Out of sync: 31
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:             Total: 151
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Time:
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:           Package: 0.02
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:    Ceilometer config: 0.34
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:    Transaction evaluation: 0.42
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:    Catalog application: 0.43
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:    Config retrieval: 0.49
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:          Last run: 1763884493
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:         Resources: 0.00
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:             Total: 0.43
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]: Version:
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:            Config: 1763884492
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51866]:            Puppet: 7.10.0
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Nov 23 07:54:53 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain systemd[1]: libpod-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope: Deactivated successfully.
Nov 23 07:54:54 np0005532585.localdomain systemd[1]: libpod-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope: Consumed 3.129s CPU time.
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain podman[52750]: 2025-11-23 07:54:54.247098517 +0000 UTC m=+0.041662978 container died 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ceilometer, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, architecture=x86_64)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain systemd[1]: tmp-crun.ALByks.mount: Deactivated successfully.
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain podman[52750]: 2025-11-23 07:54:54.303846787 +0000 UTC m=+0.098411228 container cleanup 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-central-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:59Z, name=rhosp17/openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 07:54:54 np0005532585.localdomain systemd[1]: libpod-conmon-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope: Deactivated successfully.
Nov 23 07:54:54 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    (file & line not available)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    (file & line not available)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4af87402ffde9ee67f7ec4afd129c011169c73f7a6b2401b07ae2b65c68e82e0-merged.mount: Deactivated successfully.
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.24 seconds
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52641]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52641]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52641]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52641]:    (file & line not available)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52641]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52641]:    (file & line not available)
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}0ddcdfaeaf89f6f6daa2ee30146631a4c926f7b57df70d985d0c7a45c4b18db9'
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Notice: Applied catalog in 0.12 seconds
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Application:
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    Initial environment: production
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    Converged environment: production
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:          Run mode: user
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Changes:
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:             Total: 3
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Events:
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:           Success: 3
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:             Total: 3
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Resources:
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:           Skipped: 11
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:           Changed: 3
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:       Out of sync: 3
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:             Total: 25
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Time:
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:       Concat file: 0.00
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    Concat fragment: 0.00
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:              File: 0.01
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    Transaction evaluation: 0.11
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    Catalog application: 0.12
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:    Config retrieval: 0.29
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:          Last run: 1763884494
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:             Total: 0.12
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]: Version:
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:            Config: 1763884494
Nov 23 07:54:54 np0005532585.localdomain puppet-user[52612]:            Puppet: 7.10.0
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Nov 23 07:54:54 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.22 seconds
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52932]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52934]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}7457979272b158ac88adf13552cc58cb87586b19a7b8e2158301712e847fdf72'
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52947]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.107
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52953]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005532585.localdomain
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005532585.novalocal' to 'np0005532585.localdomain'
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52960]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52962]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: libpod-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope: Deactivated successfully.
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: libpod-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope: Consumed 2.348s CPU time.
Nov 23 07:54:55 np0005532585.localdomain podman[52438]: 2025-11-23 07:54:55.281699457 +0000 UTC m=+2.660601544 container died 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, vcs-type=git, architecture=x86_64, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, container_name=container-puppet-rsyslog)
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52976]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52983]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: tmp-crun.XMSLnW.mount: Deactivated successfully.
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52986]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52988]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain podman[52969]: 2025-11-23 07:54:55.41953637 +0000 UTC m=+0.129827306 container cleanup 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-rsyslog, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52990]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:25:6a:6e
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: libpod-conmon-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope: Deactivated successfully.
Nov 23 07:54:55 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[52998]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[53005]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain ovs-vsctl[53016]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Notice: Applied catalog in 0.48 seconds
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Application:
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:    Initial environment: production
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:    Converged environment: production
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:          Run mode: user
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Changes:
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:             Total: 14
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Events:
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:           Success: 14
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:             Total: 14
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Resources:
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:           Skipped: 12
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:           Changed: 14
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:       Out of sync: 14
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:             Total: 29
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Time:
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:              Exec: 0.02
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:    Config retrieval: 0.26
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:         Vs config: 0.40
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:    Transaction evaluation: 0.47
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:    Catalog application: 0.48
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:          Last run: 1763884495
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:             Total: 0.48
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]: Version:
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:            Config: 1763884494
Nov 23 07:54:55 np0005532585.localdomain puppet-user[52641]:            Puppet: 7.10.0
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01-merged.mount: Deactivated successfully.
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: libpod-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope: Deactivated successfully.
Nov 23 07:54:55 np0005532585.localdomain systemd[1]: libpod-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope: Consumed 2.855s CPU time.
Nov 23 07:54:55 np0005532585.localdomain podman[52452]: 2025-11-23 07:54:55.922264747 +0000 UTC m=+3.287431493 container died a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=container-puppet-ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1)
Nov 23 07:54:56 np0005532585.localdomain podman[52689]: 2025-11-23 07:54:53.14124415 +0000 UTC m=+0.033800895 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 23 07:54:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227-merged.mount: Deactivated successfully.
Nov 23 07:54:56 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Nov 23 07:54:56 np0005532585.localdomain podman[53059]: 2025-11-23 07:54:56.51457074 +0000 UTC m=+0.580111963 container cleanup a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ovn-controller-container, container_name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 07:54:56 np0005532585.localdomain systemd[1]: libpod-conmon-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope: Deactivated successfully.
Nov 23 07:54:56 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 23 07:54:56 np0005532585.localdomain podman[53117]: 2025-11-23 07:54:56.72970117 +0000 UTC m=+0.118863603 container create 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, version=17.1.12, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.buildah.version=1.41.4, container_name=container-puppet-neutron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-server, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-server-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 07:54:56 np0005532585.localdomain podman[53117]: 2025-11-23 07:54:56.670697548 +0000 UTC m=+0.059860061 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 23 07:54:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope.
Nov 23 07:54:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:54:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 23 07:54:56 np0005532585.localdomain podman[53117]: 2025-11-23 07:54:56.800957657 +0000 UTC m=+0.190120090 container init 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, build-date=2025-11-19T00:23:27Z, name=rhosp17/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron)
Nov 23 07:54:56 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Nov 23 07:54:56 np0005532585.localdomain podman[53117]: 2025-11-23 07:54:56.810224274 +0000 UTC m=+0.199386727 container start 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, config_id=tripleo_puppet_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:23:27Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 07:54:56 np0005532585.localdomain podman[53117]: 2025-11-23 07:54:56.810521393 +0000 UTC m=+0.199683836 container attach 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, build-date=2025-11-19T00:23:27Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, version=17.1.12, container_name=container-puppet-neutron, summary=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-server, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-neutron-server-container, io.buildah.version=1.41.4)
Nov 23 07:54:56 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Nov 23 07:54:56 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Nov 23 07:54:56 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3a12438802493a75725c4f7704f2af6db1ef72af396369e5de28f6f4d6a7ed98'
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Notice: Applied catalog in 4.36 seconds
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Application:
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Initial environment: production
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Converged environment: production
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:          Run mode: user
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Changes:
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:             Total: 183
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Events:
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:           Success: 183
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:             Total: 183
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Resources:
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:           Changed: 183
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:       Out of sync: 183
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:           Skipped: 57
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:             Total: 487
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Time:
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:       Concat file: 0.00
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Concat fragment: 0.00
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:            Anchor: 0.00
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:         File line: 0.00
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Virtlogd config: 0.00
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Virtstoraged config: 0.01
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:              Exec: 0.01
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Virtqemud config: 0.02
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Virtnodedevd config: 0.02
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Virtsecretd config: 0.02
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:              File: 0.03
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:           Package: 0.03
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Virtproxyd config: 0.04
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:            Augeas: 1.03
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Config retrieval: 1.59
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:          Last run: 1763884497
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:       Nova config: 2.94
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Transaction evaluation: 4.34
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:    Catalog application: 4.36
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:         Resources: 0.00
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:             Total: 4.36
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]: Version:
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:            Config: 1763884491
Nov 23 07:54:57 np0005532585.localdomain puppet-user[51787]:            Puppet: 7.10.0
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Nov 23 07:54:58 np0005532585.localdomain systemd[1]: libpod-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope: Deactivated successfully.
Nov 23 07:54:58 np0005532585.localdomain systemd[1]: libpod-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope: Consumed 8.520s CPU time.
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]:    (file: /etc/puppet/hiera.yaml)
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]: Warning: Undefined variable '::deploy_config_name';
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]:    (file & line not available)
Nov 23 07:54:58 np0005532585.localdomain podman[53272]: 2025-11-23 07:54:58.763601109 +0000 UTC m=+0.035563981 container died 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=container-puppet-nova_libvirt, architecture=x86_64, tcib_managed=true)
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]:    (file & line not available)
Nov 23 07:54:58 np0005532585.localdomain systemd[1]: tmp-crun.BV9njP.mount: Deactivated successfully.
Nov 23 07:54:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8-userdata-shm.mount: Deactivated successfully.
Nov 23 07:54:58 np0005532585.localdomain puppet-user[53160]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Nov 23 07:54:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398-merged.mount: Deactivated successfully.
Nov 23 07:54:58 np0005532585.localdomain podman[53272]: 2025-11-23 07:54:58.873838206 +0000 UTC m=+0.145801078 container cleanup 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vcs-type=git, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 07:54:58 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 07:54:58 np0005532585.localdomain systemd[1]: libpod-conmon-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope: Deactivated successfully.
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.63 seconds
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Notice: Applied catalog in 0.47 seconds
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Application:
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Initial environment: production
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Converged environment: production
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:          Run mode: user
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Changes:
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:             Total: 33
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Events:
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:           Success: 33
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:             Total: 33
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Resources:
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:           Skipped: 21
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:           Changed: 33
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:       Out of sync: 33
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:             Total: 155
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Time:
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:         Resources: 0.00
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Ovn metadata agent config: 0.02
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Neutron config: 0.37
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Transaction evaluation: 0.46
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Catalog application: 0.47
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:    Config retrieval: 0.70
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:          Last run: 1763884499
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:             Total: 0.47
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]: Version:
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:            Config: 1763884498
Nov 23 07:54:59 np0005532585.localdomain puppet-user[53160]:            Puppet: 7.10.0
Nov 23 07:55:00 np0005532585.localdomain systemd[1]: libpod-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope: Deactivated successfully.
Nov 23 07:55:00 np0005532585.localdomain systemd[1]: libpod-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope: Consumed 3.554s CPU time.
Nov 23 07:55:00 np0005532585.localdomain podman[53117]: 2025-11-23 07:55:00.431137706 +0000 UTC m=+3.820300219 container died 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=container-puppet-neutron, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 23 07:55:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7-userdata-shm.mount: Deactivated successfully.
Nov 23 07:55:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0-merged.mount: Deactivated successfully.
Nov 23 07:55:00 np0005532585.localdomain podman[53343]: 2025-11-23 07:55:00.538754558 +0000 UTC m=+0.101810587 container cleanup 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, maintainer=OpenStack TripleO Team, container_name=container-puppet-neutron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:23:27Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, io.openshift.expose-services=)
Nov 23 07:55:00 np0005532585.localdomain systemd[1]: libpod-conmon-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope: Deactivated successfully.
Nov 23 07:55:00 np0005532585.localdomain python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 23 07:55:00 np0005532585.localdomain sudo[51365]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:01 np0005532585.localdomain sudo[53394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blyayrkucjuevsyvrvzdnmuhcewzvfuf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:01 np0005532585.localdomain sudo[53394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:01 np0005532585.localdomain python3[53396]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:01 np0005532585.localdomain sudo[53394]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:01 np0005532585.localdomain sudo[53410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlyxnbdjzvrqyzncogafhpprhnaxbzmj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:01 np0005532585.localdomain sudo[53410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:01 np0005532585.localdomain sudo[53410]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:02 np0005532585.localdomain sudo[53426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzyhmetpyrfmzzlngwsioinvycagmxrh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:02 np0005532585.localdomain sudo[53426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:02 np0005532585.localdomain python3[53428]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:55:02 np0005532585.localdomain sudo[53426]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:02 np0005532585.localdomain sudo[53476]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhkeumcwkjrwvxgjywakinuuiebfbnia ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:02 np0005532585.localdomain sudo[53476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:03 np0005532585.localdomain python3[53478]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:55:03 np0005532585.localdomain sudo[53476]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:03 np0005532585.localdomain sudo[53519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhlhfgbrurkudupkabhmzqwweuplajid ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:03 np0005532585.localdomain sudo[53519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:03 np0005532585.localdomain python3[53521]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884502.7450478-84577-143656042359276/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:03 np0005532585.localdomain sudo[53519]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:03 np0005532585.localdomain sudo[53581]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oflxpyknifdsnivejlubcihdrkfmshux ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:03 np0005532585.localdomain sudo[53581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:03 np0005532585.localdomain python3[53583]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:55:03 np0005532585.localdomain sudo[53581]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:04 np0005532585.localdomain sudo[53624]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tryjlmqcrbkxdhppiaudzswjjkuunizk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:04 np0005532585.localdomain sudo[53624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:04 np0005532585.localdomain python3[53626]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884503.6684618-84577-135099077175030/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:04 np0005532585.localdomain sudo[53624]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:04 np0005532585.localdomain sudo[53686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkbgbmpjatilquobhuwljpymihpioynq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:04 np0005532585.localdomain sudo[53686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:04 np0005532585.localdomain python3[53688]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:55:04 np0005532585.localdomain sudo[53686]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:05 np0005532585.localdomain sudo[53729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnaetwvsrbqbmrngwtmnnaqehepivvpu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:05 np0005532585.localdomain sudo[53729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:05 np0005532585.localdomain python3[53731]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884504.5930176-84688-263958110146196/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:05 np0005532585.localdomain sudo[53729]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:05 np0005532585.localdomain sudo[53791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvzzwknlifteaicfhztotnuzmpopyitu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:05 np0005532585.localdomain sudo[53791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:05 np0005532585.localdomain python3[53793]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:55:05 np0005532585.localdomain sudo[53791]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:05 np0005532585.localdomain sudo[53834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tudpnszoqmxwfkorwywktdrdgamzzqpq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:05 np0005532585.localdomain sudo[53834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:06 np0005532585.localdomain python3[53836]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884505.4656067-84750-275205959474268/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:06 np0005532585.localdomain sudo[53834]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:06 np0005532585.localdomain sudo[53864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fflaubkjgnawrdhriwnbftcmwapgbtyb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:06 np0005532585.localdomain sudo[53864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:06 np0005532585.localdomain python3[53866]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:55:06 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:55:06 np0005532585.localdomain systemd-rc-local-generator[53891]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:55:06 np0005532585.localdomain systemd-sysv-generator[53894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:55:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:55:07 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:55:07 np0005532585.localdomain systemd-rc-local-generator[53925]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:55:07 np0005532585.localdomain systemd-sysv-generator[53929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:55:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:55:07 np0005532585.localdomain systemd[1]: Starting TripleO Container Shutdown...
Nov 23 07:55:07 np0005532585.localdomain systemd[1]: Finished TripleO Container Shutdown.
Nov 23 07:55:07 np0005532585.localdomain sudo[53864]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:07 np0005532585.localdomain sudo[53988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxntfumlzkweskruxdzubjukbmpjvxpw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:07 np0005532585.localdomain sudo[53988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:07 np0005532585.localdomain python3[53990]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:55:07 np0005532585.localdomain sudo[53988]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:07 np0005532585.localdomain sudo[54031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyvqetduleksnwqlzicofmrswonfotqc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:07 np0005532585.localdomain sudo[54031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:08 np0005532585.localdomain python3[54033]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884507.3838098-84793-279804636602052/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:08 np0005532585.localdomain sudo[54031]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:08 np0005532585.localdomain sudo[54093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbqpcplehutbfdictngvxpywbfjttvfl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:08 np0005532585.localdomain sudo[54093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:08 np0005532585.localdomain python3[54095]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 07:55:08 np0005532585.localdomain sudo[54093]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:08 np0005532585.localdomain sudo[54136]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rofyvgedocxufvwiikdtxvfngrkmglyc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:08 np0005532585.localdomain sudo[54136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:09 np0005532585.localdomain python3[54138]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884508.3237157-84818-36849067434011/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:09 np0005532585.localdomain sudo[54136]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:09 np0005532585.localdomain sudo[54166]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncbndnkzaqjvmhbjtpuwgqglcgmawsrd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:09 np0005532585.localdomain sudo[54166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:09 np0005532585.localdomain python3[54168]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:55:09 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:55:09 np0005532585.localdomain systemd-rc-local-generator[54193]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:55:09 np0005532585.localdomain systemd-sysv-generator[54198]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:55:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:55:09 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:55:09 np0005532585.localdomain systemd-rc-local-generator[54230]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:55:09 np0005532585.localdomain systemd-sysv-generator[54235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:55:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:55:10 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 07:55:10 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 07:55:10 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 07:55:10 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 07:55:10 np0005532585.localdomain sudo[54166]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:10 np0005532585.localdomain sudo[54258]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qugojhkemljbhzldlcdopyojilthnsmr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:10 np0005532585.localdomain sudo[54258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 64da22351939caf7431a331d2f0c888a
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 4767aaabc3de112d8791c290aa2b669d
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 67452ffc3d9e727585009ffc9989a224
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 7238f2997345c97f4c6ab424e622dc1b
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 1bd1f352f264f24512a1a2440e47a1f5
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 1bd1f352f264f24512a1a2440e47a1f5
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: a43bf0e2ecc9c9d02be7a27eac338b4c
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 39370c45b6a27bfda1ebe1fb9d328c43
Nov 23 07:55:10 np0005532585.localdomain sudo[54258]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:10 np0005532585.localdomain sudo[54274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwmrurkiitjsjsdwbcglqvudbykgdblf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:10 np0005532585.localdomain sudo[54274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:11 np0005532585.localdomain sudo[54274]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:12 np0005532585.localdomain sudo[54316]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgchkijtkosglpemwnxuvbtefnannnlw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:12 np0005532585.localdomain sudo[54316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:12 np0005532585.localdomain python3[54318]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 07:55:12 np0005532585.localdomain podman[54355]: 2025-11-23 07:55:12.555873584 +0000 UTC m=+0.064872692 container create 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 07:55:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d.scope.
Nov 23 07:55:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:55:12 np0005532585.localdomain podman[54355]: 2025-11-23 07:55:12.521148939 +0000 UTC m=+0.030148037 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 07:55:12 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/102a6fceace8ab36236a8b21c9b9829f6b100c83ca8f9b2c07c3282080c29222/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 23 07:55:12 np0005532585.localdomain podman[54355]: 2025-11-23 07:55:12.630225719 +0000 UTC m=+0.139224817 container init 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 23 07:55:12 np0005532585.localdomain podman[54355]: 2025-11-23 07:55:12.641925624 +0000 UTC m=+0.150924732 container start 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4)
Nov 23 07:55:12 np0005532585.localdomain podman[54355]: 2025-11-23 07:55:12.642178473 +0000 UTC m=+0.151177581 container attach 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, config_id=tripleo_step1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 07:55:12 np0005532585.localdomain systemd[1]: libpod-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d.scope: Deactivated successfully.
Nov 23 07:55:12 np0005532585.localdomain podman[54355]: 2025-11-23 07:55:12.649567089 +0000 UTC m=+0.158566187 container died 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 07:55:12 np0005532585.localdomain podman[54374]: 2025-11-23 07:55:12.734731632 +0000 UTC m=+0.075311757 container cleanup 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr_init_logs, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 23 07:55:12 np0005532585.localdomain systemd[1]: libpod-conmon-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d.scope: Deactivated successfully.
Nov 23 07:55:12 np0005532585.localdomain python3[54318]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Nov 23 07:55:13 np0005532585.localdomain podman[54451]: 2025-11-23 07:55:13.159655873 +0000 UTC m=+0.080072159 container create 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: Started libpod-conmon-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope.
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 07:55:13 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 23 07:55:13 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 23 07:55:13 np0005532585.localdomain podman[54451]: 2025-11-23 07:55:13.12372727 +0000 UTC m=+0.044143556 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:55:13 np0005532585.localdomain podman[54451]: 2025-11-23 07:55:13.243290826 +0000 UTC m=+0.163707162 container init 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, name=rhosp17/openstack-qdrouterd)
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:55:13 np0005532585.localdomain sudo[54471]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 07:55:13 np0005532585.localdomain sudo[54471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Nov 23 07:55:13 np0005532585.localdomain podman[54451]: 2025-11-23 07:55:13.276239443 +0000 UTC m=+0.196655719 container start 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 07:55:13 np0005532585.localdomain python3[54318]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=64da22351939caf7431a331d2f0c888a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 07:55:13 np0005532585.localdomain sudo[54471]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:13 np0005532585.localdomain podman[54472]: 2025-11-23 07:55:13.356195858 +0000 UTC m=+0.068943682 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 07:55:13 np0005532585.localdomain sudo[54316]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-102a6fceace8ab36236a8b21c9b9829f6b100c83ca8f9b2c07c3282080c29222-merged.mount: Deactivated successfully.
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d-userdata-shm.mount: Deactivated successfully.
Nov 23 07:55:13 np0005532585.localdomain podman[54472]: 2025-11-23 07:55:13.57225775 +0000 UTC m=+0.285005504 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z)
Nov 23 07:55:13 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:55:13 np0005532585.localdomain sudo[54546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydckozrtybbhjyuojpqbbyyvjjaqzkvv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:13 np0005532585.localdomain sudo[54546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:13 np0005532585.localdomain python3[54548]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:13 np0005532585.localdomain sudo[54546]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:13 np0005532585.localdomain sudo[54562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxgrmzcyzhcosvxcxfqcrisuqqhsunvr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:13 np0005532585.localdomain sudo[54562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:14 np0005532585.localdomain python3[54564]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 07:55:14 np0005532585.localdomain sudo[54562]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:14 np0005532585.localdomain sudo[54623]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfqzwhuucrgsatoiphuifvjyrxcmicgj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:14 np0005532585.localdomain sudo[54623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:14 np0005532585.localdomain python3[54625]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884514.179765-85027-156441799830369/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:14 np0005532585.localdomain sudo[54623]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:14 np0005532585.localdomain sudo[54639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egrexveqgoxbyhpftkjlolvuyomqsxhm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:14 np0005532585.localdomain sudo[54639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:15 np0005532585.localdomain python3[54641]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 07:55:15 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:55:15 np0005532585.localdomain systemd-sysv-generator[54668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:55:15 np0005532585.localdomain systemd-rc-local-generator[54664]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:55:15 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:55:15 np0005532585.localdomain sudo[54639]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:15 np0005532585.localdomain sudo[54690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqbbhozyrvgfkqgbvitnnkfjqgnzyriq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 07:55:15 np0005532585.localdomain sudo[54690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:15 np0005532585.localdomain python3[54692]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 07:55:15 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 07:55:16 np0005532585.localdomain systemd-rc-local-generator[54725]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 07:55:16 np0005532585.localdomain systemd-sysv-generator[54729]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 07:55:16 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 07:55:16 np0005532585.localdomain systemd[1]: Starting metrics_qdr container...
Nov 23 07:55:16 np0005532585.localdomain systemd[1]: Started metrics_qdr container.
Nov 23 07:55:16 np0005532585.localdomain sudo[54690]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:16 np0005532585.localdomain sudo[54771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dopxkznpglszfcyarnntupesauxbrymi ; /usr/bin/python3
Nov 23 07:55:16 np0005532585.localdomain sudo[54771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:16 np0005532585.localdomain python3[54773]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:16 np0005532585.localdomain sudo[54771]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:17 np0005532585.localdomain sudo[54819]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkzeiiraunqhhednllvxptbukkqbelrs ; /usr/bin/python3
Nov 23 07:55:17 np0005532585.localdomain sudo[54819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:17 np0005532585.localdomain sudo[54819]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:17 np0005532585.localdomain sudo[54862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rimffhochhmsnwcyyflfsdciygwnejlg ; /usr/bin/python3
Nov 23 07:55:17 np0005532585.localdomain sudo[54862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:17 np0005532585.localdomain sudo[54862]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:17 np0005532585.localdomain sudo[54892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjuoxupkbhbfqnmoxgnwgyfgkohisuzr ; /usr/bin/python3
Nov 23 07:55:17 np0005532585.localdomain sudo[54892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:18 np0005532585.localdomain python3[54894]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005532585 step=1 update_config_hash_only=False
Nov 23 07:55:18 np0005532585.localdomain sudo[54892]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:18 np0005532585.localdomain sudo[54908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbntvilaxigipymdgwsewhkojtuolgsz ; /usr/bin/python3
Nov 23 07:55:18 np0005532585.localdomain sudo[54908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:19 np0005532585.localdomain python3[54910]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 07:55:19 np0005532585.localdomain sudo[54908]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:19 np0005532585.localdomain sudo[54924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yodeqbbxwttzoneendfgmrqvzylvzlja ; /usr/bin/python3
Nov 23 07:55:19 np0005532585.localdomain sudo[54924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 07:55:19 np0005532585.localdomain python3[54926]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 07:55:19 np0005532585.localdomain sudo[54924]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:55:44 np0005532585.localdomain systemd[1]: tmp-crun.OUOSvj.mount: Deactivated successfully.
Nov 23 07:55:44 np0005532585.localdomain podman[54927]: 2025-11-23 07:55:44.036336094 +0000 UTC m=+0.091849698 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:49:46Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, version=17.1.12, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 07:55:44 np0005532585.localdomain podman[54927]: 2025-11-23 07:55:44.195644645 +0000 UTC m=+0.251158329 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 07:55:44 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:55:52 np0005532585.localdomain sudo[54955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:55:52 np0005532585.localdomain sudo[54955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:55:52 np0005532585.localdomain sudo[54955]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:52 np0005532585.localdomain sudo[54970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:55:52 np0005532585.localdomain sudo[54970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:55:52 np0005532585.localdomain sudo[54970]: pam_unix(sudo:session): session closed for user root
Nov 23 07:55:53 np0005532585.localdomain sudo[55017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:55:53 np0005532585.localdomain sudo[55017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:55:53 np0005532585.localdomain sudo[55017]: pam_unix(sudo:session): session closed for user root
Nov 23 07:56:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:56:15 np0005532585.localdomain podman[55032]: 2025-11-23 07:56:15.028368736 +0000 UTC m=+0.083855551 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 07:56:15 np0005532585.localdomain podman[55032]: 2025-11-23 07:56:15.221230644 +0000 UTC m=+0.276717419 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 07:56:15 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:56:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:56:46 np0005532585.localdomain podman[55060]: 2025-11-23 07:56:46.021331657 +0000 UTC m=+0.080689729 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible)
Nov 23 07:56:46 np0005532585.localdomain podman[55060]: 2025-11-23 07:56:46.224205124 +0000 UTC m=+0.283563206 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 07:56:46 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:56:53 np0005532585.localdomain sudo[55088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:56:53 np0005532585.localdomain sudo[55088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:56:53 np0005532585.localdomain sudo[55088]: pam_unix(sudo:session): session closed for user root
Nov 23 07:56:53 np0005532585.localdomain sudo[55103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:56:53 np0005532585.localdomain sudo[55103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:56:54 np0005532585.localdomain sudo[55103]: pam_unix(sudo:session): session closed for user root
Nov 23 07:56:54 np0005532585.localdomain sudo[55151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:56:54 np0005532585.localdomain sudo[55151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:56:54 np0005532585.localdomain sudo[55151]: pam_unix(sudo:session): session closed for user root
Nov 23 07:57:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:57:17 np0005532585.localdomain podman[55166]: 2025-11-23 07:57:17.036821229 +0000 UTC m=+0.079452579 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible)
Nov 23 07:57:17 np0005532585.localdomain podman[55166]: 2025-11-23 07:57:17.208407241 +0000 UTC m=+0.251038601 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 07:57:17 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:57:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:57:48 np0005532585.localdomain podman[55196]: 2025-11-23 07:57:48.018527678 +0000 UTC m=+0.072114660 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 07:57:48 np0005532585.localdomain podman[55196]: 2025-11-23 07:57:48.210309721 +0000 UTC m=+0.263896733 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible)
Nov 23 07:57:48 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:57:54 np0005532585.localdomain sudo[55225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:57:54 np0005532585.localdomain sudo[55225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:57:54 np0005532585.localdomain sudo[55225]: pam_unix(sudo:session): session closed for user root
Nov 23 07:57:54 np0005532585.localdomain sudo[55240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:57:54 np0005532585.localdomain sudo[55240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:57:55 np0005532585.localdomain sudo[55240]: pam_unix(sudo:session): session closed for user root
Nov 23 07:57:56 np0005532585.localdomain sudo[55287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:57:56 np0005532585.localdomain sudo[55287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:57:56 np0005532585.localdomain sudo[55287]: pam_unix(sudo:session): session closed for user root
Nov 23 07:58:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:58:19 np0005532585.localdomain podman[55302]: 2025-11-23 07:58:19.022254565 +0000 UTC m=+0.079031677 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, release=1761123044, name=rhosp17/openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 23 07:58:19 np0005532585.localdomain podman[55302]: 2025-11-23 07:58:19.214091078 +0000 UTC m=+0.270868130 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr)
Nov 23 07:58:19 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:58:25 np0005532585.localdomain sshd[55331]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 07:58:27 np0005532585.localdomain sshd[55331]: Invalid user system from 80.94.95.115 port 45530
Nov 23 07:58:27 np0005532585.localdomain sshd[55331]: Connection closed by invalid user system 80.94.95.115 port 45530 [preauth]
Nov 23 07:58:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:58:50 np0005532585.localdomain systemd[1]: tmp-crun.IXHW85.mount: Deactivated successfully.
Nov 23 07:58:50 np0005532585.localdomain podman[55333]: 2025-11-23 07:58:50.016920475 +0000 UTC m=+0.077552253 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git)
Nov 23 07:58:50 np0005532585.localdomain podman[55333]: 2025-11-23 07:58:50.231359482 +0000 UTC m=+0.291991220 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 07:58:50 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:58:56 np0005532585.localdomain sudo[55362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:58:56 np0005532585.localdomain sudo[55362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:58:56 np0005532585.localdomain sudo[55362]: pam_unix(sudo:session): session closed for user root
Nov 23 07:58:56 np0005532585.localdomain sudo[55377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:58:56 np0005532585.localdomain sudo[55377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:58:57 np0005532585.localdomain sudo[55377]: pam_unix(sudo:session): session closed for user root
Nov 23 07:58:57 np0005532585.localdomain sudo[55423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:58:57 np0005532585.localdomain sudo[55423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:58:57 np0005532585.localdomain sudo[55423]: pam_unix(sudo:session): session closed for user root
Nov 23 07:59:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:59:21 np0005532585.localdomain systemd[1]: tmp-crun.IqKNu7.mount: Deactivated successfully.
Nov 23 07:59:21 np0005532585.localdomain podman[55439]: 2025-11-23 07:59:21.021402781 +0000 UTC m=+0.081961362 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 23 07:59:21 np0005532585.localdomain podman[55439]: 2025-11-23 07:59:21.240118014 +0000 UTC m=+0.300676575 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git)
Nov 23 07:59:21 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:59:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 07:59:52 np0005532585.localdomain systemd[1]: tmp-crun.fhscWl.mount: Deactivated successfully.
Nov 23 07:59:52 np0005532585.localdomain podman[55469]: 2025-11-23 07:59:52.012752839 +0000 UTC m=+0.068946119 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Nov 23 07:59:52 np0005532585.localdomain podman[55469]: 2025-11-23 07:59:52.228979209 +0000 UTC m=+0.285172439 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 23 07:59:52 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 07:59:57 np0005532585.localdomain sudo[55498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 07:59:57 np0005532585.localdomain sudo[55498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:59:57 np0005532585.localdomain sudo[55498]: pam_unix(sudo:session): session closed for user root
Nov 23 07:59:57 np0005532585.localdomain sudo[55513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 07:59:57 np0005532585.localdomain sudo[55513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:59:58 np0005532585.localdomain sudo[55513]: pam_unix(sudo:session): session closed for user root
Nov 23 07:59:59 np0005532585.localdomain sudo[55559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 07:59:59 np0005532585.localdomain sudo[55559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 07:59:59 np0005532585.localdomain sudo[55559]: pam_unix(sudo:session): session closed for user root
Nov 23 08:00:01 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 19 pg[2.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2,1,3] r=2 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:04 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 21 pg[3.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [1,2,0] r=2 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:04 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 23 pg[4.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [3,5,1] r=0 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:05 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 24 pg[4.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [3,5,1] r=0 lpr=23 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:07 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 25 pg[5.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [4,3,2] r=1 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:00:23 np0005532585.localdomain podman[55576]: 2025-11-23 08:00:23.017845811 +0000 UTC m=+0.079096078 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:00:23 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 31 pg[6.0( empty local-lis/les=0/0 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [0,5,1] r=0 lpr=31 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:23 np0005532585.localdomain podman[55576]: 2025-11-23 08:00:23.239622815 +0000 UTC m=+0.300873032 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, distribution-scope=public, container_name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:00:23 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:00:23 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 32 pg[6.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [0,5,1] r=0 lpr=31 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:25 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 33 pg[7.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [5,1,3] r=2 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:27 np0005532585.localdomain sudo[55606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:00:27 np0005532585.localdomain sudo[55606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:00:27 np0005532585.localdomain sudo[55606]: pam_unix(sudo:session): session closed for user root
Nov 23 08:00:29 np0005532585.localdomain sudo[55621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:00:29 np0005532585.localdomain sudo[55621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:00:29 np0005532585.localdomain sudo[55621]: pam_unix(sudo:session): session closed for user root
Nov 23 08:00:29 np0005532585.localdomain sudo[55636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:00:29 np0005532585.localdomain sudo[55636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:00:29 np0005532585.localdomain sudo[55636]: pam_unix(sudo:session): session closed for user root
Nov 23 08:00:33 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 38 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=38 pruub=8.044471741s) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 active pruub 1116.652099609s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,3], acting [2,1,3] -> [2,1,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:33 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 38 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=38 pruub=8.041690826s) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1116.652099609s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:33 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 38 pg[3.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=38 pruub=10.719461441s) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 active pruub 1123.247802734s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,0], acting [1,2,0] -> [1,2,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:33 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 38 pg[3.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=38 pruub=10.716773033s) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.247802734s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.16( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.15( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.17( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.18( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.14( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.13( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.12( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.11( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.f( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.10( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.d( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.e( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.c( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.b( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.3( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.a( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.7( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.2( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.4( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.19( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1f( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1e( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1d( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1c( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1b( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1a( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.8( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.5( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.6( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.9( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1a( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.18( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.19( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.15( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.16( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.14( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.13( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.12( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.11( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.f( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.10( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.e( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.c( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.d( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.5( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.3( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.2( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.4( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.6( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.7( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.9( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.a( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.b( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1d( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1e( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1f( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1c( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:34 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.17( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:35 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 40 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=40 pruub=12.422865868s) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 active pruub 1123.047607422s@ mbc={}] start_peering_interval up [4,3,2] -> [4,3,2], acting [4,3,2] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:35 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 40 pg[4.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=40 pruub=10.230877876s) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active pruub 1120.855712891s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,1], acting [3,5,1] -> [3,5,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:35 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 40 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=40 pruub=12.419507980s) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.047607422s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:35 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 40 pg[4.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=40 pruub=10.230877876s) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1120.855712891s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1f( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.11( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.11( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.13( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.14( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.12( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.17( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.16( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.15( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.8( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.10( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.16( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.a( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.b( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.c( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.d( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.e( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.4( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.7( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.2( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.9( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.6( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.5( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.f( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1e( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1d( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.3( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1c( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1b( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1a( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.19( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.18( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.19( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.0( empty local-lis/les=40/41 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Nov 23 08:00:36 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Nov 23 08:00:37 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 42 pg[7.0( v 35'39 (0'0,35'39] local-lis/les=33/34 n=22 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=42 pruub=12.373711586s) [5,1,3] r=2 lpr=42 pi=[33,42)/1 luod=0'0 lua=35'37 crt=35'39 lcod 35'38 mlcod 0'0 active pruub 1125.082519531s@ mbc={}] start_peering_interval up [5,1,3] -> [5,1,3], acting [5,1,3] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:37 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 42 pg[7.0( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=42 pruub=12.371913910s) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 lcod 35'38 mlcod 0'0 unknown NOTIFY pruub 1125.082519531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:37 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 42 pg[6.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=42 pruub=10.341920853s) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active pruub 1126.960327148s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,1], acting [0,5,1] -> [0,5,1], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:37 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 42 pg[6.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=42 pruub=10.341920853s) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown pruub 1126.960327148s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1f( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1e( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1d( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.12( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1c( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.13( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.10( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.11( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.16( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.15( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.14( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.a( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.b( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.17( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.8( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.9( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.5( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.4( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.6( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.7( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.2( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.3( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.d( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.c( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.f( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.e( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.19( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.18( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1b( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1a( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.b( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.a( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.8( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.9( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.e( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.f( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.c( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.6( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.5( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.3( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.2( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.4( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.7( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.d( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=42/43 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:38 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:40 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Nov 23 08:00:40 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Nov 23 08:00:42 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.e scrub starts
Nov 23 08:00:42 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.e scrub ok
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835800171s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.540893555s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,3], acting [1,2,0] -> [4,5,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937299728s) [1,3,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642456055s@ mbc={}] start_peering_interval up [0,5,1] -> [1,3,5], acting [0,5,1] -> [1,3,5], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835736275s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541015625s@ mbc={}] start_peering_interval up [1,2,0] -> [4,3,2], acting [1,2,0] -> [4,3,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.938068390s) [4,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [4,5,3], acting [0,5,1] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835564613s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.540893555s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835627556s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541015625s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937961578s) [4,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643432617s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834791183s) [0,2,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.540405273s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,1], acting [1,2,0] -> [0,2,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834791183s) [0,2,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.540405273s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942682266s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648803711s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.936487198s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642700195s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942566872s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648803711s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.936418533s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642700195s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834320068s) [3,1,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.540649414s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,5], acting [1,2,0] -> [3,1,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.936368942s) [1,3,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642456055s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834265709s) [3,1,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.540649414s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.845931053s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552246094s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.845931053s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.552246094s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934622765s) [4,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [0,5,1] -> [4,2,0], acting [0,5,1] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934531212s) [4,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934263229s) [3,4,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,2], acting [0,5,1] -> [3,4,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834345818s) [2,4,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541259766s@ mbc={}] start_peering_interval up [1,2,0] -> [2,4,0], acting [1,2,0] -> [2,4,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834661484s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541503906s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,4], acting [1,2,0] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934223175s) [3,4,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834317207s) [2,4,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541259766s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834564209s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541503906s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942143440s) [3,1,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.649047852s@ mbc={}] start_peering_interval up [0,5,1] -> [3,1,2], acting [0,5,1] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834403992s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541381836s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,1], acting [1,2,0] -> [2,3,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934181213s) [0,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641235352s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,4], acting [0,5,1] -> [0,5,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834381104s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541381836s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934181213s) [0,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641235352s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942028999s) [3,1,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.649047852s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833991051s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541259766s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934215546s) [1,0,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641479492s@ mbc={}] start_peering_interval up [0,5,1] -> [1,0,2], acting [0,5,1] -> [1,0,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933979988s) [0,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,2], acting [0,5,1] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934185028s) [1,0,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641479492s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833991051s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.541259766s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933952332s) [3,5,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641357422s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,4], acting [0,5,1] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834877014s) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542358398s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,0], acting [1,2,0] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933979988s) [0,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641113281s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933888435s) [3,5,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641357422s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834819794s) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542358398s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834176064s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541625977s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934815407s) [2,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642333984s@ mbc={}] start_peering_interval up [0,5,1] -> [2,4,0], acting [0,5,1] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834150314s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541625977s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834087372s) [2,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541748047s@ mbc={}] start_peering_interval up [1,2,0] -> [2,1,0], acting [1,2,0] -> [2,1,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934787750s) [2,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642333984s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834007263s) [2,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541748047s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833943367s) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541748047s@ mbc={}] start_peering_interval up [1,2,0] -> [1,5,0], acting [1,2,0] -> [1,5,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933747292s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641601562s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833916664s) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541748047s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934185028s) [5,0,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642089844s@ mbc={}] start_peering_interval up [0,5,1] -> [5,0,4], acting [0,5,1] -> [5,0,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933710098s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641601562s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833460808s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541748047s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834133148s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542114258s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934127808s) [5,0,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642089844s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833368301s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541748047s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933013916s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641601562s@ mbc={}] start_peering_interval up [0,5,1] -> [2,1,3], acting [0,5,1] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932913780s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641601562s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933200836s) [5,1,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642089844s@ mbc={}] start_peering_interval up [0,5,1] -> [5,1,0], acting [0,5,1] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932395935s) [0,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641479492s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,5], acting [0,5,1] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933135986s) [5,1,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642089844s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932395935s) [0,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641479492s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833307266s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542602539s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,4], acting [1,2,0] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841951370s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832960129s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542358398s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933698654s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643066406s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833233833s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833282471s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542114258s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841876030s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933634758s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643066406s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832960129s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.542358398s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933800697s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643310547s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933767319s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643310547s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841523170s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832893372s) [2,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542602539s@ mbc={}] start_peering_interval up [1,2,0] -> [2,0,4], acting [1,2,0] -> [2,0,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933731079s) [1,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [1,5,3], acting [0,5,1] -> [1,5,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933674812s) [3,5,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,1], acting [0,5,1] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933703423s) [1,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643432617s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832807541s) [2,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933616638s) [3,5,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643432617s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832571030s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542602539s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932530403s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642456055s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832571030s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.542602539s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841611862s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551635742s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932500839s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642456055s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.931669235s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641845703s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.931645393s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641845703s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841684341s) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552001953s@ mbc={}] start_peering_interval up [1,2,0] -> [4,0,5], acting [1,2,0] -> [4,0,5], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840876579s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841662407s) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.552001953s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841556549s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551635742s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932093620s) [3,2,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642700195s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,4], acting [0,5,1] -> [3,2,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840692520s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932069778s) [3,2,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642700195s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.931115150s) [2,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641723633s@ mbc={}] start_peering_interval up [0,5,1] -> [2,3,1], acting [0,5,1] -> [2,3,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840462685s) [4,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [4,2,3], acting [1,2,0] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840587616s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930770874s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641479492s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841069221s) [3,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551879883s@ mbc={}] start_peering_interval up [1,2,0] -> [3,5,1], acting [1,2,0] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930669785s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641479492s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840985298s) [3,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551879883s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930966377s) [2,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641723633s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840362549s) [4,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930923462s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641845703s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841176033s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552246094s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930890083s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641845703s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841149330s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.552246094s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937814713s) [5,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648925781s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,1], acting [0,5,1] -> [5,3,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840806961s) [1,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552001953s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,3], acting [1,2,0] -> [1,2,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937576294s) [1,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648803711s@ mbc={}] start_peering_interval up [0,5,1] -> [1,2,0], acting [0,5,1] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840649605s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551879883s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937548637s) [1,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648803711s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932242393s) [0,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [0,2,4], acting [0,5,1] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840758324s) [1,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.552001953s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840600014s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551879883s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937674522s) [5,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648925781s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932242393s) [0,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.643432617s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840812683s) [0,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552246094s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,4], acting [1,2,0] -> [0,5,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840812683s) [0,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.552246094s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937287331s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648803711s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1f( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937143326s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648803711s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.1e( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.a( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1d( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.5( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.18( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837604523s) [2,4,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.648315430s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,3], acting [4,3,2] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.851902008s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662719727s@ mbc={}] start_peering_interval up [3,5,1] -> [0,5,1], acting [3,5,1] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.851874352s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662719727s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828904152s) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639770508s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,3], acting [2,1,3] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828863144s) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639770508s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849290848s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849265099s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.850394249s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661621094s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,4], acting [3,5,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.850394249s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.661621094s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845753670s) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.657104492s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,2], acting [3,5,1] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828304291s) [5,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639770508s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,0], acting [2,1,3] -> [5,1,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828281403s) [5,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639770508s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845753670s) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.657104492s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847958565s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659545898s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847936630s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659545898s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828187943s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639892578s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828160286s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639892578s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849582672s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661254883s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849558830s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661254883s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847456932s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659301758s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.827984810s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639892578s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,0], acting [2,1,3] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.827827454s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639892578s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847299576s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659301758s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837570190s) [2,4,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.648315430s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844738007s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656982422s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844702721s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656982422s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847362518s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659301758s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.1e( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846982956s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659423828s@ mbc={}] start_peering_interval up [4,3,2] -> [3,4,5], acting [4,3,2] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846982956s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.659423828s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826711655s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639404297s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826678276s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639404297s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847678185s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826443672s) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639282227s@ mbc={}] start_peering_interval up [2,1,3] -> [4,3,2], acting [2,1,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847647667s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.12( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826419830s) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639282227s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846818924s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659301758s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848485947s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661376953s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,5], acting [3,5,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848455429s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661376953s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847059250s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660034180s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847031593s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660034180s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843997955s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.657104492s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826216698s) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639404297s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,4], acting [2,1,3] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843966484s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.657104492s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826193810s) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639404297s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.13( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845967293s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659423828s@ mbc={}] start_peering_interval up [4,3,2] -> [3,1,5], acting [4,3,2] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845967293s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.659423828s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825808525s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639526367s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848221779s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661987305s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824745178s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.638427734s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,5], acting [2,1,3] -> [4,0,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843224525s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656982422s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,4], acting [3,5,1] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848189354s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661987305s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824672699s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.638427734s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843224525s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.656982422s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825201988s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639282227s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825174332s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639282227s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925047874s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.739135742s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925019264s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.739135742s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846148491s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660522461s@ mbc={}] start_peering_interval up [4,3,2] -> [5,4,0], acting [4,3,2] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824141502s) [3,4,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.638427734s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,2], acting [2,1,3] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824141502s) [3,4,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.638427734s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846114159s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660522461s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.16( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845313072s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660034180s@ mbc={}] start_peering_interval up [4,3,2] -> [1,0,5], acting [4,3,2] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845283508s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660034180s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848176956s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662841797s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825737000s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639526367s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848143578s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662841797s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844613075s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659545898s@ mbc={}] start_peering_interval up [4,3,2] -> [0,1,2], acting [4,3,2] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844579697s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659545898s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846802711s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661865234s@ mbc={}] start_peering_interval up [3,5,1] -> [1,0,2], acting [3,5,1] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846767426s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661865234s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848936081s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.664062500s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,4], acting [3,5,1] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848896980s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.664062500s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925550461s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740722656s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925510406s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740722656s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.11( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816704750s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.632202148s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.19( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816674232s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.632202148s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823850632s) [1,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639648438s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,5], acting [2,1,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844244003s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660156250s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823821068s) [1,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639648438s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847523689s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.663452148s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,1], acting [3,5,1] -> [2,0,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844212532s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660156250s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847485542s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.663452148s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823812485s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639770508s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,0], acting [2,1,3] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843764305s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659790039s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840842247s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656860352s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823677063s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639770508s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922594070s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738769531s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840815544s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656860352s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922490120s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.738769531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843998909s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [4,5,0], acting [4,3,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823447227s) [1,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639892578s@ mbc={}] start_peering_interval up [2,1,3] -> [1,3,2], acting [2,1,3] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843973160s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823421478s) [1,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639892578s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.14( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824110985s) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [2,1,3] -> [3,5,4], acting [2,1,3] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840005875s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656982422s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,5], acting [3,5,1] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843177795s) [4,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660156250s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,2], acting [4,3,2] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824110985s) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641113281s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840005875s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.656982422s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843148232s) [4,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660156250s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839732170s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656860352s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,1], acting [3,5,1] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839672089s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656860352s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.b( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.b( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842562675s) [1,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [1,3,5], acting [4,3,2] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.822620392s) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640502930s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,1], acting [2,1,3] -> [5,3,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842535019s) [1,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.822591782s) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640502930s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.3( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922136307s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740722656s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843249321s) [5,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661865234s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,0], acting [3,5,1] -> [5,1,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922107697s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740722656s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.2( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843209267s) [5,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661865234s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841743469s) [5,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660522461s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,4], acting [4,3,2] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841720581s) [5,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660522461s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843566895s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662475586s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922024727s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740844727s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843536377s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662475586s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841497421s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660522461s@ mbc={}] start_peering_interval up [4,3,2] -> [2,3,1], acting [4,3,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.4( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922002792s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740844727s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841475487s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660522461s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.821928024s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641357422s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.821928024s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641357422s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820111275s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842794418s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662597656s@ mbc={}] start_peering_interval up [3,5,1] -> [4,2,0], acting [3,5,1] -> [4,2,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820067406s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640014648s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.6( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842749596s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662597656s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843764305s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.659790039s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840354919s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [5,0,1], acting [4,3,2] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820435524s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640869141s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840326309s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660766602s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842167854s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662597656s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820395470s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640869141s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842133522s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662597656s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.4( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.819142342s) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,5], acting [2,1,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839834213s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [3,5,4], acting [4,3,2] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842031479s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662963867s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.819142342s) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640014648s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839834213s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.660766602s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841982841s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662963867s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.919782639s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740966797s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839818001s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661132812s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,1], acting [4,3,2] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.919662476s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740966797s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839791298s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661132812s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818497658s) [5,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [5,0,4], acting [2,1,3] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818475723s) [5,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640014648s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840746880s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662353516s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840718269s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662353516s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.1f( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839011192s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660888672s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838981628s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660888672s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818644524s) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640502930s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,2], acting [2,1,3] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.918968201s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740844727s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.916552544s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738525391s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818644524s) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640502930s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.918909073s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740844727s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.7( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838333130s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660644531s@ mbc={}] start_peering_interval up [4,3,2] -> [5,1,3], acting [4,3,2] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838297844s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660644531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.834403992s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656494141s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818717957s) [2,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641235352s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,0], acting [2,1,3] -> [2,1,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.916527748s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.738525391s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.834202766s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656860352s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,0], acting [3,5,1] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.834152222s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656860352s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840023041s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662719727s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839991570s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662719727s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817165375s) [3,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817165375s) [3,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640014648s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837698936s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837549210s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660644531s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,4], acting [4,3,2] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839958191s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662963867s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839901924s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662963867s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818294525s) [2,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641235352s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837414742s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660644531s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817717552s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,3], acting [2,1,3] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817695618s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837395668s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817572594s) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [2,1,3] -> [1,2,3], acting [2,1,3] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839689255s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.663085938s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837361336s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660766602s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817535400s) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839632034s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.663085938s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837226868s) [2,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660888672s@ mbc={}] start_peering_interval up [4,3,2] -> [2,0,4], acting [4,3,2] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838188171s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661987305s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,4], acting [3,5,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816642761s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640380859s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838162422s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661987305s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837698936s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.660766602s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816599846s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640380859s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836881638s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836853981s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660766602s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838805199s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662719727s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836980820s) [2,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660888672s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838712692s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662719727s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836898804s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661010742s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836870193s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661010742s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816055298s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640258789s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837521553s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661743164s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816055298s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640258789s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837487221s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661743164s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836463928s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660888672s@ mbc={}] start_peering_interval up [4,3,2] -> [2,1,3], acting [4,3,2] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836436272s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660888672s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837371826s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661865234s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837341309s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661865234s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.815620422s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640136719s@ mbc={}] start_peering_interval up [2,1,3] -> [0,2,4], acting [2,1,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816117287s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640869141s@ mbc={}] start_peering_interval up [2,1,3] -> [4,5,0], acting [2,1,3] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.4( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816083908s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640869141s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.7( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.815547943s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640136719s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.c( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.b( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.831289291s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656494141s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.f( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:43 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1e( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.16( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1b( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1e( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,5,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1c( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [1,3,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.1b( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.1c( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.15( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.10( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.a( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.8( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.c( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.9( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.e( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.c( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.9( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.2( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.5( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.7( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.2( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.2( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.5( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.16( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1c( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.b( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.5( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.19( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.d( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [1,5,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.10( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.14( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.13( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.e( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.1c( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1d( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.1d( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.d( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.17( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.b( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.13( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.1e( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.12( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.9( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.1( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.3( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.8( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.f( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.19( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.a( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.8( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.16( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.1f( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.14( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.11( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.10( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.1e( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.19( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.10( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.12( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.16( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.a( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.7( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.b( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.1( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.17( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.b( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.6( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.6( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.18( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.1e( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.4( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.5( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.6( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[2.1f( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.18( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.d( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.b( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.7( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.11( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.4( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.4( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.4( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.9( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1a( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.f( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.14( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.13( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.c( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.2( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.1d( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.1e( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.1f( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.1d( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.19( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.6( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.f( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.1( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.c( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.e( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.10( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.17( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.9( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.14( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:44 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.11( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886891365s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738647461s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886891365s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738647461s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886839867s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738647461s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886839867s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738647461s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886330605s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738891602s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886330605s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738891602s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.885553360s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738525391s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:45 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.885553360s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738525391s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:46 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:46 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:46 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:46 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:48 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Nov 23 08:00:50 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Nov 23 08:00:52 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts
Nov 23 08:00:52 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Nov 23 08:00:53 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.e deep-scrub starts
Nov 23 08:00:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:00:54 np0005532585.localdomain podman[55651]: 2025-11-23 08:00:54.002010845 +0000 UTC m=+0.065936750 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd)
Nov 23 08:00:54 np0005532585.localdomain podman[55651]: 2025-11-23 08:00:54.226085776 +0000 UTC m=+0.290011701 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 23 08:00:54 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:00:54 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.6 deep-scrub ok
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.981581688s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.866577148s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.981581688s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.866577148s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.978737831s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.864257812s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.978737831s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.864257812s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980515480s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.866577148s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980515480s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.866577148s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980977058s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.867309570s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:00:55 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980977058s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.867309570s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:00:56 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:56 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=48/49 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:56 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:56 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:00:57 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Nov 23 08:00:57 np0005532585.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=193.163.125.189 DST=38.102.83.198 LEN=44 TOS=0x00 PREC=0x00 TTL=244 ID=10013 PROTO=TCP SPT=34503 DPT=9090 SEQ=2446406591 ACK=0 WINDOW=14600 RES=0x00 SYN URGP=0 OPT (020405B4) 
Nov 23 08:00:58 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Nov 23 08:00:58 np0005532585.localdomain sudo[55694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipewqqfdyqquvpywmsdrkspdstqbcjkz ; /usr/bin/python3
Nov 23 08:00:58 np0005532585.localdomain sudo[55694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:00:58 np0005532585.localdomain python3[55696]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:00:58 np0005532585.localdomain sudo[55694]: pam_unix(sudo:session): session closed for user root
Nov 23 08:00:59 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Nov 23 08:00:59 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Nov 23 08:00:59 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Nov 23 08:00:59 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Nov 23 08:01:00 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Nov 23 08:01:00 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Nov 23 08:01:00 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Nov 23 08:01:00 np0005532585.localdomain sudo[55710]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hczpibqforpfsonrohaqfuguvbjemdht ; /usr/bin/python3
Nov 23 08:01:00 np0005532585.localdomain sudo[55710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:00 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Nov 23 08:01:00 np0005532585.localdomain python3[55712]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:00 np0005532585.localdomain sudo[55710]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:01 np0005532585.localdomain CROND[55714]: (root) CMD (run-parts /etc/cron.hourly)
Nov 23 08:01:01 np0005532585.localdomain run-parts[55717]: (/etc/cron.hourly) starting 0anacron
Nov 23 08:01:01 np0005532585.localdomain run-parts[55723]: (/etc/cron.hourly) finished 0anacron
Nov 23 08:01:01 np0005532585.localdomain CROND[55713]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 23 08:01:02 np0005532585.localdomain sudo[55737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scfubqihhkeoaapvperwefepuoyvlhyf ; /usr/bin/python3
Nov 23 08:01:02 np0005532585.localdomain sudo[55737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Nov 23 08:01:02 np0005532585.localdomain python3[55739]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:02 np0005532585.localdomain sudo[55737]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.4( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.724631310s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1153.739013672s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.4( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.724288940s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1153.739013672s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.723724365s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1153.738647461s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.723650932s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1153.738647461s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 50 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:02 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 50 pg[7.4( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:03 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Nov 23 08:01:03 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Nov 23 08:01:03 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 51 pg[7.c( v 35'39 lc 35'16 (0'0,35'39] local-lis/les=50/51 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:03 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 51 pg[7.4( v 35'39 lc 35'15 (0'0,35'39] local-lis/les=50/51 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(1+2)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:04 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803336143s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1151.865112305s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:04 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803246498s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1151.865112305s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:04 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803723335s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1151.866088867s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:04 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803501129s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1151.866088867s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:05 np0005532585.localdomain sudo[55785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrqnigmegjkbitrkdskrcpfvfzzmisoe ; /usr/bin/python3
Nov 23 08:01:05 np0005532585.localdomain sudo[55785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:05 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.10 deep-scrub starts
Nov 23 08:01:05 np0005532585.localdomain python3[55787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:05 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.10 deep-scrub ok
Nov 23 08:01:05 np0005532585.localdomain sudo[55785]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:05 np0005532585.localdomain sudo[55828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnwetkqjngfmeytwrlfqfgrloppomwmq ; /usr/bin/python3
Nov 23 08:01:05 np0005532585.localdomain sudo[55828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:05 np0005532585.localdomain python3[55830]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884864.877191-92282-189973165291993/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=5f137984986c8cf5df5aec7749430e0dc129d0db backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:05 np0005532585.localdomain sudo[55828]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:05 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 52 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52) [4,0,2] r=1 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:05 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 52 pg[7.5( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52) [4,0,2] r=1 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:06 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Nov 23 08:01:06 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1f deep-scrub starts
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1f deep-scrub ok
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.279025078s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1153.944335938s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.278964996s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1153.944335938s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.355095863s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1154.021118164s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.354991913s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1154.021118164s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 54 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:07 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 54 pg[7.6( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:08 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.17 deep-scrub starts
Nov 23 08:01:08 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.17 deep-scrub ok
Nov 23 08:01:08 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 55 pg[7.6( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=54/55 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=35'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:08 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 55 pg[7.e( v 35'39 lc 35'17 (0'0,35'39] local-lis/les=54/55 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:09 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.385271072s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1156.089965820s@ mbc={255={}}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:09 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.385170937s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1156.089965820s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:09 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.384820938s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1156.089965820s@ mbc={255={}}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:09 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.384694099s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1156.089965820s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:10 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Nov 23 08:01:10 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Nov 23 08:01:10 np0005532585.localdomain sudo[55890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqyekieujqrnxeylnghpobhxcmpicagm ; /usr/bin/python3
Nov 23 08:01:10 np0005532585.localdomain sudo[55890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:10 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.14 deep-scrub starts
Nov 23 08:01:10 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.14 deep-scrub ok
Nov 23 08:01:10 np0005532585.localdomain python3[55892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:10 np0005532585.localdomain sudo[55890]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:10 np0005532585.localdomain sudo[55933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-covzhqdvbmoydtusukuumjmcwltpzkdd ; /usr/bin/python3
Nov 23 08:01:10 np0005532585.localdomain sudo[55933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:10 np0005532585.localdomain python3[55935]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884870.2241414-92282-101044522076672/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=8a18e979d41caf333cb312628abb5051e6d0049c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:10 np0005532585.localdomain sudo[55933]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:11 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 58 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=58 pruub=15.010090828s) [3,2,1] r=0 lpr=58 pi=[42,58)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1161.739379883s@ mbc={}] start_peering_interval up [5,1,3] -> [3,2,1], acting [5,1,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:11 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 58 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=58 pruub=15.010090828s) [3,2,1] r=0 lpr=58 pi=[42,58)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1161.739379883s@ mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:12 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Nov 23 08:01:12 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Nov 23 08:01:12 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.14 deep-scrub starts
Nov 23 08:01:12 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.14 deep-scrub ok
Nov 23 08:01:12 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 59 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=58/59 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=58) [3,2,1] r=0 lpr=58 pi=[42,58)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:13 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 60 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=11.104165077s) [0,4,2] r=-1 lpr=60 pi=[44,60)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1159.868652344s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,2], acting [2,1,3] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:13 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 60 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=11.103945732s) [0,4,2] r=-1 lpr=60 pi=[44,60)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1159.868652344s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:13 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 60 pg[7.9( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60) [0,4,2] r=0 lpr=60 pi=[44,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:14 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Nov 23 08:01:14 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Nov 23 08:01:14 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Nov 23 08:01:14 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Nov 23 08:01:14 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 61 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=60/61 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60) [0,4,2] r=0 lpr=60 pi=[44,60)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:15 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.9 deep-scrub starts
Nov 23 08:01:15 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.9 deep-scrub ok
Nov 23 08:01:15 np0005532585.localdomain sudo[55996]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysfeakdynsmhgkgndspkfynofztkhvqm ; /usr/bin/python3
Nov 23 08:01:15 np0005532585.localdomain sudo[55996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:15 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 62 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=11.047682762s) [4,0,5] r=-1 lpr=62 pi=[46,62)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1161.944580078s@ mbc={}] start_peering_interval up [3,1,5] -> [4,0,5], acting [3,1,5] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:15 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 62 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=11.047437668s) [4,0,5] r=-1 lpr=62 pi=[46,62)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1161.944580078s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:15 np0005532585.localdomain python3[55998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:15 np0005532585.localdomain sudo[55996]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:15 np0005532585.localdomain sudo[56039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btrcounkoncslwhrcpljayndfwhymoat ; /usr/bin/python3
Nov 23 08:01:15 np0005532585.localdomain sudo[56039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:16 np0005532585.localdomain python3[56041]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884875.3758018-92282-218969318650132/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=ae43e71821d6a319ccba3331b262b98567ce770b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:16 np0005532585.localdomain sudo[56039]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:16 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Nov 23 08:01:16 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Nov 23 08:01:17 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 62 pg[7.a( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=62) [4,0,5] r=1 lpr=62 pi=[46,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:18 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1e deep-scrub starts
Nov 23 08:01:18 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1e deep-scrub ok
Nov 23 08:01:19 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.b scrub starts
Nov 23 08:01:19 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.b scrub ok
Nov 23 08:01:20 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.18 deep-scrub starts
Nov 23 08:01:20 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.18 deep-scrub ok
Nov 23 08:01:21 np0005532585.localdomain sudo[56101]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtmorcyslahkmflhtxigqdrkirzqigym ; /usr/bin/python3
Nov 23 08:01:21 np0005532585.localdomain sudo[56101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:21 np0005532585.localdomain python3[56103]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:21 np0005532585.localdomain sudo[56101]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:21 np0005532585.localdomain sudo[56146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axznnlcgwiaujheybkjqtoyaqvhvmlfm ; /usr/bin/python3
Nov 23 08:01:21 np0005532585.localdomain sudo[56146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:21 np0005532585.localdomain python3[56148]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884881.2254841-92640-206212653887107/source _original_basename=tmpj1qt_tov follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:21 np0005532585.localdomain sudo[56146]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:22 np0005532585.localdomain sudo[56208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggodufwoaujbfuwdtxomaakmwhmeglcq ; /usr/bin/python3
Nov 23 08:01:22 np0005532585.localdomain sudo[56208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:22 np0005532585.localdomain python3[56210]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:22 np0005532585.localdomain sudo[56208]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:23 np0005532585.localdomain sudo[56251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkuebkdvrivxdulklowdawvrtsznpjcd ; /usr/bin/python3
Nov 23 08:01:23 np0005532585.localdomain sudo[56251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:23 np0005532585.localdomain python3[56253]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884882.685829-92819-36251608125809/source _original_basename=tmp0gqx0hk1 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:23 np0005532585.localdomain sudo[56251]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:23 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 64 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=50/51 n=1 ec=42/33 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=12.218671799s) [2,3,4] r=-1 lpr=64 pi=[50,64)/1 crt=35'39 mlcod 0'0 active pruub 1174.969726562s@ mbc={255={}}] start_peering_interval up [0,5,4] -> [2,3,4], acting [0,5,4] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:23 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 64 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=50/51 n=1 ec=42/33 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=12.218530655s) [2,3,4] r=-1 lpr=64 pi=[50,64)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1174.969726562s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:23 np0005532585.localdomain sudo[56281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uupevoomybyfynjzpbqvtqnquwutfvoh ; /usr/bin/python3
Nov 23 08:01:23 np0005532585.localdomain sudo[56281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:23 np0005532585.localdomain python3[56283]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Nov 23 08:01:23 np0005532585.localdomain crontab[56284]: (root) LIST (root)
Nov 23 08:01:23 np0005532585.localdomain crontab[56285]: (root) REPLACE (root)
Nov 23 08:01:23 np0005532585.localdomain sudo[56281]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:23 np0005532585.localdomain sudo[56299]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dymqncgvbizwkfwsesknqsudvqxodwvj ; /usr/bin/python3
Nov 23 08:01:23 np0005532585.localdomain sudo[56299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:24 np0005532585.localdomain python3[56301]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:01:24 np0005532585.localdomain sudo[56299]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:24 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Nov 23 08:01:24 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Nov 23 08:01:24 np0005532585.localdomain sudo[56349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvtoxziqxgwioqxjbfweituktkyyuidz ; /usr/bin/python3
Nov 23 08:01:24 np0005532585.localdomain sudo[56349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:01:24 np0005532585.localdomain podman[56351]: 2025-11-23 08:01:24.605757137 +0000 UTC m=+0.075081507 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd)
Nov 23 08:01:24 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 64 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=50/50 les/c/f=51/51/0 sis=64) [2,3,4] r=1 lpr=64 pi=[50,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:24 np0005532585.localdomain sudo[56349]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:24 np0005532585.localdomain podman[56351]: 2025-11-23 08:01:24.773954578 +0000 UTC m=+0.243278918 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, 
distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:01:24 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:01:24 np0005532585.localdomain sudo[56396]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqimbpghrfzqmamakjadpmxcouhkndvu ; /usr/bin/python3
Nov 23 08:01:24 np0005532585.localdomain sudo[56396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:24 np0005532585.localdomain sudo[56396]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:25 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Nov 23 08:01:25 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Nov 23 08:01:25 np0005532585.localdomain sudo[56501]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yshykwaqnahgewfhfixilreenpfzvvix ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884885.1060352-92926-116155042352680/async_wrapper.py 660982539111 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884885.1060352-92926-116155042352680/AnsiballZ_command.py _
Nov 23 08:01:25 np0005532585.localdomain sudo[56501]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 08:01:25 np0005532585.localdomain ansible-async_wrapper.py[56503]: Invoked with 660982539111 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884885.1060352-92926-116155042352680/AnsiballZ_command.py _
Nov 23 08:01:25 np0005532585.localdomain ansible-async_wrapper.py[56506]: Starting module and watcher
Nov 23 08:01:25 np0005532585.localdomain ansible-async_wrapper.py[56506]: Start watching 56507 (3600)
Nov 23 08:01:25 np0005532585.localdomain ansible-async_wrapper.py[56507]: Start module (56507)
Nov 23 08:01:25 np0005532585.localdomain ansible-async_wrapper.py[56503]: Return async_wrapper task started.
Nov 23 08:01:25 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 66 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=52/53 n=1 ec=42/33 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=12.301605225s) [2,3,1] r=-1 lpr=66 pi=[52,66)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1177.067016602s@ mbc={}] start_peering_interval up [4,0,2] -> [2,3,1], acting [4,0,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:25 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 66 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=52/53 n=1 ec=42/33 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=12.301326752s) [2,3,1] r=-1 lpr=66 pi=[52,66)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1177.067016602s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:25 np0005532585.localdomain sudo[56501]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:25 np0005532585.localdomain sudo[56522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqrcejflhuxbogdsjqjagplxeyvzvfgc ; /usr/bin/python3
Nov 23 08:01:25 np0005532585.localdomain sudo[56522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:25 np0005532585.localdomain python3[56527]: ansible-ansible.legacy.async_status Invoked with jid=660982539111.56503 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:01:25 np0005532585.localdomain sudo[56522]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:26 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.f scrub starts
Nov 23 08:01:26 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.f scrub ok
Nov 23 08:01:26 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 66 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=52/52 les/c/f=53/53/0 sis=66) [2,3,1] r=1 lpr=66 pi=[52,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:27 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Nov 23 08:01:27 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Nov 23 08:01:27 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 68 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68) [3,1,5] r=0 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:27 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 68 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=54/55 n=1 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=12.778108597s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 crt=35'39 mlcod 35'39 active pruub 1179.634155273s@ mbc={255={}}] start_peering_interval up [0,2,4] -> [3,1,5], acting [0,2,4] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:27 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 68 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=54/55 n=1 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=12.778008461s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1179.634155273s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:28 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Nov 23 08:01:28 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Nov 23 08:01:28 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 69 pg[7.e( v 35'39 lc 35'17 (0'0,35'39] local-lis/les=68/69 n=1 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68) [3,1,5] r=0 lpr=68 pi=[54,68)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    (file: /etc/puppet/hiera.yaml)
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Warning: Undefined variable '::deploy_config_name';
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    (file & line not available)
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    (file & line not available)
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.12 seconds
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Notice: Applied catalog in 0.03 seconds
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Application:
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    Initial environment: production
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    Converged environment: production
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:          Run mode: user
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Changes:
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Events:
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Resources:
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:             Total: 10
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Time:
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:          Schedule: 0.00
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:              File: 0.00
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:              Exec: 0.01
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:            Augeas: 0.01
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    Transaction evaluation: 0.03
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    Catalog application: 0.03
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:    Config retrieval: 0.15
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:          Last run: 1763884889
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:        Filebucket: 0.00
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:             Total: 0.04
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]: Version:
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:            Config: 1763884889
Nov 23 08:01:29 np0005532585.localdomain puppet-user[56525]:            Puppet: 7.10.0
Nov 23 08:01:29 np0005532585.localdomain ansible-async_wrapper.py[56507]: Module complete (56507)
Nov 23 08:01:29 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 70 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=56/57 n=1 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.056187630s) [0,4,5] r=-1 lpr=70 pi=[56,70)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1178.083862305s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,5], acting [2,1,3] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 08:01:29 np0005532585.localdomain ceph-osd[32858]: osd.3 pg_epoch: 70 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=56/57 n=1 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.055901527s) [0,4,5] r=-1 lpr=70 pi=[56,70)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1178.083862305s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 08:01:29 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 70 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70) [0,4,5] r=0 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 08:01:29 np0005532585.localdomain sudo[56639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:01:29 np0005532585.localdomain sudo[56639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:01:29 np0005532585.localdomain sudo[56639]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:29 np0005532585.localdomain sudo[56654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:01:29 np0005532585.localdomain sudo[56654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:01:30 np0005532585.localdomain sudo[56654]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:30 np0005532585.localdomain ansible-async_wrapper.py[56506]: Done in kid B.
Nov 23 08:01:30 np0005532585.localdomain ceph-osd[31905]: osd.0 pg_epoch: 71 pg[7.f( v 35'39 lc 35'1 (0'0,35'39] local-lis/les=70/71 n=1 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70) [0,4,5] r=0 lpr=70 pi=[56,70)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+3)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 08:01:31 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Nov 23 08:01:31 np0005532585.localdomain sudo[56700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:01:31 np0005532585.localdomain sudo[56700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:01:31 np0005532585.localdomain sudo[56700]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:31 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Nov 23 08:01:32 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Nov 23 08:01:32 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Nov 23 08:01:33 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Nov 23 08:01:35 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.f deep-scrub starts
Nov 23 08:01:35 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.f deep-scrub ok
Nov 23 08:01:36 np0005532585.localdomain sudo[56728]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkqusmqxozjqyohwfqdqlxfyvpwufirg ; /usr/bin/python3
Nov 23 08:01:36 np0005532585.localdomain sudo[56728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:36 np0005532585.localdomain python3[56730]: ansible-ansible.legacy.async_status Invoked with jid=660982539111.56503 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:01:36 np0005532585.localdomain sudo[56728]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:36 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.b deep-scrub starts
Nov 23 08:01:36 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.b deep-scrub ok
Nov 23 08:01:36 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.a scrub starts
Nov 23 08:01:36 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.a scrub ok
Nov 23 08:01:36 np0005532585.localdomain sudo[56744]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iriyltodrjgjgwhutbpeobmglrizyxlq ; /usr/bin/python3
Nov 23 08:01:36 np0005532585.localdomain sudo[56744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:36 np0005532585.localdomain python3[56746]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:01:36 np0005532585.localdomain sudo[56744]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:37 np0005532585.localdomain sudo[56760]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glocuwqbmoacrqhqvtraabqjsmegkiho ; /usr/bin/python3
Nov 23 08:01:37 np0005532585.localdomain sudo[56760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:37 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Nov 23 08:01:37 np0005532585.localdomain python3[56762]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:01:37 np0005532585.localdomain sudo[56760]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:37 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Nov 23 08:01:37 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Nov 23 08:01:37 np0005532585.localdomain sudo[56810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aglxgvlrhhwgrosjjcxzddjgdtlrwdnm ; /usr/bin/python3
Nov 23 08:01:37 np0005532585.localdomain sudo[56810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:37 np0005532585.localdomain python3[56812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:37 np0005532585.localdomain sudo[56810]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:37 np0005532585.localdomain sudo[56828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-praituzsmapegchblfvflvzdvypfniqq ; /usr/bin/python3
Nov 23 08:01:37 np0005532585.localdomain sudo[56828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:38 np0005532585.localdomain python3[56830]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpnb1s9nrx recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:01:38 np0005532585.localdomain sudo[56828]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:38 np0005532585.localdomain sudo[56858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxkjjocfopvvytprfrfdpialqnvcgroz ; /usr/bin/python3
Nov 23 08:01:38 np0005532585.localdomain sudo[56858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:38 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.19 deep-scrub starts
Nov 23 08:01:38 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.19 deep-scrub ok
Nov 23 08:01:38 np0005532585.localdomain python3[56860]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:38 np0005532585.localdomain sudo[56858]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:38 np0005532585.localdomain sudo[56874]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djyjbpbzqbxxofnjrgyljfsdshjytizn ; /usr/bin/python3
Nov 23 08:01:38 np0005532585.localdomain sudo[56874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:39 np0005532585.localdomain sudo[56874]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:39 np0005532585.localdomain sudo[56962]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rznqtjwyzbzsiytaqxdwbzjmdgbuthaa ; /usr/bin/python3
Nov 23 08:01:39 np0005532585.localdomain sudo[56962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:39 np0005532585.localdomain python3[56964]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 08:01:39 np0005532585.localdomain sudo[56962]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:40 np0005532585.localdomain sudo[56981]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gafvrjhmtitbdmmjmxkoxulnplfiriwf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:40 np0005532585.localdomain sudo[56981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:40 np0005532585.localdomain python3[56983]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:40 np0005532585.localdomain sudo[56981]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:40 np0005532585.localdomain sudo[56997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oinapfjpxkigdxgenawoixftebsqzvgl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:40 np0005532585.localdomain sudo[56997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:40 np0005532585.localdomain sudo[56997]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:40 np0005532585.localdomain sudo[57013]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmxafsafrvhpkfgejmqoowuetgewddis ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:41 np0005532585.localdomain sudo[57013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:41 np0005532585.localdomain python3[57015]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:01:41 np0005532585.localdomain sudo[57013]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:41 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.c scrub starts
Nov 23 08:01:41 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.c scrub ok
Nov 23 08:01:41 np0005532585.localdomain sudo[57063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbpzgrrconqzgbacuyzkcpzohlwjwunm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:41 np0005532585.localdomain sudo[57063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:41 np0005532585.localdomain python3[57065]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:41 np0005532585.localdomain sudo[57063]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:41 np0005532585.localdomain sudo[57081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esccavhqrjixugmhlfbdczqslonsyruu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:41 np0005532585.localdomain sudo[57081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:41 np0005532585.localdomain python3[57083]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:41 np0005532585.localdomain sudo[57081]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:42 np0005532585.localdomain sudo[57143]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uibedblvbcoqmtxmomkhrjoqllubpbig ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:42 np0005532585.localdomain sudo[57143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:42 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Nov 23 08:01:42 np0005532585.localdomain python3[57145]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:42 np0005532585.localdomain sudo[57143]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:42 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Nov 23 08:01:42 np0005532585.localdomain sudo[57161]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duvojywnatjteegextowdkefivbgbbex ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:42 np0005532585.localdomain sudo[57161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:42 np0005532585.localdomain python3[57163]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:42 np0005532585.localdomain sudo[57161]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:42 np0005532585.localdomain sudo[57223]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prkspthldfsezjrvkekzsehaubafzxqr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:42 np0005532585.localdomain sudo[57223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:43 np0005532585.localdomain python3[57225]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:43 np0005532585.localdomain sudo[57223]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:43 np0005532585.localdomain sudo[57241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwzprteycxuvlrkvxymtofyvpcvocjcq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:43 np0005532585.localdomain sudo[57241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:43 np0005532585.localdomain python3[57243]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:43 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Nov 23 08:01:43 np0005532585.localdomain sudo[57241]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:43 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Nov 23 08:01:43 np0005532585.localdomain sudo[57303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwvzqkkbgellxkwjxizhpztvambkwpvr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:43 np0005532585.localdomain sudo[57303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:43 np0005532585.localdomain python3[57305]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:43 np0005532585.localdomain sudo[57303]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:43 np0005532585.localdomain sudo[57321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmyojysnftjeimjociiqhwazbgisince ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:43 np0005532585.localdomain sudo[57321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:44 np0005532585.localdomain python3[57323]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:44 np0005532585.localdomain sudo[57321]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:44 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Nov 23 08:01:44 np0005532585.localdomain sudo[57351]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvxtvpanbwjytymffvctbncheqmsencm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:44 np0005532585.localdomain sudo[57351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:44 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Nov 23 08:01:44 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.4 deep-scrub starts
Nov 23 08:01:44 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.4 deep-scrub ok
Nov 23 08:01:44 np0005532585.localdomain python3[57353]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:01:44 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:01:44 np0005532585.localdomain systemd-rc-local-generator[57375]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:01:44 np0005532585.localdomain systemd-sysv-generator[57381]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:01:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:01:44 np0005532585.localdomain sudo[57351]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:45 np0005532585.localdomain sudo[57436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvutizcvpywaghegfgtbblrrllwzlwum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:45 np0005532585.localdomain sudo[57436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:45 np0005532585.localdomain python3[57438]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:45 np0005532585.localdomain sudo[57436]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:45 np0005532585.localdomain sudo[57454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laiptthcfsoeocempvysdlbwdmcquhnd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:45 np0005532585.localdomain sudo[57454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:45 np0005532585.localdomain python3[57456]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:45 np0005532585.localdomain sudo[57454]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:46 np0005532585.localdomain sudo[57516]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zykrxzmnfmaayncjdqcscyrvlxzxftci ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:46 np0005532585.localdomain sudo[57516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:46 np0005532585.localdomain python3[57518]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:01:46 np0005532585.localdomain sudo[57516]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:46 np0005532585.localdomain sudo[57534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rplzawemwoxrljzpyvbzpbwjbrvammzs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:46 np0005532585.localdomain sudo[57534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:46 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Nov 23 08:01:46 np0005532585.localdomain python3[57536]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:46 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Nov 23 08:01:46 np0005532585.localdomain sudo[57534]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:46 np0005532585.localdomain sudo[57564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjgqwdqtkgloainsdqkdzxmfixusijln ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:46 np0005532585.localdomain sudo[57564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:46 np0005532585.localdomain python3[57566]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:01:46 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:01:47 np0005532585.localdomain systemd-rc-local-generator[57587]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:01:47 np0005532585.localdomain systemd-sysv-generator[57590]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:01:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:01:47 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 08:01:47 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 08:01:47 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 08:01:47 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 08:01:47 np0005532585.localdomain sudo[57564]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:47 np0005532585.localdomain sudo[57621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjtqiehzdimyiwhfkahhqphttdrvqmre ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:47 np0005532585.localdomain sudo[57621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:47 np0005532585.localdomain python3[57623]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 08:01:47 np0005532585.localdomain sudo[57621]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:48 np0005532585.localdomain sudo[57637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egzrlnpnnlfufgwyptyfbdyxgottumyr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:48 np0005532585.localdomain sudo[57637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:48 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Nov 23 08:01:48 np0005532585.localdomain sudo[57637]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:48 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Nov 23 08:01:48 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Nov 23 08:01:48 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Nov 23 08:01:48 np0005532585.localdomain sudo[57680]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldlcskgpytlfijazxqgqovifmxjsedht ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:01:48 np0005532585.localdomain sudo[57680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:49 np0005532585.localdomain python3[57682]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 08:01:49 np0005532585.localdomain podman[57754]: 2025-11-23 08:01:49.467780956 +0000 UTC m=+0.075042696 container create 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 23 08:01:49 np0005532585.localdomain podman[57761]: 2025-11-23 08:01:49.500796988 +0000 UTC m=+0.096265255 container create 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_id=tripleo_step2, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_virtqemud_init_logs, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1)
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175.scope.
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a.scope.
Nov 23 08:01:49 np0005532585.localdomain podman[57754]: 2025-11-23 08:01:49.432494988 +0000 UTC m=+0.039756728 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:01:49 np0005532585.localdomain podman[57761]: 2025-11-23 08:01:49.43635206 +0000 UTC m=+0.031820317 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:01:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff643707b40d1f4d028ddd9677b38d4c09285d4aaee2a2759e81caea717feb00/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:01:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Nov 23 08:01:49 np0005532585.localdomain podman[57754]: 2025-11-23 08:01:49.55544624 +0000 UTC m=+0.162707980 container init 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_id=tripleo_step2, container_name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:01:49 np0005532585.localdomain podman[57754]: 2025-11-23 08:01:49.564151263 +0000 UTC m=+0.171413003 container start 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 23 08:01:49 np0005532585.localdomain python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: libpod-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175.scope: Deactivated successfully.
Nov 23 08:01:49 np0005532585.localdomain podman[57761]: 2025-11-23 08:01:49.607105934 +0000 UTC m=+0.202574191 container init 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, tcib_managed=true, container_name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044)
Nov 23 08:01:49 np0005532585.localdomain podman[57761]: 2025-11-23 08:01:49.614101469 +0000 UTC m=+0.209569756 container start 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step2, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:01:49 np0005532585.localdomain python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: libpod-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a.scope: Deactivated successfully.
Nov 23 08:01:49 np0005532585.localdomain podman[57790]: 2025-11-23 08:01:49.640986771 +0000 UTC m=+0.059970207 container died 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, container_name=nova_compute_init_log, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:01:49 np0005532585.localdomain podman[57814]: 2025-11-23 08:01:49.679991507 +0000 UTC m=+0.043124386 container died 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_virtqemud_init_logs, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']})
Nov 23 08:01:49 np0005532585.localdomain podman[57791]: 2025-11-23 08:01:49.710377112 +0000 UTC m=+0.127374610 container cleanup 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: libpod-conmon-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175.scope: Deactivated successfully.
Nov 23 08:01:49 np0005532585.localdomain podman[57814]: 2025-11-23 08:01:49.750307535 +0000 UTC m=+0.113440374 container cleanup 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=nova_virtqemud_init_logs, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com)
Nov 23 08:01:49 np0005532585.localdomain systemd[1]: libpod-conmon-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a.scope: Deactivated successfully.
Nov 23 08:01:50 np0005532585.localdomain podman[57932]: 2025-11-23 08:01:50.036197453 +0000 UTC m=+0.058274319 container create 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container)
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: Started libpod-conmon-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope.
Nov 23 08:01:50 np0005532585.localdomain podman[57947]: 2025-11-23 08:01:50.072905602 +0000 UTC m=+0.070193286 container create 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, version=17.1.12, container_name=create_haproxy_wrapper, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:01:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cc40e2e89ec91462a29529e884582c68bbf460125466873b1904744d0a186a6/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 23 08:01:50 np0005532585.localdomain podman[57932]: 2025-11-23 08:01:50.090538506 +0000 UTC m=+0.112615382 container init 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: Started libpod-conmon-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope.
Nov 23 08:01:50 np0005532585.localdomain podman[57932]: 2025-11-23 08:01:50.098643242 +0000 UTC m=+0.120720108 container start 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.41.4, container_name=create_virtlogd_wrapper, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1)
Nov 23 08:01:50 np0005532585.localdomain podman[57932]: 2025-11-23 08:01:50.098757346 +0000 UTC m=+0.120834212 container attach 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.component=openstack-nova-libvirt-container, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 23 08:01:50 np0005532585.localdomain podman[57932]: 2025-11-23 08:01:50.007561789 +0000 UTC m=+0.029638665 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:01:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 08:01:50 np0005532585.localdomain podman[57947]: 2025-11-23 08:01:50.128643525 +0000 UTC m=+0.125931209 container init 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack 
TripleO Team, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, vcs-type=git, container_name=create_haproxy_wrapper, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:01:50 np0005532585.localdomain podman[57947]: 2025-11-23 08:01:50.033515714 +0000 UTC m=+0.030803468 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 08:01:50 np0005532585.localdomain podman[57947]: 2025-11-23 08:01:50.134096464 +0000 UTC m=+0.131384178 container start 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=create_haproxy_wrapper, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:01:50 np0005532585.localdomain podman[57947]: 2025-11-23 08:01:50.134375592 +0000 UTC m=+0.131663346 container attach 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=create_haproxy_wrapper, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:01:50 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.f scrub starts
Nov 23 08:01:50 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.f scrub ok
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22-merged.mount: Deactivated successfully.
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a-userdata-shm.mount: Deactivated successfully.
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ff643707b40d1f4d028ddd9677b38d4c09285d4aaee2a2759e81caea717feb00-merged.mount: Deactivated successfully.
Nov 23 08:01:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175-userdata-shm.mount: Deactivated successfully.
Nov 23 08:01:51 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Nov 23 08:01:51 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Nov 23 08:01:51 np0005532585.localdomain ovs-vsctl[58038]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Nov 23 08:01:52 np0005532585.localdomain systemd[1]: libpod-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope: Deactivated successfully.
Nov 23 08:01:52 np0005532585.localdomain systemd[1]: libpod-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope: Consumed 2.141s CPU time.
Nov 23 08:01:52 np0005532585.localdomain podman[57932]: 2025-11-23 08:01:52.246208216 +0000 UTC m=+2.268285142 container died 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:35:22Z, container_name=create_virtlogd_wrapper)
Nov 23 08:01:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f-userdata-shm.mount: Deactivated successfully.
Nov 23 08:01:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1cc40e2e89ec91462a29529e884582c68bbf460125466873b1904744d0a186a6-merged.mount: Deactivated successfully.
Nov 23 08:01:52 np0005532585.localdomain podman[58187]: 2025-11-23 08:01:52.349148645 +0000 UTC m=+0.088693205 container cleanup 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, release=1761123044, container_name=create_virtlogd_wrapper, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:01:52 np0005532585.localdomain systemd[1]: libpod-conmon-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope: Deactivated successfully.
Nov 23 08:01:52 np0005532585.localdomain python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Nov 23 08:01:52 np0005532585.localdomain systemd[1]: libpod-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope: Deactivated successfully.
Nov 23 08:01:53 np0005532585.localdomain systemd[1]: libpod-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope: Consumed 2.149s CPU time.
Nov 23 08:01:53 np0005532585.localdomain podman[57947]: 2025-11-23 08:01:53.001604149 +0000 UTC m=+2.998891893 container died 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, container_name=create_haproxy_wrapper, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:01:53 np0005532585.localdomain podman[58227]: 2025-11-23 08:01:53.09225526 +0000 UTC m=+0.081914787 container cleanup 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, container_name=create_haproxy_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step2, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:01:53 np0005532585.localdomain systemd[1]: libpod-conmon-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope: Deactivated successfully.
Nov 23 08:01:53 np0005532585.localdomain python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Nov 23 08:01:53 np0005532585.localdomain sudo[57680]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473-merged.mount: Deactivated successfully.
Nov 23 08:01:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218-userdata-shm.mount: Deactivated successfully.
Nov 23 08:01:53 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Nov 23 08:01:53 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Nov 23 08:01:53 np0005532585.localdomain sudo[58281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxqcglqjghzjshcrkrpupgjuoaucbaaf ; /usr/bin/python3
Nov 23 08:01:53 np0005532585.localdomain sudo[58281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:53 np0005532585.localdomain python3[58283]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:53 np0005532585.localdomain sudo[58281]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:54 np0005532585.localdomain sudo[58329]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkkgzpjjnxxefcbjdwcwcxpcxtjqhuem ; /usr/bin/python3
Nov 23 08:01:54 np0005532585.localdomain sudo[58329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:54 np0005532585.localdomain sudo[58329]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:54 np0005532585.localdomain sudo[58372]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhnewrovevoaskvsymvxeaoxelznltww ; /usr/bin/python3
Nov 23 08:01:54 np0005532585.localdomain sudo[58372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:54 np0005532585.localdomain sudo[58372]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:01:55 np0005532585.localdomain sudo[58413]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daaoduzcmimmxgrojpmbfdtlyotragjg ; /usr/bin/python3
Nov 23 08:01:55 np0005532585.localdomain systemd[1]: tmp-crun.qR1JJo.mount: Deactivated successfully.
Nov 23 08:01:55 np0005532585.localdomain sudo[58413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:55 np0005532585.localdomain podman[58389]: 2025-11-23 08:01:55.053402754 +0000 UTC m=+0.102601469 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc.)
Nov 23 08:01:55 np0005532585.localdomain python3[58421]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005532585 step=2 update_config_hash_only=False
Nov 23 08:01:55 np0005532585.localdomain sudo[58413]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:55 np0005532585.localdomain podman[58389]: 2025-11-23 08:01:55.254871102 +0000 UTC m=+0.304069757 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 23 08:01:55 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:01:55 np0005532585.localdomain sudo[58448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqbtxbjqimntldusgovyebvofpalwmqh ; /usr/bin/python3
Nov 23 08:01:55 np0005532585.localdomain sudo[58448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:55 np0005532585.localdomain python3[58450]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:01:55 np0005532585.localdomain sudo[58448]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:55 np0005532585.localdomain sudo[58464]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amhtdduvosgcpqucknvohvhejvkxjewg ; /usr/bin/python3
Nov 23 08:01:55 np0005532585.localdomain sudo[58464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:01:56 np0005532585.localdomain python3[58466]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 08:01:56 np0005532585.localdomain sudo[58464]: pam_unix(sudo:session): session closed for user root
Nov 23 08:01:56 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Nov 23 08:01:56 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Nov 23 08:01:56 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Nov 23 08:01:56 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Nov 23 08:01:57 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Nov 23 08:01:57 np0005532585.localdomain ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Nov 23 08:01:57 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Nov 23 08:01:57 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Nov 23 08:01:59 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.b scrub starts
Nov 23 08:01:59 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.b scrub ok
Nov 23 08:02:00 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Nov 23 08:02:00 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Nov 23 08:02:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:02:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4438 writes, 20K keys, 4438 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4438 writes, 447 syncs, 9.93 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1185 writes, 4110 keys, 1185 commit groups, 1.0 writes per commit group, ingest: 2.02 MB, 0.00 MB/s
                                                          Interval WAL: 1185 writes, 305 syncs, 3.89 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 08:02:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:02:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 5124 writes, 22K keys, 5124 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5123 writes, 575 syncs, 8.91 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1737 writes, 6023 keys, 1737 commit groups, 1.0 writes per commit group, ingest: 2.57 MB, 0.00 MB/s
                                                          Interval WAL: 1736 writes, 377 syncs, 4.60 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 08:02:07 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Nov 23 08:02:15 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Nov 23 08:02:15 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Nov 23 08:02:24 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Nov 23 08:02:24 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Nov 23 08:02:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:02:26 np0005532585.localdomain podman[58467]: 2025-11-23 08:02:26.021648679 +0000 UTC m=+0.080233258 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:02:26 np0005532585.localdomain podman[58467]: 2025-11-23 08:02:26.205266837 +0000 UTC m=+0.263851416 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:02:26 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:02:30 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.b scrub starts
Nov 23 08:02:30 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.b scrub ok
Nov 23 08:02:31 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Nov 23 08:02:31 np0005532585.localdomain sshd[58496]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:02:31 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Nov 23 08:02:31 np0005532585.localdomain sudo[58497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:02:31 np0005532585.localdomain sudo[58497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:02:31 np0005532585.localdomain sudo[58497]: pam_unix(sudo:session): session closed for user root
Nov 23 08:02:31 np0005532585.localdomain sudo[58512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 08:02:31 np0005532585.localdomain sudo[58512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:02:32 np0005532585.localdomain sudo[58512]: pam_unix(sudo:session): session closed for user root
Nov 23 08:02:32 np0005532585.localdomain sudo[58548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:02:32 np0005532585.localdomain sudo[58548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:02:32 np0005532585.localdomain sudo[58548]: pam_unix(sudo:session): session closed for user root
Nov 23 08:02:32 np0005532585.localdomain sudo[58563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:02:32 np0005532585.localdomain sudo[58563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:02:32 np0005532585.localdomain sudo[58563]: pam_unix(sudo:session): session closed for user root
Nov 23 08:02:33 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Nov 23 08:02:33 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Nov 23 08:02:33 np0005532585.localdomain sudo[58610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:02:33 np0005532585.localdomain sudo[58610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:02:33 np0005532585.localdomain sudo[58610]: pam_unix(sudo:session): session closed for user root
Nov 23 08:02:34 np0005532585.localdomain sshd[58496]: Invalid user blank from 177.174.105.113 port 43631
Nov 23 08:02:34 np0005532585.localdomain sshd[58496]: Connection closed by invalid user blank 177.174.105.113 port 43631 [preauth]
Nov 23 08:02:37 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.e scrub starts
Nov 23 08:02:37 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.e scrub ok
Nov 23 08:02:44 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Nov 23 08:02:44 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Nov 23 08:02:49 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.e scrub starts
Nov 23 08:02:49 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.e scrub ok
Nov 23 08:02:53 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Nov 23 08:02:53 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Nov 23 08:02:54 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Nov 23 08:02:54 np0005532585.localdomain ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Nov 23 08:02:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:02:57 np0005532585.localdomain podman[58625]: 2025-11-23 08:02:57.01799793 +0000 UTC m=+0.071809842 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:02:57 np0005532585.localdomain podman[58625]: 2025-11-23 08:02:57.224495095 +0000 UTC m=+0.278306987 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=)
Nov 23 08:02:57 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:03:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:03:28 np0005532585.localdomain podman[58654]: 2025-11-23 08:03:28.023699633 +0000 UTC m=+0.085225492 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:03:28 np0005532585.localdomain podman[58654]: 2025-11-23 08:03:28.238293827 +0000 UTC m=+0.299819696 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public)
Nov 23 08:03:28 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:03:33 np0005532585.localdomain sudo[58683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:03:33 np0005532585.localdomain sudo[58683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:03:33 np0005532585.localdomain sudo[58683]: pam_unix(sudo:session): session closed for user root
Nov 23 08:03:33 np0005532585.localdomain sudo[58698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:03:33 np0005532585.localdomain sudo[58698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:03:34 np0005532585.localdomain sudo[58698]: pam_unix(sudo:session): session closed for user root
Nov 23 08:03:35 np0005532585.localdomain sudo[58744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:03:35 np0005532585.localdomain sudo[58744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:03:35 np0005532585.localdomain sudo[58744]: pam_unix(sudo:session): session closed for user root
Nov 23 08:03:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:03:59 np0005532585.localdomain podman[58759]: 2025-11-23 08:03:59.01972001 +0000 UTC m=+0.081536332 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:03:59 np0005532585.localdomain podman[58759]: 2025-11-23 08:03:59.232460068 +0000 UTC m=+0.294276410 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:03:59 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:04:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:04:30 np0005532585.localdomain systemd[1]: tmp-crun.yoIh0o.mount: Deactivated successfully.
Nov 23 08:04:30 np0005532585.localdomain podman[58788]: 2025-11-23 08:04:30.027750836 +0000 UTC m=+0.085415848 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:04:30 np0005532585.localdomain podman[58788]: 2025-11-23 08:04:30.253264837 +0000 UTC m=+0.310929839 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Nov 23 08:04:30 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:04:35 np0005532585.localdomain sudo[58817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:04:35 np0005532585.localdomain sudo[58817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:04:35 np0005532585.localdomain sudo[58817]: pam_unix(sudo:session): session closed for user root
Nov 23 08:04:35 np0005532585.localdomain sudo[58832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 08:04:35 np0005532585.localdomain sudo[58832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:04:36 np0005532585.localdomain podman[58915]: 2025-11-23 08:04:36.207777544 +0000 UTC m=+0.092108379 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container)
Nov 23 08:04:36 np0005532585.localdomain podman[58915]: 2025-11-23 08:04:36.340535007 +0000 UTC m=+0.224865852 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main)
Nov 23 08:04:36 np0005532585.localdomain sudo[58832]: pam_unix(sudo:session): session closed for user root
Nov 23 08:04:36 np0005532585.localdomain sudo[58981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:04:36 np0005532585.localdomain sudo[58981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:04:36 np0005532585.localdomain sudo[58981]: pam_unix(sudo:session): session closed for user root
Nov 23 08:04:36 np0005532585.localdomain sudo[58996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:04:36 np0005532585.localdomain sudo[58996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:04:37 np0005532585.localdomain sudo[58996]: pam_unix(sudo:session): session closed for user root
Nov 23 08:04:37 np0005532585.localdomain sudo[59042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:04:37 np0005532585.localdomain sudo[59042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:04:37 np0005532585.localdomain sudo[59042]: pam_unix(sudo:session): session closed for user root
Nov 23 08:05:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:05:01 np0005532585.localdomain systemd[1]: tmp-crun.w39ePM.mount: Deactivated successfully.
Nov 23 08:05:01 np0005532585.localdomain podman[59057]: 2025-11-23 08:05:01.035150313 +0000 UTC m=+0.093427777 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z)
Nov 23 08:05:01 np0005532585.localdomain podman[59057]: 2025-11-23 08:05:01.251375845 +0000 UTC m=+0.309653319 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:05:01 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:05:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:05:32 np0005532585.localdomain systemd[1]: tmp-crun.xgp8Kq.mount: Deactivated successfully.
Nov 23 08:05:32 np0005532585.localdomain podman[59086]: 2025-11-23 08:05:32.024195129 +0000 UTC m=+0.082386967 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4)
Nov 23 08:05:32 np0005532585.localdomain podman[59086]: 2025-11-23 08:05:32.218211167 +0000 UTC m=+0.276403015 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Nov 23 08:05:32 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:05:32 np0005532585.localdomain sshd[59115]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:05:33 np0005532585.localdomain sshd[59115]: Invalid user pi from 185.156.73.233 port 28076
Nov 23 08:05:33 np0005532585.localdomain sshd[59115]: Connection closed by invalid user pi 185.156.73.233 port 28076 [preauth]
Nov 23 08:05:38 np0005532585.localdomain sudo[59117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:05:38 np0005532585.localdomain sudo[59117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:05:38 np0005532585.localdomain sudo[59117]: pam_unix(sudo:session): session closed for user root
Nov 23 08:05:38 np0005532585.localdomain sudo[59132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:05:38 np0005532585.localdomain sudo[59132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:05:38 np0005532585.localdomain sudo[59132]: pam_unix(sudo:session): session closed for user root
Nov 23 08:05:39 np0005532585.localdomain sudo[59179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:05:39 np0005532585.localdomain sudo[59179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:05:39 np0005532585.localdomain sudo[59179]: pam_unix(sudo:session): session closed for user root
Nov 23 08:05:46 np0005532585.localdomain sshd[59194]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:05:46 np0005532585.localdomain sshd[59194]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 08:05:46 np0005532585.localdomain sshd[59194]: Connection closed by 117.5.148.56 port 33122
Nov 23 08:06:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:06:03 np0005532585.localdomain systemd[1]: tmp-crun.82xQiu.mount: Deactivated successfully.
Nov 23 08:06:03 np0005532585.localdomain podman[59195]: 2025-11-23 08:06:03.030134502 +0000 UTC m=+0.085942354 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:06:03 np0005532585.localdomain podman[59195]: 2025-11-23 08:06:03.221451298 +0000 UTC m=+0.277259080 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 23 08:06:03 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:06:29 np0005532585.localdomain sudo[59269]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpqzogrgoenhjoricisegqqvmxhiiivc ; /usr/bin/python3
Nov 23 08:06:29 np0005532585.localdomain sudo[59269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:29 np0005532585.localdomain python3[59271]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:29 np0005532585.localdomain sudo[59269]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:29 np0005532585.localdomain sudo[59314]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-birlmcvfanjfqqudodrvkuqbblrequbr ; /usr/bin/python3
Nov 23 08:06:29 np0005532585.localdomain sudo[59314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:29 np0005532585.localdomain python3[59316]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885188.984136-98999-32296130414344/source _original_basename=tmp6hnezlw9 follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:29 np0005532585.localdomain sudo[59314]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:30 np0005532585.localdomain sudo[59344]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gywjakaflsfajktbumasazcweelwlhnw ; /usr/bin/python3
Nov 23 08:06:30 np0005532585.localdomain sudo[59344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:30 np0005532585.localdomain python3[59346]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:06:30 np0005532585.localdomain sudo[59344]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:31 np0005532585.localdomain sudo[59394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqapmnxsmzckhdwelqytryhmyqhleuxs ; /usr/bin/python3
Nov 23 08:06:31 np0005532585.localdomain sudo[59394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:31 np0005532585.localdomain sudo[59394]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:31 np0005532585.localdomain sudo[59412]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfdzxdsithafiwpzsgmmdryeeluhivvq ; /usr/bin/python3
Nov 23 08:06:31 np0005532585.localdomain sudo[59412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:31 np0005532585.localdomain sudo[59412]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:32 np0005532585.localdomain sudo[59516]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwaldfuzznxwnkohjpztgextcmcbclqa ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885191.729863-99301-101695479229977/async_wrapper.py 70114099573 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885191.729863-99301-101695479229977/AnsiballZ_command.py _
Nov 23 08:06:32 np0005532585.localdomain sudo[59516]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 08:06:32 np0005532585.localdomain ansible-async_wrapper.py[59518]: Invoked with 70114099573 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885191.729863-99301-101695479229977/AnsiballZ_command.py _
Nov 23 08:06:32 np0005532585.localdomain ansible-async_wrapper.py[59521]: Starting module and watcher
Nov 23 08:06:32 np0005532585.localdomain ansible-async_wrapper.py[59521]: Start watching 59522 (3600)
Nov 23 08:06:32 np0005532585.localdomain ansible-async_wrapper.py[59522]: Start module (59522)
Nov 23 08:06:32 np0005532585.localdomain ansible-async_wrapper.py[59518]: Return async_wrapper task started.
Nov 23 08:06:32 np0005532585.localdomain sudo[59516]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:32 np0005532585.localdomain sudo[59540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kijuraeqkfvzqkszajragzungyvfnzny ; /usr/bin/python3
Nov 23 08:06:32 np0005532585.localdomain sudo[59540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:32 np0005532585.localdomain python3[59542]: ansible-ansible.legacy.async_status Invoked with jid=70114099573.59518 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:06:32 np0005532585.localdomain sudo[59540]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:06:33 np0005532585.localdomain systemd[1]: tmp-crun.RHE8K2.mount: Deactivated successfully.
Nov 23 08:06:34 np0005532585.localdomain podman[59560]: 2025-11-23 08:06:33.999434325 +0000 UTC m=+0.062064125 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:06:34 np0005532585.localdomain podman[59560]: 2025-11-23 08:06:34.166799307 +0000 UTC m=+0.229429097 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:06:34 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]:    (file: /etc/puppet/hiera.yaml)
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]: Warning: Undefined variable '::deploy_config_name';
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]:    (file & line not available)
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]:    (file & line not available)
Nov 23 08:06:35 np0005532585.localdomain puppet-user[59526]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.11 seconds
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Notice: Applied catalog in 0.03 seconds
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Application:
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:    Initial environment: production
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:    Converged environment: production
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:          Run mode: user
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Changes:
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Events:
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Resources:
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:             Total: 10
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Time:
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:          Schedule: 0.00
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:              File: 0.00
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:              Exec: 0.01
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:            Augeas: 0.01
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:    Transaction evaluation: 0.03
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:    Catalog application: 0.03
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:    Config retrieval: 0.15
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:          Last run: 1763885196
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:        Filebucket: 0.00
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:             Total: 0.04
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]: Version:
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:            Config: 1763885195
Nov 23 08:06:36 np0005532585.localdomain puppet-user[59526]:            Puppet: 7.10.0
Nov 23 08:06:36 np0005532585.localdomain ansible-async_wrapper.py[59522]: Module complete (59522)
Nov 23 08:06:37 np0005532585.localdomain ansible-async_wrapper.py[59521]: Done in kid B.
Nov 23 08:06:39 np0005532585.localdomain sudo[59684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:06:39 np0005532585.localdomain sudo[59684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:06:39 np0005532585.localdomain sudo[59684]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:39 np0005532585.localdomain sudo[59699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:06:39 np0005532585.localdomain sudo[59699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:06:40 np0005532585.localdomain sudo[59699]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:40 np0005532585.localdomain sudo[59746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:06:40 np0005532585.localdomain sudo[59746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:06:40 np0005532585.localdomain sudo[59746]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:42 np0005532585.localdomain sudo[59774]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-earmigftmlmjeordtxuemdprwudvlyux ; /usr/bin/python3
Nov 23 08:06:42 np0005532585.localdomain sudo[59774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:42 np0005532585.localdomain python3[59776]: ansible-ansible.legacy.async_status Invoked with jid=70114099573.59518 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:06:42 np0005532585.localdomain sudo[59774]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:43 np0005532585.localdomain sudo[59790]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngsuvqexslmupqokxxhxwonxuseuwrhs ; /usr/bin/python3
Nov 23 08:06:43 np0005532585.localdomain sudo[59790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:43 np0005532585.localdomain python3[59792]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:06:43 np0005532585.localdomain sudo[59790]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:43 np0005532585.localdomain sudo[59806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bddaxknrxchzpqxenujmdpttosgkeoux ; /usr/bin/python3
Nov 23 08:06:43 np0005532585.localdomain sudo[59806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:43 np0005532585.localdomain python3[59808]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:06:44 np0005532585.localdomain sudo[59806]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:44 np0005532585.localdomain sudo[59856]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjigvmfgibhcgsuscaitdoxhfghqxxja ; /usr/bin/python3
Nov 23 08:06:44 np0005532585.localdomain sudo[59856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:44 np0005532585.localdomain python3[59858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:44 np0005532585.localdomain sudo[59856]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:44 np0005532585.localdomain sudo[59874]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtbzfcealdoxqjcexgreizqsbecgnqyi ; /usr/bin/python3
Nov 23 08:06:44 np0005532585.localdomain sudo[59874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:44 np0005532585.localdomain python3[59876]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpnq9niry8 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:06:44 np0005532585.localdomain sudo[59874]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:45 np0005532585.localdomain sudo[59904]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbanpsiaubzfudgqohcoasdmoizqxcrv ; /usr/bin/python3
Nov 23 08:06:45 np0005532585.localdomain sudo[59904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:45 np0005532585.localdomain python3[59906]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:45 np0005532585.localdomain sudo[59904]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:45 np0005532585.localdomain sudo[59920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqgidajkrptqftpzspabnffkkpssuifv ; /usr/bin/python3
Nov 23 08:06:45 np0005532585.localdomain sudo[59920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:45 np0005532585.localdomain sudo[59920]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:46 np0005532585.localdomain sudo[60007]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huphgmkhrwamhtxkztxobrhoegbpzywc ; /usr/bin/python3
Nov 23 08:06:46 np0005532585.localdomain sudo[60007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:46 np0005532585.localdomain python3[60009]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 08:06:46 np0005532585.localdomain sudo[60007]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:47 np0005532585.localdomain sudo[60026]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzheidiihaiajxmvfhqtnrpfkleevtyl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:47 np0005532585.localdomain sudo[60026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:47 np0005532585.localdomain python3[60028]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:47 np0005532585.localdomain sudo[60026]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:47 np0005532585.localdomain sudo[60042]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytazyesuydggxyonxhjqryubcsdnxkxn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:47 np0005532585.localdomain sudo[60042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:47 np0005532585.localdomain sudo[60042]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:48 np0005532585.localdomain sudo[60059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxiowmsrktdtwomrbvljzwzmuuhtlzip ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:48 np0005532585.localdomain sudo[60059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:48 np0005532585.localdomain python3[60061]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:06:48 np0005532585.localdomain sudo[60059]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:48 np0005532585.localdomain sudo[60109]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onlzgkttcwlawphlpsopvtphqqfodnkj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:48 np0005532585.localdomain sudo[60109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:48 np0005532585.localdomain python3[60111]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:48 np0005532585.localdomain sudo[60109]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:49 np0005532585.localdomain sudo[60127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyfcrkwoogikzifckupqkoqrmlsexnxz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:49 np0005532585.localdomain sudo[60127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:49 np0005532585.localdomain python3[60129]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:49 np0005532585.localdomain sudo[60127]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:49 np0005532585.localdomain sudo[60189]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhcjsteratdyzfhviwhbbikqxzdfoxug ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:49 np0005532585.localdomain sudo[60189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:49 np0005532585.localdomain python3[60191]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:49 np0005532585.localdomain sudo[60189]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:49 np0005532585.localdomain sudo[60207]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjynrpaoahidbssatrmrqneaqzynuxdz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:49 np0005532585.localdomain sudo[60207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:49 np0005532585.localdomain python3[60209]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:49 np0005532585.localdomain sudo[60207]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:50 np0005532585.localdomain sudo[60269]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfoyiefozvbcznmrzlnlswaeljdrdazt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:50 np0005532585.localdomain sudo[60269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:50 np0005532585.localdomain python3[60271]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:50 np0005532585.localdomain sudo[60269]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:50 np0005532585.localdomain sudo[60287]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxcurgjewmpsozlageqbnxnpnfmfwcfl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:50 np0005532585.localdomain sudo[60287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:50 np0005532585.localdomain python3[60289]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:50 np0005532585.localdomain sudo[60287]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:51 np0005532585.localdomain sudo[60349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-reiqhtokwtrdfkthkzvrljaahtkawahi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:51 np0005532585.localdomain sudo[60349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:51 np0005532585.localdomain python3[60351]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:51 np0005532585.localdomain sudo[60349]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:51 np0005532585.localdomain sudo[60367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khkmfpzgljslomuvmidlmkinobmlkzdn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:51 np0005532585.localdomain sudo[60367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:51 np0005532585.localdomain python3[60369]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:51 np0005532585.localdomain sudo[60367]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:51 np0005532585.localdomain sudo[60397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhibszlzqnekjskiqfadmtzafvxomdwg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:51 np0005532585.localdomain sudo[60397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:51 np0005532585.localdomain python3[60399]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:06:51 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:06:52 np0005532585.localdomain systemd-sysv-generator[60428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:06:52 np0005532585.localdomain systemd-rc-local-generator[60423]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:06:52 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:06:52 np0005532585.localdomain sudo[60397]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:52 np0005532585.localdomain sudo[60482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uazznaqumqkwqjhbegxvjihvzqqrcytr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:52 np0005532585.localdomain sudo[60482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:52 np0005532585.localdomain python3[60484]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:52 np0005532585.localdomain sudo[60482]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:52 np0005532585.localdomain sudo[60500]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aomrlxemuewqjvzvenylyilajmympdfh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:52 np0005532585.localdomain sudo[60500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:52 np0005532585.localdomain python3[60502]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:53 np0005532585.localdomain sudo[60500]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:53 np0005532585.localdomain sudo[60562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvhxnxtiitcdjofjpvvtngyuyxnmfdzc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:53 np0005532585.localdomain sudo[60562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:53 np0005532585.localdomain python3[60564]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:06:53 np0005532585.localdomain sudo[60562]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:53 np0005532585.localdomain sudo[60580]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpxebilxqhwpoyjtuvvzosvuencxurny ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:53 np0005532585.localdomain sudo[60580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:53 np0005532585.localdomain python3[60582]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:06:53 np0005532585.localdomain sudo[60580]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:54 np0005532585.localdomain sudo[60610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnjnopzkcjvulxyiudqqgmgornrlnxke ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:54 np0005532585.localdomain sudo[60610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:54 np0005532585.localdomain python3[60612]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:06:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:06:54 np0005532585.localdomain systemd-sysv-generator[60636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:06:54 np0005532585.localdomain systemd-rc-local-generator[60632]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:06:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:06:54 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 08:06:54 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 08:06:54 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 08:06:54 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 08:06:54 np0005532585.localdomain sudo[60610]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:55 np0005532585.localdomain sudo[60668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twxreipwqgtsoqgvmhwqshukwusvzdpe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:55 np0005532585.localdomain sudo[60668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:55 np0005532585.localdomain python3[60670]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 08:06:55 np0005532585.localdomain sudo[60668]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:55 np0005532585.localdomain sudo[60684]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luuemephxokgrtsvrbwjtwxdencdqpxv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:55 np0005532585.localdomain sudo[60684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:55 np0005532585.localdomain sudo[60684]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:56 np0005532585.localdomain sudo[60726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqjnymbkzeczdhpreiiwhbjbzwysfqnd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:06:56 np0005532585.localdomain sudo[60726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:06:56 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 08:06:57 np0005532585.localdomain podman[60884]: 2025-11-23 08:06:57.174063813 +0000 UTC m=+0.068027434 container create 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 23 08:06:57 np0005532585.localdomain podman[60913]: 2025-11-23 08:06:57.198972985 +0000 UTC m=+0.066878070 container create 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:35:22Z, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public)
Nov 23 08:06:57 np0005532585.localdomain podman[60912]: 2025-11-23 08:06:57.221834365 +0000 UTC m=+0.086982636 container create 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, container_name=nova_statedir_owner, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']})
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11.scope.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad/merged/scripts supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain podman[60914]: 2025-11-23 08:06:57.240722146 +0000 UTC m=+0.099057002 container create 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:57 np0005532585.localdomain podman[60884]: 2025-11-23 08:06:57.143633685 +0000 UTC m=+0.037597336 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f.scope.
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain podman[60913]: 2025-11-23 08:06:57.252939744 +0000 UTC m=+0.120844839 container init 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain podman[60913]: 2025-11-23 08:06:57.260202894 +0000 UTC m=+0.128107979 container start 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, container_name=nova_virtlogd_wrapper, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step3, release=1761123044)
Nov 23 08:06:57 np0005532585.localdomain podman[60913]: 2025-11-23 08:06:57.166879936 +0000 UTC m=+0.034785041 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:57 np0005532585.localdomain podman[60912]: 2025-11-23 08:06:57.262676788 +0000 UTC m=+0.127825059 container init 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_statedir_owner, build-date=2025-11-19T00:36:58Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, 
io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613.scope.
Nov 23 08:06:57 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:57 np0005532585.localdomain podman[60912]: 2025-11-23 08:06:57.1680026 +0000 UTC m=+0.033150891 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:06:57 np0005532585.localdomain podman[60912]: 2025-11-23 08:06:57.269174134 +0000 UTC m=+0.134322395 container start 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Nov 23 08:06:57 np0005532585.localdomain podman[60912]: 2025-11-23 08:06:57.26937296 +0000 UTC m=+0.134521261 container attach 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, tcib_managed=true, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step3)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d5e4e29a67d68eb763c810cf5dda69eb1a37e523f562c6e80552f33c1fd3c8b/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain podman[60914]: 2025-11-23 08:06:57.281358622 +0000 UTC m=+0.139693468 container init 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_init_log, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step3, build-date=2025-11-19T00:12:45Z)
Nov 23 08:06:57 np0005532585.localdomain sudo[60979]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:57 np0005532585.localdomain podman[60914]: 2025-11-23 08:06:57.187285933 +0000 UTC m=+0.045620799 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 23 08:06:57 np0005532585.localdomain podman[60914]: 2025-11-23 08:06:57.287372693 +0000 UTC m=+0.145707539 container start 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step3, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 23 08:06:57 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: libpod-348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Nov 23 08:06:57 np0005532585.localdomain podman[60933]: 2025-11-23 08:06:57.195339495 +0000 UTC m=+0.041495763 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 23 08:06:57 np0005532585.localdomain podman[60933]: 2025-11-23 08:06:57.304752768 +0000 UTC m=+0.150909036 container create 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=rsyslog, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 
rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: libpod-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 08:06:57 np0005532585.localdomain podman[60912]: 2025-11-23 08:06:57.321719021 +0000 UTC m=+0.186867302 container died 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_statedir_owner, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/container-config-scripts:/container-config-scripts:z']}, build-date=2025-11-19T00:36:58Z, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 23 08:06:57 np0005532585.localdomain podman[60998]: 2025-11-23 08:06:57.348747196 +0000 UTC m=+0.040132932 container died 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, container_name=ceilometer_init_log, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain podman[61002]: 2025-11-23 08:06:57.427003748 +0000 UTC m=+0.118012733 container cleanup 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_init_log, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: libpod-conmon-348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain podman[61022]: 2025-11-23 08:06:57.444669531 +0000 UTC m=+0.108048422 container cleanup 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, container_name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: libpod-conmon-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Queued start job for default target Main User Target.
Nov 23 08:06:57 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Created slice User Application Slice.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Reached target Paths.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Reached target Timers.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Starting D-Bus User Message Bus Socket...
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Starting Create User's Volatile Files and Directories...
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:06:57 np0005532585.localdomain podman[60884]: 2025-11-23 08:06:57.469093719 +0000 UTC m=+0.363057380 container init 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-collectd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git)
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Finished Create User's Volatile Files and Directories.
Nov 23 08:06:57 np0005532585.localdomain podman[60933]: 2025-11-23 08:06:57.477225234 +0000 UTC m=+0.323381482 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, container_name=rsyslog, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Listening on D-Bus User Message Bus Socket.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Reached target Sockets.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Reached target Basic System.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Reached target Main User Target.
Nov 23 08:06:57 np0005532585.localdomain systemd[61020]: Startup finished in 117ms.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started User Manager for UID 0.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started Session c1 of User root.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:06:57 np0005532585.localdomain sudo[61095]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:57 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:57 np0005532585.localdomain sudo[60979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:57 np0005532585.localdomain podman[60884]: 2025-11-23 08:06:57.49629591 +0000 UTC m=+0.390259551 container start 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started Session c2 of User root.
Nov 23 08:06:57 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4767aaabc3de112d8791c290aa2b669d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 08:06:57 np0005532585.localdomain podman[60933]: 2025-11-23 08:06:57.502669653 +0000 UTC m=+0.348825891 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=rsyslog, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:49Z, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog)
Nov 23 08:06:57 np0005532585.localdomain sudo[61095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:57 np0005532585.localdomain sudo[61108]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:57 np0005532585.localdomain sudo[61108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:57 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=7238f2997345c97f4c6ab424e622dc1b --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 23 08:06:57 np0005532585.localdomain sudo[61108]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain sudo[61095]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain sudo[60979]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain podman[61115]: 2025-11-23 08:06:57.596610839 +0000 UTC m=+0.079520602 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:06:57 np0005532585.localdomain podman[61100]: 2025-11-23 08:06:57.569562472 +0000 UTC m=+0.076323925 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:06:57 np0005532585.localdomain podman[61100]: 2025-11-23 08:06:57.698954568 +0000 UTC m=+0.205716021 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:06:57 np0005532585.localdomain podman[61100]: unhealthy
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed with result 'exit-code'.
Nov 23 08:06:57 np0005532585.localdomain podman[61146]: 2025-11-23 08:06:57.844737738 +0000 UTC m=+0.281244270 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, container_name=rsyslog, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: libpod-conmon-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:06:57 np0005532585.localdomain podman[61250]: 2025-11-23 08:06:57.878980602 +0000 UTC m=+0.099070971 container create 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1.scope.
Nov 23 08:06:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:57 np0005532585.localdomain podman[61250]: 2025-11-23 08:06:57.828811578 +0000 UTC m=+0.048901967 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:57 np0005532585.localdomain podman[61250]: 2025-11-23 08:06:57.945352316 +0000 UTC m=+0.165442655 container init 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, release=1761123044, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:57 np0005532585.localdomain podman[61250]: 2025-11-23 08:06:57.954819662 +0000 UTC m=+0.174910031 container start 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:58 np0005532585.localdomain podman[61310]: 2025-11-23 08:06:58.053577453 +0000 UTC m=+0.079314236 container create a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, container_name=nova_virtsecretd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03.scope.
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain podman[61310]: 2025-11-23 08:06:58.110683626 +0000 UTC m=+0.136420429 container init a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, vcs-type=git, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtsecretd)
Nov 23 08:06:58 np0005532585.localdomain podman[61310]: 2025-11-23 08:06:58.017617077 +0000 UTC m=+0.043353920 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:58 np0005532585.localdomain podman[61310]: 2025-11-23 08:06:58.119287206 +0000 UTC m=+0.145024009 container start a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:35:22Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:58 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:58 np0005532585.localdomain sudo[61330]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:58 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started Session c3 of User root.
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f-userdata-shm.mount: Deactivated successfully.
Nov 23 08:06:58 np0005532585.localdomain sudo[61330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:58 np0005532585.localdomain sudo[61330]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Nov 23 08:06:58 np0005532585.localdomain podman[61447]: 2025-11-23 08:06:58.5616601 +0000 UTC m=+0.086066529 container create 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:06:58 np0005532585.localdomain podman[61464]: 2025-11-23 08:06:58.604430171 +0000 UTC m=+0.085113140 container create aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope.
Nov 23 08:06:58 np0005532585.localdomain podman[61447]: 2025-11-23 08:06:58.522349813 +0000 UTC m=+0.046756222 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope.
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain podman[61464]: 2025-11-23 08:06:58.551870525 +0000 UTC m=+0.032553534 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:58 np0005532585.localdomain podman[61464]: 2025-11-23 08:06:58.690840469 +0000 UTC m=+0.171523438 container init aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:06:58 np0005532585.localdomain podman[61447]: 2025-11-23 08:06:58.696361496 +0000 UTC m=+0.220767965 container init 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:06:58 np0005532585.localdomain podman[61464]: 2025-11-23 08:06:58.700137521 +0000 UTC m=+0.180820490 container start aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 23 08:06:58 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:58 np0005532585.localdomain sudo[61494]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:06:58 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:58 np0005532585.localdomain podman[61447]: 2025-11-23 08:06:58.730214068 +0000 UTC m=+0.254620497 container start 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started Session c4 of User root.
Nov 23 08:06:58 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=67452ffc3d9e727585009ffc9989a224 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 23 08:06:58 np0005532585.localdomain sudo[61491]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:58 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:58 np0005532585.localdomain sudo[61494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: Started Session c5 of User root.
Nov 23 08:06:58 np0005532585.localdomain sudo[61491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:58 np0005532585.localdomain sudo[61494]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Nov 23 08:06:58 np0005532585.localdomain kernel: Loading iSCSI transport class v2.0-870.
Nov 23 08:06:58 np0005532585.localdomain sudo[61491]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Nov 23 08:06:58 np0005532585.localdomain podman[61501]: 2025-11-23 08:06:58.878476924 +0000 UTC m=+0.142625187 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, container_name=iscsid, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:06:58 np0005532585.localdomain podman[61501]: 2025-11-23 08:06:58.962210441 +0000 UTC m=+0.226358714 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=)
Nov 23 08:06:58 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:06:59 np0005532585.localdomain podman[61632]: 2025-11-23 08:06:59.228650224 +0000 UTC m=+0.073543491 container create 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible)
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: Started libpod-conmon-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a.scope.
Nov 23 08:06:59 np0005532585.localdomain podman[61632]: 2025-11-23 08:06:59.185665857 +0000 UTC m=+0.030559144 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain podman[61632]: 2025-11-23 08:06:59.307820385 +0000 UTC m=+0.152713652 container init 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z)
Nov 23 08:06:59 np0005532585.localdomain podman[61632]: 2025-11-23 08:06:59.315746103 +0000 UTC m=+0.160639310 container start 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtstoraged, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:06:59 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:59 np0005532585.localdomain sudo[61652]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:59 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: Started Session c6 of User root.
Nov 23 08:06:59 np0005532585.localdomain sudo[61652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:59 np0005532585.localdomain sudo[61652]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Nov 23 08:06:59 np0005532585.localdomain podman[61738]: 2025-11-23 08:06:59.708340774 +0000 UTC m=+0.071793887 container create 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtqemud, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: Started libpod-conmon-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope.
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:06:59 np0005532585.localdomain podman[61738]: 2025-11-23 08:06:59.666078489 +0000 UTC m=+0.029531682 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:06:59 np0005532585.localdomain podman[61738]: 2025-11-23 08:06:59.772362177 +0000 UTC m=+0.135815290 container init 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, container_name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:06:59 np0005532585.localdomain podman[61738]: 2025-11-23 08:06:59.78173316 +0000 UTC m=+0.145186273 container start 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_virtqemud, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:06:59 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:06:59 np0005532585.localdomain sudo[61757]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:06:59 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: Started Session c7 of User root.
Nov 23 08:06:59 np0005532585.localdomain sudo[61757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:06:59 np0005532585.localdomain sudo[61757]: pam_unix(sudo:session): session closed for user root
Nov 23 08:06:59 np0005532585.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Nov 23 08:07:00 np0005532585.localdomain podman[61842]: 2025-11-23 08:07:00.142846151 +0000 UTC m=+0.082082079 container create 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtproxyd, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 08:07:00 np0005532585.localdomain systemd[1]: Started libpod-conmon-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08.scope.
Nov 23 08:07:00 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:07:00 np0005532585.localdomain podman[61842]: 2025-11-23 08:07:00.103509614 +0000 UTC m=+0.042745502 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:00 np0005532585.localdomain podman[61842]: 2025-11-23 08:07:00.213422622 +0000 UTC m=+0.152658500 container init 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, container_name=nova_virtproxyd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible)
Nov 23 08:07:00 np0005532585.localdomain podman[61842]: 2025-11-23 08:07:00.22427988 +0000 UTC m=+0.163515758 container start 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:35:22Z, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:07:00 np0005532585.localdomain python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:07:00 np0005532585.localdomain sudo[61861]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:07:00 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:07:00 np0005532585.localdomain systemd[1]: Started Session c8 of User root.
Nov 23 08:07:00 np0005532585.localdomain sudo[61861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:07:00 np0005532585.localdomain sudo[61861]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:00 np0005532585.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Nov 23 08:07:00 np0005532585.localdomain sudo[60726]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:00 np0005532585.localdomain sudo[61919]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bimwzvuiuqtcuexlompoejdrwzbhzpnr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:00 np0005532585.localdomain sudo[61919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:00 np0005532585.localdomain python3[61921]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:00 np0005532585.localdomain sudo[61919]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:00 np0005532585.localdomain sudo[61935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klcdiilmznnsntbidyblnlorcyrnqvbn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:00 np0005532585.localdomain sudo[61935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:01 np0005532585.localdomain python3[61937]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:01 np0005532585.localdomain sudo[61935]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:01 np0005532585.localdomain sudo[61951]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppaiwxifqqidbxmlwwyimrfxqkchfutz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:01 np0005532585.localdomain sudo[61951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:01 np0005532585.localdomain python3[61953]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:01 np0005532585.localdomain sudo[61951]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:01 np0005532585.localdomain sudo[61967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsqpychbpzahyomfbuoiziyytbdcfvni ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:01 np0005532585.localdomain sudo[61967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:01 np0005532585.localdomain python3[61969]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:01 np0005532585.localdomain sudo[61967]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:01 np0005532585.localdomain sudo[61983]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwiipyommpqthrttcabicpucphctldlk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:01 np0005532585.localdomain sudo[61983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:01 np0005532585.localdomain python3[61985]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:01 np0005532585.localdomain sudo[61983]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:01 np0005532585.localdomain sudo[61999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlflufwbjzzowmdpjpqbjgtycdcjfnkz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:01 np0005532585.localdomain sudo[61999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:02 np0005532585.localdomain python3[62001]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:02 np0005532585.localdomain sudo[61999]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:02 np0005532585.localdomain sudo[62015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axanrqwvlnlfxwqwlgtgkouziucbosfx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:02 np0005532585.localdomain sudo[62015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:02 np0005532585.localdomain python3[62017]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:02 np0005532585.localdomain sudo[62015]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:02 np0005532585.localdomain sudo[62031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgkdjzlwulcmexqzjpmsmarssdnugtmf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:02 np0005532585.localdomain sudo[62031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:02 np0005532585.localdomain python3[62033]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:02 np0005532585.localdomain sudo[62031]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:02 np0005532585.localdomain sudo[62047]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvtzyiobsaknasfswnyjlgriaubffczt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:02 np0005532585.localdomain sudo[62047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:02 np0005532585.localdomain python3[62049]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:02 np0005532585.localdomain sudo[62047]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:03 np0005532585.localdomain sudo[62063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijklbfjcubmrmgbrouycaiiqzfrohnlj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:03 np0005532585.localdomain sudo[62063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:03 np0005532585.localdomain python3[62065]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:03 np0005532585.localdomain sudo[62063]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:03 np0005532585.localdomain sudo[62079]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwvsmgdrjfbuabsnufqrbbryvefkinfp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:03 np0005532585.localdomain sudo[62079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:03 np0005532585.localdomain python3[62081]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:03 np0005532585.localdomain sudo[62079]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:03 np0005532585.localdomain sudo[62095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haktzhtxbhbbuasalltqnplrwonnxzda ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:03 np0005532585.localdomain sudo[62095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:03 np0005532585.localdomain python3[62097]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:03 np0005532585.localdomain sudo[62095]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:03 np0005532585.localdomain sudo[62111]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydrpppamsfnfytagkeabjkoxznoovord ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:03 np0005532585.localdomain sudo[62111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:03 np0005532585.localdomain python3[62113]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:03 np0005532585.localdomain sudo[62111]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:04 np0005532585.localdomain sudo[62127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjdzjbkrvegppyqybctshloeuvyynzig ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:04 np0005532585.localdomain sudo[62127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:04 np0005532585.localdomain python3[62129]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:04 np0005532585.localdomain sudo[62127]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:04 np0005532585.localdomain sudo[62143]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wultkcrdpxookkabnliadcjedhdrygby ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:04 np0005532585.localdomain sudo[62143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:07:04 np0005532585.localdomain systemd[1]: tmp-crun.80Ricb.mount: Deactivated successfully.
Nov 23 08:07:04 np0005532585.localdomain podman[62146]: 2025-11-23 08:07:04.397515806 +0000 UTC m=+0.091024199 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 23 08:07:04 np0005532585.localdomain python3[62145]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:04 np0005532585.localdomain sudo[62143]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:04 np0005532585.localdomain sudo[62188]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfbbateelpvgwrrgmvrqhcgohbguqmim ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:04 np0005532585.localdomain sudo[62188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:04 np0005532585.localdomain podman[62146]: 2025-11-23 08:07:04.584783559 +0000 UTC m=+0.278291902 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git)
Nov 23 08:07:04 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:07:04 np0005532585.localdomain python3[62190]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:04 np0005532585.localdomain sudo[62188]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:04 np0005532585.localdomain sudo[62205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twuwphtmxoopnhnxwxmffnsfyqdinmhk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:04 np0005532585.localdomain sudo[62205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:04 np0005532585.localdomain python3[62207]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:04 np0005532585.localdomain sudo[62205]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:05 np0005532585.localdomain sudo[62221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylcsmrnrxgpujbrxetssfcimlbpiqnhq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:05 np0005532585.localdomain sudo[62221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:05 np0005532585.localdomain python3[62223]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:07:05 np0005532585.localdomain sudo[62221]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:05 np0005532585.localdomain sudo[62282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-augimtlwzrruiyqqyqjhhhkehoanrcqy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:05 np0005532585.localdomain sudo[62282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:05 np0005532585.localdomain python3[62284]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:05 np0005532585.localdomain sudo[62282]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:06 np0005532585.localdomain sudo[62311]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lthvtowvncsgmnpxsnrymkpcbolidots ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:06 np0005532585.localdomain sudo[62311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:06 np0005532585.localdomain python3[62313]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:06 np0005532585.localdomain sudo[62311]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:06 np0005532585.localdomain sudo[62340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhigzqnlkginehxibiqqohsumqmcpwnm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:06 np0005532585.localdomain sudo[62340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:06 np0005532585.localdomain python3[62342]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:06 np0005532585.localdomain sudo[62340]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:07 np0005532585.localdomain sudo[62369]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhjvslyjxoqcionafwxqfcvpzehtvcnd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:07 np0005532585.localdomain sudo[62369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:07 np0005532585.localdomain python3[62371]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:07 np0005532585.localdomain sudo[62369]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:07 np0005532585.localdomain sudo[62398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aondnqakjcikqhstmttulvkxomnzejsb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:07 np0005532585.localdomain sudo[62398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:07 np0005532585.localdomain python3[62400]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:07 np0005532585.localdomain sudo[62398]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:08 np0005532585.localdomain sudo[62427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahemehptqcuvryfrnobgojujrkiqrhuj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:08 np0005532585.localdomain sudo[62427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:08 np0005532585.localdomain python3[62429]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:08 np0005532585.localdomain sudo[62427]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:08 np0005532585.localdomain sudo[62456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrobrmspvudbffpkkohzmddphvsdaxxi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:08 np0005532585.localdomain sudo[62456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:08 np0005532585.localdomain python3[62458]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:08 np0005532585.localdomain sudo[62456]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:09 np0005532585.localdomain sudo[62485]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdtprztofplffouastetvbsgszljgopd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:09 np0005532585.localdomain sudo[62485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:09 np0005532585.localdomain python3[62487]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:09 np0005532585.localdomain sudo[62485]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:09 np0005532585.localdomain sudo[62514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obmobnmcjljhahcxdhyqkfohoemtjtsw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:09 np0005532585.localdomain sudo[62514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:09 np0005532585.localdomain python3[62516]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:09 np0005532585.localdomain sudo[62514]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:09 np0005532585.localdomain sudo[62530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhjheylspvsqnyabnhsnhybhaamxdbwl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:09 np0005532585.localdomain sudo[62530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:10 np0005532585.localdomain python3[62532]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:10 np0005532585.localdomain systemd-rc-local-generator[62557]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:10 np0005532585.localdomain systemd-sysv-generator[62562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Activating special unit Exit the Session...
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped target Main User Target.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped target Basic System.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped target Paths.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped target Sockets.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped target Timers.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Closed D-Bus User Message Bus Socket.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Stopped Create User's Volatile Files and Directories.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Removed slice User Application Slice.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Reached target Shutdown.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Finished Exit the Session.
Nov 23 08:07:10 np0005532585.localdomain systemd[61020]: Reached target Exit the Session.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 08:07:10 np0005532585.localdomain sudo[62530]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 08:07:10 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 23 08:07:10 np0005532585.localdomain sudo[62583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yofbqaiffnecdeuwxojgdyhqtwlmozxp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:10 np0005532585.localdomain sudo[62583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:10 np0005532585.localdomain python3[62585]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:11 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:11 np0005532585.localdomain systemd-sysv-generator[62613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:11 np0005532585.localdomain systemd-rc-local-generator[62609]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:11 np0005532585.localdomain systemd[1]: Starting collectd container...
Nov 23 08:07:11 np0005532585.localdomain systemd[1]: Started collectd container.
Nov 23 08:07:11 np0005532585.localdomain sudo[62583]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:11 np0005532585.localdomain sudo[62649]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bktmlhowfjdeczornyoccicdlvhvurxp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:11 np0005532585.localdomain sudo[62649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:11 np0005532585.localdomain python3[62651]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:13 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:13 np0005532585.localdomain systemd-sysv-generator[62679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:13 np0005532585.localdomain systemd-rc-local-generator[62676]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:13 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:13 np0005532585.localdomain systemd[1]: Starting iscsid container...
Nov 23 08:07:13 np0005532585.localdomain systemd[1]: Started iscsid container.
Nov 23 08:07:13 np0005532585.localdomain sudo[62649]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:13 np0005532585.localdomain sudo[62717]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqtlxfdpjqrewiwzcrklmblpvaxtazzr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:13 np0005532585.localdomain sudo[62717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:13 np0005532585.localdomain python3[62719]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:13 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:14 np0005532585.localdomain systemd-rc-local-generator[62745]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:14 np0005532585.localdomain systemd-sysv-generator[62750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:14 np0005532585.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Nov 23 08:07:14 np0005532585.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Nov 23 08:07:14 np0005532585.localdomain sudo[62717]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:14 np0005532585.localdomain sudo[62784]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmcfiygrwywruahmirzyklsmolznswfc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:14 np0005532585.localdomain sudo[62784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:15 np0005532585.localdomain python3[62786]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:15 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:15 np0005532585.localdomain systemd-sysv-generator[62814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:15 np0005532585.localdomain systemd-rc-local-generator[62810]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:15 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:15 np0005532585.localdomain systemd[1]: Starting nova_virtnodedevd container...
Nov 23 08:07:15 np0005532585.localdomain tripleo-start-podman-container[62826]: Creating additional drop-in dependency for "nova_virtnodedevd" (aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd)
Nov 23 08:07:15 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:15 np0005532585.localdomain systemd-sysv-generator[62886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:15 np0005532585.localdomain systemd-rc-local-generator[62883]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:15 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:15 np0005532585.localdomain systemd[1]: Started nova_virtnodedevd container.
Nov 23 08:07:15 np0005532585.localdomain sudo[62784]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:16 np0005532585.localdomain sudo[62908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vibwuaxciqemrgjicyjgihbjkvsfgxgo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:16 np0005532585.localdomain sudo[62908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:16 np0005532585.localdomain python3[62910]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:16 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:16 np0005532585.localdomain systemd-rc-local-generator[62937]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:16 np0005532585.localdomain systemd-sysv-generator[62942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:16 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:16 np0005532585.localdomain systemd[1]: Starting nova_virtproxyd container...
Nov 23 08:07:16 np0005532585.localdomain tripleo-start-podman-container[62950]: Creating additional drop-in dependency for "nova_virtproxyd" (108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08)
Nov 23 08:07:16 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:17 np0005532585.localdomain systemd-sysv-generator[63009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:17 np0005532585.localdomain systemd-rc-local-generator[63004]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:17 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:17 np0005532585.localdomain systemd[1]: Started nova_virtproxyd container.
Nov 23 08:07:17 np0005532585.localdomain sudo[62908]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:17 np0005532585.localdomain sudo[63031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnpirwisrcznmyhvcebdleznxlyiucnv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:17 np0005532585.localdomain sudo[63031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:17 np0005532585.localdomain python3[63033]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:17 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:17 np0005532585.localdomain systemd-rc-local-generator[63057]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:17 np0005532585.localdomain systemd-sysv-generator[63062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:18 np0005532585.localdomain systemd[1]: Starting nova_virtqemud container...
Nov 23 08:07:18 np0005532585.localdomain tripleo-start-podman-container[63072]: Creating additional drop-in dependency for "nova_virtqemud" (80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8)
Nov 23 08:07:18 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:18 np0005532585.localdomain systemd-sysv-generator[63133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:18 np0005532585.localdomain systemd-rc-local-generator[63127]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:18 np0005532585.localdomain systemd[1]: Started nova_virtqemud container.
Nov 23 08:07:18 np0005532585.localdomain sudo[63031]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:18 np0005532585.localdomain sudo[63152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvdopeiukzwvsiibtkxfwxunziaavfns ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:18 np0005532585.localdomain sudo[63152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:19 np0005532585.localdomain python3[63154]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:19 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:19 np0005532585.localdomain systemd-sysv-generator[63185]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:19 np0005532585.localdomain systemd-rc-local-generator[63180]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:19 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:19 np0005532585.localdomain systemd[1]: Starting nova_virtsecretd container...
Nov 23 08:07:20 np0005532585.localdomain tripleo-start-podman-container[63193]: Creating additional drop-in dependency for "nova_virtsecretd" (a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03)
Nov 23 08:07:20 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:20 np0005532585.localdomain systemd-sysv-generator[63250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:20 np0005532585.localdomain systemd-rc-local-generator[63247]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:20 np0005532585.localdomain systemd[1]: Started nova_virtsecretd container.
Nov 23 08:07:20 np0005532585.localdomain sudo[63152]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:20 np0005532585.localdomain sudo[63275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciufdoagxqcqbaouwemopsvbtkdioozp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:20 np0005532585.localdomain sudo[63275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:20 np0005532585.localdomain python3[63277]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:21 np0005532585.localdomain systemd-rc-local-generator[63304]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:21 np0005532585.localdomain systemd-sysv-generator[63308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:21 np0005532585.localdomain systemd[1]: Starting nova_virtstoraged container...
Nov 23 08:07:21 np0005532585.localdomain tripleo-start-podman-container[63317]: Creating additional drop-in dependency for "nova_virtstoraged" (33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a)
Nov 23 08:07:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:21 np0005532585.localdomain systemd-rc-local-generator[63373]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:21 np0005532585.localdomain systemd-sysv-generator[63377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:21 np0005532585.localdomain systemd[1]: Started nova_virtstoraged container.
Nov 23 08:07:21 np0005532585.localdomain sudo[63275]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:22 np0005532585.localdomain sudo[63400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yizumvxtcervohudikqzomygqvoqfjro ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:07:22 np0005532585.localdomain sudo[63400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:22 np0005532585.localdomain python3[63402]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:07:22 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:07:22 np0005532585.localdomain systemd-rc-local-generator[63428]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:07:22 np0005532585.localdomain systemd-sysv-generator[63433]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:07:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:07:22 np0005532585.localdomain systemd[1]: Starting rsyslog container...
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:07:23 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:23 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:23 np0005532585.localdomain podman[63442]: 2025-11-23 08:07:23.037486887 +0000 UTC m=+0.125305334 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:49Z, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, container_name=rsyslog, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 23 08:07:23 np0005532585.localdomain podman[63442]: 2025-11-23 08:07:23.049700496 +0000 UTC m=+0.137518913 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64)
Nov 23 08:07:23 np0005532585.localdomain podman[63442]: rsyslog
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: Started rsyslog container.
Nov 23 08:07:23 np0005532585.localdomain sudo[63460]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:07:23 np0005532585.localdomain sudo[63460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:07:23 np0005532585.localdomain sudo[63400]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:23 np0005532585.localdomain sudo[63460]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:07:23 np0005532585.localdomain podman[63477]: 2025-11-23 08:07:23.218400078 +0000 UTC m=+0.054188568 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Nov 23 08:07:23 np0005532585.localdomain podman[63477]: 2025-11-23 08:07:23.244864347 +0000 UTC m=+0.080652796 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3)
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:07:23 np0005532585.localdomain podman[63490]: 2025-11-23 08:07:23.336248026 +0000 UTC m=+0.063147568 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 23 08:07:23 np0005532585.localdomain podman[63490]: rsyslog
Nov 23 08:07:23 np0005532585.localdomain sudo[63512]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlkmiyixlqtdxmadpbhsbmlgwjrohlcp ; /usr/bin/python3
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 23 08:07:23 np0005532585.localdomain sudo[63512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:23 np0005532585.localdomain python3[63516]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:23 np0005532585.localdomain sudo[63512]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: Stopped rsyslog container.
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: Starting rsyslog container...
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:07:23 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:23 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:23 np0005532585.localdomain podman[63517]: 2025-11-23 08:07:23.810543843 +0000 UTC m=+0.116479367 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, container_name=rsyslog, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true)
Nov 23 08:07:23 np0005532585.localdomain podman[63517]: 2025-11-23 08:07:23.820588067 +0000 UTC m=+0.126523551 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:07:23 np0005532585.localdomain podman[63517]: rsyslog
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: Started rsyslog container.
Nov 23 08:07:23 np0005532585.localdomain sudo[63537]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:07:23 np0005532585.localdomain sudo[63537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:07:23 np0005532585.localdomain sudo[63537]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:07:23 np0005532585.localdomain podman[63572]: 2025-11-23 08:07:23.969595504 +0000 UTC m=+0.036232164 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z)
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a-userdata-shm.mount: Deactivated successfully.
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully.
Nov 23 08:07:23 np0005532585.localdomain podman[63572]: 2025-11-23 08:07:23.994559408 +0000 UTC m=+0.061196068 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:07:23 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:07:24 np0005532585.localdomain sudo[63596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxebutmwduvswpudzcbfkxxaheylzsjx ; /usr/bin/python3
Nov 23 08:07:24 np0005532585.localdomain sudo[63596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:24 np0005532585.localdomain podman[63599]: 2025-11-23 08:07:24.081272456 +0000 UTC m=+0.059899780 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 23 08:07:24 np0005532585.localdomain podman[63599]: rsyslog
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 23 08:07:24 np0005532585.localdomain sudo[63596]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:24 np0005532585.localdomain sudo[63653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcayhfccoishiueekpqzegxamztetkto ; /usr/bin/python3
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: Stopped rsyslog container.
Nov 23 08:07:24 np0005532585.localdomain sudo[63653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: Starting rsyslog container...
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:07:24 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:24 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:24 np0005532585.localdomain podman[63655]: 2025-11-23 08:07:24.529505646 +0000 UTC m=+0.119716365 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack 
TripleO Team, tcib_managed=true, distribution-scope=public, container_name=rsyslog, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-rsyslog)
Nov 23 08:07:24 np0005532585.localdomain podman[63655]: 2025-11-23 08:07:24.539172468 +0000 UTC m=+0.129383197 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, name=rhosp17/openstack-rsyslog, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 23 08:07:24 np0005532585.localdomain podman[63655]: rsyslog
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: Started rsyslog container.
Nov 23 08:07:24 np0005532585.localdomain sudo[63673]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:07:24 np0005532585.localdomain sudo[63673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:07:24 np0005532585.localdomain sudo[63653]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:24 np0005532585.localdomain sudo[63673]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:07:24 np0005532585.localdomain podman[63690]: 2025-11-23 08:07:24.705100816 +0000 UTC m=+0.040910635 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, architecture=x86_64, release=1761123044)
Nov 23 08:07:24 np0005532585.localdomain podman[63690]: 2025-11-23 08:07:24.730656949 +0000 UTC m=+0.066466728 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:07:24 np0005532585.localdomain sudo[63722]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phvzcywqfkcrsmzomfwdtezubtwvohav ; /usr/bin/python3
Nov 23 08:07:24 np0005532585.localdomain sudo[63722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:24 np0005532585.localdomain podman[63707]: 2025-11-23 08:07:24.824064778 +0000 UTC m=+0.059491247 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:07:24 np0005532585.localdomain podman[63707]: rsyslog
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully.
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a-userdata-shm.mount: Deactivated successfully.
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: Stopped rsyslog container.
Nov 23 08:07:24 np0005532585.localdomain python3[63729]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005532585 step=3 update_config_hash_only=False
Nov 23 08:07:24 np0005532585.localdomain systemd[1]: Starting rsyslog container...
Nov 23 08:07:25 np0005532585.localdomain sudo[63722]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:07:25 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:25 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:25 np0005532585.localdomain podman[63732]: 2025-11-23 08:07:25.12028107 +0000 UTC m=+0.113617561 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=)
Nov 23 08:07:25 np0005532585.localdomain podman[63732]: 2025-11-23 08:07:25.129769166 +0000 UTC m=+0.123105657 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3)
Nov 23 08:07:25 np0005532585.localdomain podman[63732]: rsyslog
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: Started rsyslog container.
Nov 23 08:07:25 np0005532585.localdomain sudo[63751]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:07:25 np0005532585.localdomain sudo[63751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:07:25 np0005532585.localdomain sudo[63751]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:07:25 np0005532585.localdomain podman[63754]: 2025-11-23 08:07:25.280393173 +0000 UTC m=+0.043606658 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.openshift.expose-services=, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible)
Nov 23 08:07:25 np0005532585.localdomain podman[63754]: 2025-11-23 08:07:25.2965135 +0000 UTC m=+0.059726965 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, release=1761123044, vendor=Red Hat, Inc., container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:07:25 np0005532585.localdomain podman[63767]: 2025-11-23 08:07:25.376999159 +0000 UTC m=+0.045068431 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-rsyslog, release=1761123044, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, container_name=rsyslog, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true)
Nov 23 08:07:25 np0005532585.localdomain podman[63767]: rsyslog
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 23 08:07:25 np0005532585.localdomain sudo[63791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgkumnddhxjvidhhgogfwyywpavgsuxj ; /usr/bin/python3
Nov 23 08:07:25 np0005532585.localdomain sudo[63791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:25 np0005532585.localdomain python3[63793]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:07:25 np0005532585.localdomain sudo[63791]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: Stopped rsyslog container.
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: Starting rsyslog container...
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:07:25 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:25 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 23 08:07:25 np0005532585.localdomain sudo[63824]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpkljnvzilhkgevkbwqmjvcncanvnibw ; /usr/bin/python3
Nov 23 08:07:25 np0005532585.localdomain sudo[63824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:07:25 np0005532585.localdomain podman[63794]: 2025-11-23 08:07:25.820860658 +0000 UTC m=+0.122430467 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, container_name=rsyslog)
Nov 23 08:07:25 np0005532585.localdomain podman[63794]: 2025-11-23 08:07:25.828019134 +0000 UTC m=+0.129588943 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:07:25 np0005532585.localdomain podman[63794]: rsyslog
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: Started rsyslog container.
Nov 23 08:07:25 np0005532585.localdomain sudo[63829]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:07:25 np0005532585.localdomain sudo[63829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:07:25 np0005532585.localdomain sudo[63829]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:25 np0005532585.localdomain systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully.
Nov 23 08:07:25 np0005532585.localdomain podman[63832]: 2025-11-23 08:07:25.982741725 +0000 UTC m=+0.043215476 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, container_name=rsyslog, vcs-type=git, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 23 08:07:25 np0005532585.localdomain python3[63826]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a-userdata-shm.mount: Deactivated successfully.
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully.
Nov 23 08:07:26 np0005532585.localdomain podman[63832]: 2025-11-23 08:07:26.006465911 +0000 UTC m=+0.066939632 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, container_name=rsyslog, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git)
Nov 23 08:07:26 np0005532585.localdomain sudo[63824]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:07:26 np0005532585.localdomain podman[63845]: 2025-11-23 08:07:26.094403885 +0000 UTC m=+0.056166046 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
vcs-type=git, release=1761123044, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 23 08:07:26 np0005532585.localdomain podman[63845]: rsyslog
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: Stopped rsyslog container.
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 23 08:07:26 np0005532585.localdomain systemd[1]: Failed to start rsyslog container.
Nov 23 08:07:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:07:28 np0005532585.localdomain systemd[1]: tmp-crun.uwOAks.mount: Deactivated successfully.
Nov 23 08:07:28 np0005532585.localdomain podman[63858]: 2025-11-23 08:07:28.032028276 +0000 UTC m=+0.090182233 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:07:28 np0005532585.localdomain podman[63858]: 2025-11-23 08:07:28.070321703 +0000 UTC m=+0.128475660 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, container_name=collectd)
Nov 23 08:07:28 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:07:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:07:30 np0005532585.localdomain podman[63880]: 2025-11-23 08:07:30.017339286 +0000 UTC m=+0.074993905 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:07:30 np0005532585.localdomain podman[63880]: 2025-11-23 08:07:30.029305347 +0000 UTC m=+0.086959936 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z)
Nov 23 08:07:30 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:07:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:07:35 np0005532585.localdomain systemd[1]: tmp-crun.KlA56n.mount: Deactivated successfully.
Nov 23 08:07:35 np0005532585.localdomain podman[63899]: 2025-11-23 08:07:35.026190677 +0000 UTC m=+0.083985146 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Nov 23 08:07:35 np0005532585.localdomain podman[63899]: 2025-11-23 08:07:35.216953565 +0000 UTC m=+0.274748044 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:07:35 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:07:40 np0005532585.localdomain sudo[63928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:07:40 np0005532585.localdomain sudo[63928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:07:40 np0005532585.localdomain sudo[63928]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:41 np0005532585.localdomain sudo[63943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:07:41 np0005532585.localdomain sudo[63943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:07:41 np0005532585.localdomain sudo[63943]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:42 np0005532585.localdomain sudo[63991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:07:42 np0005532585.localdomain sudo[63991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:07:42 np0005532585.localdomain sudo[63991]: pam_unix(sudo:session): session closed for user root
Nov 23 08:07:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:07:59 np0005532585.localdomain podman[64006]: 2025-11-23 08:07:59.030566791 +0000 UTC m=+0.085159941 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, managed_by=tripleo_ansible)
Nov 23 08:07:59 np0005532585.localdomain podman[64006]: 2025-11-23 08:07:59.042141231 +0000 UTC m=+0.096734311 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=1761123044, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-type=git)
Nov 23 08:07:59 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:08:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:08:01 np0005532585.localdomain systemd[1]: tmp-crun.DjTdEN.mount: Deactivated successfully.
Nov 23 08:08:01 np0005532585.localdomain podman[64023]: 2025-11-23 08:08:01.024381399 +0000 UTC m=+0.077992196 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, url=https://www.redhat.com)
Nov 23 08:08:01 np0005532585.localdomain podman[64023]: 2025-11-23 08:08:01.057125477 +0000 UTC m=+0.110736244 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, 
name=rhosp17/openstack-iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:08:01 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:08:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:08:06 np0005532585.localdomain podman[64044]: 2025-11-23 08:08:06.018342729 +0000 UTC m=+0.075859490 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:08:06 np0005532585.localdomain podman[64044]: 2025-11-23 08:08:06.204236521 +0000 UTC m=+0.261753362 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, 
com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:08:06 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:08:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:08:30 np0005532585.localdomain podman[64072]: 2025-11-23 08:08:30.021406727 +0000 UTC m=+0.079639395 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:08:30 np0005532585.localdomain podman[64072]: 2025-11-23 08:08:30.057574399 +0000 UTC m=+0.115807057 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:08:30 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:08:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:08:32 np0005532585.localdomain podman[64092]: 2025-11-23 08:08:32.014489622 +0000 UTC m=+0.077235243 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, distribution-scope=public)
Nov 23 08:08:32 np0005532585.localdomain podman[64092]: 2025-11-23 08:08:32.026323269 +0000 UTC m=+0.089068890 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1)
Nov 23 08:08:32 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:08:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:08:37 np0005532585.localdomain podman[64111]: 2025-11-23 08:08:37.020755124 +0000 UTC m=+0.077218483 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
architecture=x86_64, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z)
Nov 23 08:08:37 np0005532585.localdomain podman[64111]: 2025-11-23 08:08:37.219283848 +0000 UTC m=+0.275747146 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 23 08:08:37 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:08:42 np0005532585.localdomain sudo[64141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:08:42 np0005532585.localdomain sudo[64141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:08:42 np0005532585.localdomain sudo[64141]: pam_unix(sudo:session): session closed for user root
Nov 23 08:08:42 np0005532585.localdomain sudo[64156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:08:42 np0005532585.localdomain sudo[64156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:08:43 np0005532585.localdomain sudo[64156]: pam_unix(sudo:session): session closed for user root
Nov 23 08:08:43 np0005532585.localdomain sudo[64202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:08:43 np0005532585.localdomain sudo[64202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:08:43 np0005532585.localdomain sudo[64202]: pam_unix(sudo:session): session closed for user root
Nov 23 08:09:00 np0005532585.localdomain sshd[64217]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:09:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:09:01 np0005532585.localdomain podman[64218]: 2025-11-23 08:09:01.042143971 +0000 UTC m=+0.093225325 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Nov 23 08:09:01 np0005532585.localdomain podman[64218]: 2025-11-23 08:09:01.049546525 +0000 UTC m=+0.100627869 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 
17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, release=1761123044)
Nov 23 08:09:01 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:09:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:09:03 np0005532585.localdomain systemd[1]: tmp-crun.63uZ70.mount: Deactivated successfully.
Nov 23 08:09:03 np0005532585.localdomain podman[64240]: 2025-11-23 08:09:03.015588654 +0000 UTC m=+0.073839920 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4)
Nov 23 08:09:03 np0005532585.localdomain podman[64240]: 2025-11-23 08:09:03.054393285 +0000 UTC m=+0.112644521 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:09:03 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:09:03 np0005532585.localdomain sshd[64217]: Invalid user debian from 114.247.207.98 port 47886
Nov 23 08:09:04 np0005532585.localdomain sshd[64217]: Connection closed by invalid user debian 114.247.207.98 port 47886 [preauth]
Nov 23 08:09:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:09:08 np0005532585.localdomain podman[64260]: 2025-11-23 08:09:08.028352822 +0000 UTC m=+0.085559423 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:09:08 np0005532585.localdomain podman[64260]: 2025-11-23 08:09:08.223566705 +0000 UTC m=+0.280773296 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:09:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:09:23 np0005532585.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=100.29.192.117 DST=38.102.83.198 LEN=44 TOS=0x00 PREC=0x00 TTL=243 ID=54321 PROTO=TCP SPT=19431 DPT=9090 SEQ=0 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B4) 
Nov 23 08:09:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:09:32 np0005532585.localdomain systemd[1]: tmp-crun.ikiX8n.mount: Deactivated successfully.
Nov 23 08:09:32 np0005532585.localdomain podman[64289]: 2025-11-23 08:09:32.036961598 +0000 UTC m=+0.092239385 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Nov 23 08:09:32 np0005532585.localdomain podman[64289]: 2025-11-23 08:09:32.046348361 +0000 UTC m=+0.101626158 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, distribution-scope=public, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 23 08:09:32 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:09:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:09:34 np0005532585.localdomain podman[64309]: 2025-11-23 08:09:34.02567698 +0000 UTC m=+0.079828930 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:09:34 np0005532585.localdomain podman[64309]: 2025-11-23 08:09:34.03459856 +0000 UTC m=+0.088750540 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:09:34 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:09:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:09:39 np0005532585.localdomain podman[64329]: 2025-11-23 08:09:39.032844055 +0000 UTC m=+0.090199360 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:09:39 np0005532585.localdomain podman[64329]: 2025-11-23 08:09:39.222155484 +0000 UTC m=+0.279510739 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 23 08:09:39 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:09:43 np0005532585.localdomain sudo[64356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:09:43 np0005532585.localdomain sudo[64356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:09:43 np0005532585.localdomain sudo[64356]: pam_unix(sudo:session): session closed for user root
Nov 23 08:09:44 np0005532585.localdomain sudo[64371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:09:44 np0005532585.localdomain sudo[64371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:09:44 np0005532585.localdomain sudo[64371]: pam_unix(sudo:session): session closed for user root
Nov 23 08:09:45 np0005532585.localdomain sudo[64418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:09:45 np0005532585.localdomain sudo[64418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:09:45 np0005532585.localdomain sudo[64418]: pam_unix(sudo:session): session closed for user root
Nov 23 08:10:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:10:03 np0005532585.localdomain systemd[1]: tmp-crun.U8eznj.mount: Deactivated successfully.
Nov 23 08:10:03 np0005532585.localdomain podman[64433]: 2025-11-23 08:10:03.029421493 +0000 UTC m=+0.090405076 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true)
Nov 23 08:10:03 np0005532585.localdomain podman[64433]: 2025-11-23 08:10:03.035814828 +0000 UTC m=+0.096798411 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, release=1761123044)
Nov 23 08:10:03 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:10:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:10:05 np0005532585.localdomain systemd[1]: tmp-crun.NUhv9U.mount: Deactivated successfully.
Nov 23 08:10:05 np0005532585.localdomain podman[64453]: 2025-11-23 08:10:05.012540074 +0000 UTC m=+0.070179879 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-iscsid, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:10:05 np0005532585.localdomain podman[64453]: 2025-11-23 08:10:05.021614121 +0000 UTC m=+0.079253996 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:10:05 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:10:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:10:10 np0005532585.localdomain podman[64473]: 2025-11-23 08:10:10.019021489 +0000 UTC m=+0.077239825 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:10:10 np0005532585.localdomain podman[64473]: 2025-11-23 08:10:10.23104757 +0000 UTC m=+0.289265866 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Nov 23 08:10:10 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:10:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:10:34 np0005532585.localdomain systemd[1]: tmp-crun.lqka5K.mount: Deactivated successfully.
Nov 23 08:10:34 np0005532585.localdomain podman[64501]: 2025-11-23 08:10:34.022636871 +0000 UTC m=+0.084193867 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:10:34 np0005532585.localdomain podman[64501]: 2025-11-23 08:10:34.032114101 +0000 UTC m=+0.093671017 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, container_name=collectd, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:10:34 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:10:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:10:36 np0005532585.localdomain podman[64521]: 2025-11-23 08:10:36.018850851 +0000 UTC m=+0.076905785 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T23:44:13Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Nov 23 08:10:36 np0005532585.localdomain podman[64521]: 2025-11-23 08:10:36.026562217 +0000 UTC m=+0.084617121 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, release=1761123044, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:10:36 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:10:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:10:41 np0005532585.localdomain systemd[1]: tmp-crun.rglPxe.mount: Deactivated successfully.
Nov 23 08:10:41 np0005532585.localdomain podman[64540]: 2025-11-23 08:10:41.015866705 +0000 UTC m=+0.077319438 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:10:41 np0005532585.localdomain podman[64540]: 2025-11-23 08:10:41.236417216 +0000 UTC m=+0.297869969 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, konflux.additional-tags=17.1.12 
17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:10:41 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:10:45 np0005532585.localdomain sudo[64569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:10:45 np0005532585.localdomain sudo[64569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:10:45 np0005532585.localdomain sudo[64569]: pam_unix(sudo:session): session closed for user root
Nov 23 08:10:45 np0005532585.localdomain sudo[64584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:10:45 np0005532585.localdomain sudo[64584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:10:46 np0005532585.localdomain sudo[64584]: pam_unix(sudo:session): session closed for user root
Nov 23 08:10:46 np0005532585.localdomain sudo[64631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:10:46 np0005532585.localdomain sudo[64631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:10:46 np0005532585.localdomain sudo[64631]: pam_unix(sudo:session): session closed for user root
Nov 23 08:10:46 np0005532585.localdomain sudo[64646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 08:10:46 np0005532585.localdomain sudo[64646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:10:46 np0005532585.localdomain sudo[64646]: pam_unix(sudo:session): session closed for user root
Nov 23 08:10:51 np0005532585.localdomain sudo[64681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:10:51 np0005532585.localdomain sudo[64681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:10:51 np0005532585.localdomain sudo[64681]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:04 np0005532585.localdomain sshd[64697]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:11:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:11:05 np0005532585.localdomain podman[64698]: 2025-11-23 08:11:05.032166343 +0000 UTC m=+0.082595848 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:11:05 np0005532585.localdomain podman[64698]: 2025-11-23 08:11:05.072381399 +0000 UTC m=+0.122810894 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step3)
Nov 23 08:11:05 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:11:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:11:07 np0005532585.localdomain podman[64719]: 2025-11-23 08:11:07.025513694 +0000 UTC m=+0.078706390 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Nov 23 08:11:07 np0005532585.localdomain podman[64719]: 2025-11-23 08:11:07.034241309 +0000 UTC m=+0.087433945 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:11:07 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:11:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:11:12 np0005532585.localdomain podman[64738]: 2025-11-23 08:11:12.013774721 +0000 UTC m=+0.074946336 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, 
url=https://www.redhat.com, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 23 08:11:12 np0005532585.localdomain podman[64738]: 2025-11-23 08:11:12.233372783 +0000 UTC m=+0.294544368 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:11:12 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:11:21 np0005532585.localdomain sshd[64697]: Connection closed by 206.168.34.35 port 32218 [preauth]
Nov 23 08:11:24 np0005532585.localdomain sudo[64812]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftqohpqtgwbpwkqorctpfodscwykjrui ; /usr/bin/python3
Nov 23 08:11:24 np0005532585.localdomain sudo[64812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:24 np0005532585.localdomain python3[64814]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:11:24 np0005532585.localdomain sudo[64812]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:24 np0005532585.localdomain sudo[64857]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adyxjsxdhcyamisdfbyqpurqdambgwpo ; /usr/bin/python3
Nov 23 08:11:24 np0005532585.localdomain sudo[64857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:24 np0005532585.localdomain python3[64859]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885484.2567315-107579-259519731031558/source _original_basename=tmpkz8lkwxt follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:24 np0005532585.localdomain sudo[64857]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:25 np0005532585.localdomain sudo[64919]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shrlvkuettkuwhjmfftriwwtifeswxbv ; /usr/bin/python3
Nov 23 08:11:25 np0005532585.localdomain sudo[64919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:25 np0005532585.localdomain python3[64921]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:11:25 np0005532585.localdomain sudo[64919]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:26 np0005532585.localdomain sudo[64962]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuhuredbdsddykzwxrmwspdotqfmukqg ; /usr/bin/python3
Nov 23 08:11:26 np0005532585.localdomain sudo[64962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:26 np0005532585.localdomain python3[64964]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885485.5420556-107660-163651403600910/source _original_basename=tmpejbk836s follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:26 np0005532585.localdomain sudo[64962]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:26 np0005532585.localdomain sudo[65024]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkqojpisxmynycqcizkokwhtlkvpghnh ; /usr/bin/python3
Nov 23 08:11:26 np0005532585.localdomain sudo[65024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:26 np0005532585.localdomain python3[65026]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:11:26 np0005532585.localdomain sudo[65024]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:26 np0005532585.localdomain sudo[65067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxabglnvdvlzvavxqadwqlwdcyxdyosj ; /usr/bin/python3
Nov 23 08:11:26 np0005532585.localdomain sudo[65067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:27 np0005532585.localdomain python3[65069]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885486.4869795-107749-183953673592417/source _original_basename=tmp_r7_govb follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:27 np0005532585.localdomain sudo[65067]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:27 np0005532585.localdomain sudo[65129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atleojvffssbmzwxslhsrcoikiyqqwdw ; /usr/bin/python3
Nov 23 08:11:27 np0005532585.localdomain sudo[65129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:27 np0005532585.localdomain python3[65131]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:11:27 np0005532585.localdomain sudo[65129]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:27 np0005532585.localdomain sudo[65172]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynsglprlynhgrqmugyqxlhruhlcprjta ; /usr/bin/python3
Nov 23 08:11:27 np0005532585.localdomain sudo[65172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:28 np0005532585.localdomain python3[65174]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885487.342244-107815-178809715685081/source _original_basename=tmpbo677tha follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:28 np0005532585.localdomain sudo[65172]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:28 np0005532585.localdomain sudo[65202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfqvddmkyvnictisilbivfphjfdpgfpr ; /usr/bin/python3
Nov 23 08:11:28 np0005532585.localdomain sudo[65202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:28 np0005532585.localdomain python3[65204]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 08:11:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:28 np0005532585.localdomain systemd-rc-local-generator[65229]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:28 np0005532585.localdomain systemd-sysv-generator[65232]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:29 np0005532585.localdomain systemd-sysv-generator[65272]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:29 np0005532585.localdomain systemd-rc-local-generator[65268]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:29 np0005532585.localdomain sudo[65202]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:29 np0005532585.localdomain sudo[65294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tigogyyuanbnlbdywbqovvbkyjmvzyyg ; /usr/bin/python3
Nov 23 08:11:29 np0005532585.localdomain sudo[65294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:29 np0005532585.localdomain python3[65296]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:11:29 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:29 np0005532585.localdomain systemd-rc-local-generator[65321]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:29 np0005532585.localdomain systemd-sysv-generator[65324]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:30 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:30 np0005532585.localdomain systemd-sysv-generator[65363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:30 np0005532585.localdomain systemd-rc-local-generator[65358]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:30 np0005532585.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Nov 23 08:11:30 np0005532585.localdomain sudo[65294]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:30 np0005532585.localdomain sudo[65384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nspetnvysbhariqtabuemlxsayyfqbsc ; /usr/bin/python3
Nov 23 08:11:30 np0005532585.localdomain sudo[65384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:30 np0005532585.localdomain python3[65386]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 08:11:30 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:30 np0005532585.localdomain systemd-rc-local-generator[65411]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:30 np0005532585.localdomain systemd-sysv-generator[65414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:31 np0005532585.localdomain sudo[65384]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:31 np0005532585.localdomain sudo[65468]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzuvxetgdfnqgerbnrbrnnmlaukzcffx ; /usr/bin/python3
Nov 23 08:11:31 np0005532585.localdomain sudo[65468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:31 np0005532585.localdomain python3[65470]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:11:31 np0005532585.localdomain sudo[65468]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:31 np0005532585.localdomain sudo[65511]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onbytetokfwtuunmsurpcidlesucmbki ; /usr/bin/python3
Nov 23 08:11:31 np0005532585.localdomain sudo[65511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:31 np0005532585.localdomain python3[65513]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885491.1600728-107951-109599284716971/source _original_basename=tmp6m_gek2v follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:31 np0005532585.localdomain sudo[65511]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:32 np0005532585.localdomain sudo[65541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esezhxwjpyddhgewbnlzzfvhaissijbb ; /usr/bin/python3
Nov 23 08:11:32 np0005532585.localdomain sudo[65541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:32 np0005532585.localdomain python3[65543]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:11:32 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:32 np0005532585.localdomain systemd-sysv-generator[65575]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:32 np0005532585.localdomain systemd-rc-local-generator[65568]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:32 np0005532585.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Nov 23 08:11:32 np0005532585.localdomain sudo[65541]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:32 np0005532585.localdomain sudo[65595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huypxowikxykgmlbaxmxynsdfexqases ; /usr/bin/python3
Nov 23 08:11:32 np0005532585.localdomain sudo[65595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:33 np0005532585.localdomain python3[65597]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:11:33 np0005532585.localdomain sudo[65595]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:33 np0005532585.localdomain sudo[65645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwyqejrhjdpkzaslhmnaaqqbxzcqxkii ; /usr/bin/python3
Nov 23 08:11:33 np0005532585.localdomain sudo[65645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:33 np0005532585.localdomain sudo[65645]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:33 np0005532585.localdomain sudo[65663]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xunaaxjskfwwyrxzlklrdgfzwyoiiuaw ; /usr/bin/python3
Nov 23 08:11:33 np0005532585.localdomain sudo[65663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:33 np0005532585.localdomain sudo[65663]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:34 np0005532585.localdomain sudo[65767]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upmwdwmwmktbopmhmevvqavueqnkrsrv ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885494.1805284-108061-154113572429303/async_wrapper.py 594696035795 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885494.1805284-108061-154113572429303/AnsiballZ_command.py _
Nov 23 08:11:34 np0005532585.localdomain sudo[65767]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 08:11:34 np0005532585.localdomain ansible-async_wrapper.py[65769]: Invoked with 594696035795 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885494.1805284-108061-154113572429303/AnsiballZ_command.py _
Nov 23 08:11:34 np0005532585.localdomain ansible-async_wrapper.py[65772]: Starting module and watcher
Nov 23 08:11:34 np0005532585.localdomain ansible-async_wrapper.py[65772]: Start watching 65773 (3600)
Nov 23 08:11:34 np0005532585.localdomain ansible-async_wrapper.py[65773]: Start module (65773)
Nov 23 08:11:34 np0005532585.localdomain ansible-async_wrapper.py[65769]: Return async_wrapper task started.
Nov 23 08:11:34 np0005532585.localdomain sudo[65767]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:34 np0005532585.localdomain sudo[65788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdpwxjvrknearbuhlumsinyfvdqgekxk ; /usr/bin/python3
Nov 23 08:11:34 np0005532585.localdomain sudo[65788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:35 np0005532585.localdomain python3[65793]: ansible-ansible.legacy.async_status Invoked with jid=594696035795.65769 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:11:35 np0005532585.localdomain sudo[65788]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:11:36 np0005532585.localdomain podman[65807]: 2025-11-23 08:11:36.025583924 +0000 UTC m=+0.080427643 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, distribution-scope=public, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=)
Nov 23 08:11:36 np0005532585.localdomain podman[65807]: 2025-11-23 08:11:36.03924452 +0000 UTC m=+0.094088259 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:11:36 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:11:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:11:37 np0005532585.localdomain podman[65863]: 2025-11-23 08:11:37.41630917 +0000 UTC m=+0.084915140 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:11:37 np0005532585.localdomain podman[65863]: 2025-11-23 08:11:37.422118127 +0000 UTC m=+0.090724077 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Nov 23 08:11:37 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (file: /etc/puppet/hiera.yaml)
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: Undefined variable '::deploy_config_name';
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (file & line not available)
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (file & line not available)
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 23 08:11:38 np0005532585.localdomain puppet-user[65792]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.21 seconds
Nov 23 08:11:39 np0005532585.localdomain ansible-async_wrapper.py[65772]: 65773 still running (3600)
Nov 23 08:11:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:11:43 np0005532585.localdomain podman[65952]: 2025-11-23 08:11:43.029630926 +0000 UTC m=+0.082438613 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 08:11:43 np0005532585.localdomain podman[65952]: 2025-11-23 08:11:43.217723398 +0000 UTC m=+0.270531005 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:11:43 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:11:44 np0005532585.localdomain ansible-async_wrapper.py[65772]: 65773 still running (3595)
Nov 23 08:11:45 np0005532585.localdomain sudo[66052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovwfdkczduchjbyfnwofbsrgfsqtqxvx ; /usr/bin/python3
Nov 23 08:11:45 np0005532585.localdomain sudo[66052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:45 np0005532585.localdomain python3[66056]: ansible-ansible.legacy.async_status Invoked with jid=594696035795.65769 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:11:45 np0005532585.localdomain sudo[66052]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:47 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 08:11:47 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 08:11:47 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:47 np0005532585.localdomain systemd-sysv-generator[66150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:47 np0005532585.localdomain systemd-rc-local-generator[66147]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:47 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 08:11:48 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 08:11:48 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 08:11:48 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.127s CPU time.
Nov 23 08:11:48 np0005532585.localdomain systemd[1]: run-ra987bbe231014b0d8198b24404565d19.service: Deactivated successfully.
Nov 23 08:11:48 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Nov 23 08:11:48 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}10c7cbde53c4e0c35154236143cc1ebe7f111a329076e2d1cfa3e9aea340c260'
Nov 23 08:11:48 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Nov 23 08:11:48 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Nov 23 08:11:48 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Nov 23 08:11:48 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Nov 23 08:11:49 np0005532585.localdomain ansible-async_wrapper.py[65772]: 65773 still running (3590)
Nov 23 08:11:51 np0005532585.localdomain sudo[67353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:11:51 np0005532585.localdomain sudo[67353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:11:51 np0005532585.localdomain sudo[67353]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:51 np0005532585.localdomain sudo[67368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:11:51 np0005532585.localdomain sudo[67368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:11:52 np0005532585.localdomain sudo[67368]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:53 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:54 np0005532585.localdomain systemd-rc-local-generator[67446]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:54 np0005532585.localdomain systemd-sysv-generator[67449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Nov 23 08:11:54 np0005532585.localdomain snmpd[67457]: Can't find directory of RPM packages
Nov 23 08:11:54 np0005532585.localdomain snmpd[67457]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:54 np0005532585.localdomain systemd-rc-local-generator[67481]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:54 np0005532585.localdomain systemd-sysv-generator[67487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:54 np0005532585.localdomain ansible-async_wrapper.py[65772]: 65773 still running (3585)
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:11:54 np0005532585.localdomain systemd-rc-local-generator[67519]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:11:54 np0005532585.localdomain systemd-sysv-generator[67522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:11:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Notice: Applied catalog in 16.41 seconds
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Application:
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:    Initial environment: production
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:    Converged environment: production
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:          Run mode: user
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Changes:
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:             Total: 8
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Events:
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:           Success: 8
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:             Total: 8
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Resources:
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:         Restarted: 1
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:           Changed: 8
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:       Out of sync: 8
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:             Total: 19
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Time:
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:        Filebucket: 0.00
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:          Schedule: 0.00
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:            Augeas: 0.01
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:              File: 0.10
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:    Config retrieval: 0.27
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:           Service: 1.23
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:    Transaction evaluation: 16.40
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:    Catalog application: 16.41
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:          Last run: 1763885515
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:              Exec: 5.07
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:           Package: 9.83
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:             Total: 16.41
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]: Version:
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:            Config: 1763885498
Nov 23 08:11:55 np0005532585.localdomain puppet-user[65792]:            Puppet: 7.10.0
Nov 23 08:11:55 np0005532585.localdomain ansible-async_wrapper.py[65773]: Module complete (65773)
Nov 23 08:11:55 np0005532585.localdomain sudo[67531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:11:55 np0005532585.localdomain sudo[67531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:11:55 np0005532585.localdomain sudo[67531]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:55 np0005532585.localdomain sudo[67559]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaoanmjkgccotwpujnqtkoxatlmyegaj ; /usr/bin/python3
Nov 23 08:11:55 np0005532585.localdomain sudo[67559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:55 np0005532585.localdomain python3[67561]: ansible-ansible.legacy.async_status Invoked with jid=594696035795.65769 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:11:55 np0005532585.localdomain sudo[67559]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:56 np0005532585.localdomain sudo[67575]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilcioyrqftosvssxsyebuvwxdwevijnz ; /usr/bin/python3
Nov 23 08:11:56 np0005532585.localdomain sudo[67575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:56 np0005532585.localdomain python3[67577]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:11:56 np0005532585.localdomain sudo[67575]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:56 np0005532585.localdomain sudo[67591]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-encwqhjfazzkutsarppiznjcjqypzpfo ; /usr/bin/python3
Nov 23 08:11:56 np0005532585.localdomain sudo[67591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:56 np0005532585.localdomain python3[67593]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:11:56 np0005532585.localdomain sudo[67591]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:56 np0005532585.localdomain sudo[67641]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smbeajfxkakbiicjyriyrmmdqnpeknst ; /usr/bin/python3
Nov 23 08:11:56 np0005532585.localdomain sudo[67641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:57 np0005532585.localdomain python3[67643]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:11:57 np0005532585.localdomain sudo[67641]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:57 np0005532585.localdomain sudo[67659]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbfiiupjdbzmpbyedvtnnvdinjrrmflt ; /usr/bin/python3
Nov 23 08:11:57 np0005532585.localdomain sudo[67659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:57 np0005532585.localdomain python3[67661]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpjqih1ckc recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:11:57 np0005532585.localdomain sudo[67659]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:57 np0005532585.localdomain sudo[67689]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibnswtysbpxggxfgoeauyevsbaqqdjwd ; /usr/bin/python3
Nov 23 08:11:57 np0005532585.localdomain sudo[67689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:57 np0005532585.localdomain python3[67691]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:57 np0005532585.localdomain sudo[67689]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:58 np0005532585.localdomain sudo[67705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oybcooobjljfdcvqqacllfjcyqrwnzsu ; /usr/bin/python3
Nov 23 08:11:58 np0005532585.localdomain sudo[67705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:58 np0005532585.localdomain sudo[67705]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:58 np0005532585.localdomain sudo[67792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hltarwmtoporibxtmsdjfdochvigtjas ; /usr/bin/python3
Nov 23 08:11:58 np0005532585.localdomain sudo[67792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:58 np0005532585.localdomain python3[67794]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 08:11:59 np0005532585.localdomain sudo[67792]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:59 np0005532585.localdomain sudo[67811]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnehjhlokyrhajozuhdohqnvtexxdjfn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:11:59 np0005532585.localdomain sudo[67811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:11:59 np0005532585.localdomain ansible-async_wrapper.py[65772]: Done in kid B.
Nov 23 08:11:59 np0005532585.localdomain python3[67813]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:11:59 np0005532585.localdomain sudo[67811]: pam_unix(sudo:session): session closed for user root
Nov 23 08:11:59 np0005532585.localdomain sudo[67827]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skdczfsyeuurcvcvlzegwmzfqvpnhkfx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:11:59 np0005532585.localdomain sudo[67827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:00 np0005532585.localdomain sudo[67827]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:00 np0005532585.localdomain sudo[67843]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-taejpqqujjqnyghcgbvmsqslwhlhjxni ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:00 np0005532585.localdomain sudo[67843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:00 np0005532585.localdomain python3[67845]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:00 np0005532585.localdomain sudo[67843]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:12:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4446 writes, 20K keys, 4446 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4446 writes, 451 syncs, 9.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8 writes, 16 keys, 8 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 8 writes, 4 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:12:01 np0005532585.localdomain sudo[67893]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcwadcnafmqaqpsgeehyqckynygqdtkc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:01 np0005532585.localdomain sudo[67893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:01 np0005532585.localdomain python3[67895]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:12:01 np0005532585.localdomain sudo[67893]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:01 np0005532585.localdomain sudo[67911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxqzzoyglbzfwlabvacyztwrdzxlqepx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:01 np0005532585.localdomain sudo[67911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:01 np0005532585.localdomain python3[67913]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:01 np0005532585.localdomain sudo[67911]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:01 np0005532585.localdomain sudo[67973]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqfuaqajokugoayaxfrkpibeewzbimen ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:01 np0005532585.localdomain sudo[67973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:02 np0005532585.localdomain python3[67975]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:12:02 np0005532585.localdomain sudo[67973]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:02 np0005532585.localdomain sudo[67991]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayorrwuaqubhfwbwienqyvclwvhxpzzh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:02 np0005532585.localdomain sudo[67991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:02 np0005532585.localdomain python3[67993]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:02 np0005532585.localdomain sudo[67991]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:02 np0005532585.localdomain sudo[68053]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvqxanqvawwaxbhdwyqullejktukplnk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:02 np0005532585.localdomain sudo[68053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:02 np0005532585.localdomain python3[68055]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:12:02 np0005532585.localdomain sudo[68053]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:03 np0005532585.localdomain sudo[68071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvpoiawwgnaescakbbkfpiehmxdvcgoy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:03 np0005532585.localdomain sudo[68071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:03 np0005532585.localdomain python3[68073]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:03 np0005532585.localdomain sudo[68071]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:03 np0005532585.localdomain sudo[68133]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcwoxuocccikubvhkjlcyvgvjcaslwib ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:03 np0005532585.localdomain sudo[68133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:03 np0005532585.localdomain python3[68135]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:12:03 np0005532585.localdomain sudo[68133]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:03 np0005532585.localdomain sudo[68151]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylopcrdmiawvskufyalirrcdtftgfyut ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:03 np0005532585.localdomain sudo[68151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:03 np0005532585.localdomain python3[68153]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:03 np0005532585.localdomain sudo[68151]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:04 np0005532585.localdomain sudo[68181]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysdnfyzskovliunblampitsqkolxixao ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:04 np0005532585.localdomain sudo[68181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:04 np0005532585.localdomain python3[68183]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:04 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:04 np0005532585.localdomain systemd-sysv-generator[68209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:04 np0005532585.localdomain systemd-rc-local-generator[68205]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:04 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:12:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 5196 writes, 22K keys, 5196 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5196 writes, 612 syncs, 8.49 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 72 writes, 104 keys, 72 commit groups, 1.0 writes per commit group, ingest: 0.03 MB, 0.00 MB/s
                                                          Interval WAL: 73 writes, 37 syncs, 1.97 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:12:04 np0005532585.localdomain sudo[68181]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:05 np0005532585.localdomain sudo[68267]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zebyotshbmnylrmfreolrsscuozswwtc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:05 np0005532585.localdomain sudo[68267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:05 np0005532585.localdomain python3[68269]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:12:05 np0005532585.localdomain sudo[68267]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:05 np0005532585.localdomain sudo[68285]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owqhaqsahwgmejlrnpdwhrjcgeptbrvk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:05 np0005532585.localdomain sudo[68285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:05 np0005532585.localdomain python3[68287]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:05 np0005532585.localdomain sudo[68285]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:05 np0005532585.localdomain sudo[68347]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-feethrkomgwpnnxcjfdztgsluogvsrxm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:05 np0005532585.localdomain sudo[68347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:06 np0005532585.localdomain python3[68349]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:12:06 np0005532585.localdomain sudo[68347]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:06 np0005532585.localdomain sudo[68365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbwmhguacqfiiofrdcbejmfmzhptaiuk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:06 np0005532585.localdomain sudo[68365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:12:06 np0005532585.localdomain systemd[1]: tmp-crun.69qU5i.mount: Deactivated successfully.
Nov 23 08:12:06 np0005532585.localdomain podman[68368]: 2025-11-23 08:12:06.335135351 +0000 UTC m=+0.096543354 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true)
Nov 23 08:12:06 np0005532585.localdomain podman[68368]: 2025-11-23 08:12:06.375284044 +0000 UTC m=+0.136692057 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z)
Nov 23 08:12:06 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:12:06 np0005532585.localdomain python3[68367]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:06 np0005532585.localdomain sudo[68365]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:06 np0005532585.localdomain sudo[68414]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibzkzbtvugxjbihqbcxohgrxwqeewlhg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:06 np0005532585.localdomain sudo[68414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:06 np0005532585.localdomain python3[68416]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:06 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:07 np0005532585.localdomain systemd-rc-local-generator[68442]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:07 np0005532585.localdomain systemd-sysv-generator[68446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 08:12:07 np0005532585.localdomain sudo[68414]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:07 np0005532585.localdomain sudo[68470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcqkwwmfmyubdopdpbtzrukwjkgycxph ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:07 np0005532585.localdomain sudo[68470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:12:07 np0005532585.localdomain podman[68472]: 2025-11-23 08:12:07.719809132 +0000 UTC m=+0.092643425 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, container_name=iscsid)
Nov 23 08:12:07 np0005532585.localdomain podman[68472]: 2025-11-23 08:12:07.735384756 +0000 UTC m=+0.108219079 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:12:07 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:12:07 np0005532585.localdomain python3[68473]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 08:12:07 np0005532585.localdomain sudo[68470]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:08 np0005532585.localdomain sudo[68505]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggzmtrcdbeozezpqzugtxtjhxfvuqrsq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:08 np0005532585.localdomain sudo[68505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:08 np0005532585.localdomain sudo[68505]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:09 np0005532585.localdomain sudo[68549]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvknqwomnukfbzgozzttpnnpiocagttf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:09 np0005532585.localdomain sudo[68549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:09 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 08:12:09 np0005532585.localdomain podman[68702]: 2025-11-23 08:12:09.798396842 +0000 UTC m=+0.070610873 container create d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Nov 23 08:12:09 np0005532585.localdomain podman[68732]: 2025-11-23 08:12:09.827087656 +0000 UTC m=+0.068677614 container create 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:12:09 np0005532585.localdomain podman[68727]: 2025-11-23 08:12:09.845141547 +0000 UTC m=+0.091069357 container create b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope.
Nov 23 08:12:09 np0005532585.localdomain podman[68702]: 2025-11-23 08:12:09.763272831 +0000 UTC m=+0.035486892 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 23 08:12:09 np0005532585.localdomain podman[68731]: 2025-11-23 08:12:09.868176278 +0000 UTC m=+0.110168119 container create c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.scope.
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1.scope.
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8785be8dea5fa0361315af1fc74fe453e62e737d2ff3d773f6811b45d15cd9a/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/503e987379308b9e6b9946670c4ac6382bcf032235dd53dd51b9045ac75aedc7/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:09 np0005532585.localdomain podman[68749]: 2025-11-23 08:12:09.889556339 +0000 UTC m=+0.119077979 container create 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:12:09 np0005532585.localdomain podman[68731]: 2025-11-23 08:12:09.790601594 +0000 UTC m=+0.032593445 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b.scope.
Nov 23 08:12:09 np0005532585.localdomain podman[68732]: 2025-11-23 08:12:09.794740281 +0000 UTC m=+0.036330209 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 23 08:12:09 np0005532585.localdomain podman[68727]: 2025-11-23 08:12:09.897377178 +0000 UTC m=+0.143304988 container init b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, release=1761123044, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, container_name=configure_cms_options, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc.)
Nov 23 08:12:09 np0005532585.localdomain podman[68727]: 2025-11-23 08:12:09.797874986 +0000 UTC m=+0.043802816 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 23 08:12:09 np0005532585.localdomain podman[68727]: 2025-11-23 08:12:09.906212717 +0000 UTC m=+0.152140557 container start b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, container_name=configure_cms_options, vendor=Red Hat, Inc., 
architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:12:09 np0005532585.localdomain podman[68727]: 2025-11-23 08:12:09.906672981 +0000 UTC m=+0.152600811 container attach b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=configure_cms_options, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope.
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:12:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:09 np0005532585.localdomain podman[68702]: 2025-11-23 08:12:09.920026498 +0000 UTC m=+0.192240519 container init d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:12:09 np0005532585.localdomain podman[68731]: 2025-11-23 08:12:09.924133844 +0000 UTC m=+0.166125675 container init c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red 
Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:09 np0005532585.localdomain podman[68731]: 2025-11-23 08:12:09.932515899 +0000 UTC m=+0.174507760 container start c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=nova_libvirt_init_secret, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt)
Nov 23 08:12:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43834aabac3051c95b0bd48b6a3d7296604e45656eac8be0b6aa4803a8bc68b2/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:09 np0005532585.localdomain podman[68731]: 2025-11-23 08:12:09.932812458 +0000 UTC m=+0.174804289 container attach c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, version=17.1.12, config_id=tripleo_step4)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:12:09 np0005532585.localdomain podman[68702]: 2025-11-23 08:12:09.936322905 +0000 UTC m=+0.208536926 container start d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:12:09 np0005532585.localdomain sudo[68809]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:12:09 np0005532585.localdomain sudo[68809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 23 08:12:09 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1bd1f352f264f24512a1a2440e47a1f5 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 23 08:12:09 np0005532585.localdomain podman[68749]: 2025-11-23 08:12:09.849629063 +0000 UTC m=+0.079150713 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:12:09 np0005532585.localdomain podman[68732]: 2025-11-23 08:12:09.966282218 +0000 UTC m=+0.207872176 container init 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:12:09 np0005532585.localdomain podman[68749]: 2025-11-23 08:12:09.969157366 +0000 UTC m=+0.198679046 container init 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:12:09 np0005532585.localdomain sudo[68841]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:12:09 np0005532585.localdomain sudo[68841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:12:09 np0005532585.localdomain sudo[68844]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:12:09 np0005532585.localdomain sudo[68844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:12:10 np0005532585.localdomain sudo[68809]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:12:10 np0005532585.localdomain podman[68749]: 2025-11-23 08:12:10.00704761 +0000 UTC m=+0.236569250 container start 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible)
Nov 23 08:12:10 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1bd1f352f264f24512a1a2440e47a1f5 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 23 08:12:10 np0005532585.localdomain ovs-vsctl[68863]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: libpod-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1.scope: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain podman[68727]: 2025-11-23 08:12:10.025037949 +0000 UTC m=+0.270965779 container died b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, io.openshift.expose-services=, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:12:10 np0005532585.localdomain sudo[68841]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:10 np0005532585.localdomain podman[68732]: 2025-11-23 08:12:10.042564313 +0000 UTC m=+0.284154221 container start 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, release=1761123044, 
distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, architecture=x86_64)
Nov 23 08:12:10 np0005532585.localdomain crond[68839]: (CRON) STARTUP (1.5.7)
Nov 23 08:12:10 np0005532585.localdomain crond[68839]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 6% if used.)
Nov 23 08:12:10 np0005532585.localdomain crond[68839]: (CRON) INFO (running with inotify support)
Nov 23 08:12:10 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 23 08:12:10 np0005532585.localdomain sudo[68844]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: libpod-c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b.scope: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain podman[68854]: 2025-11-23 08:12:10.1431923 +0000 UTC m=+0.129488787 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Nov 23 08:12:10 np0005532585.localdomain podman[68882]: 2025-11-23 08:12:10.166783319 +0000 UTC m=+0.130041174 container cleanup b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 23 08:12:10 np0005532585.localdomain podman[68843]: 2025-11-23 08:12:10.065570214 +0000 UTC m=+0.077165472 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, 
name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: libpod-conmon-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1.scope: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain podman[68811]: 2025-11-23 08:12:10.025352419 +0000 UTC m=+0.082006330 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:12:10 np0005532585.localdomain podman[68854]: 2025-11-23 08:12:10.182192239 +0000 UTC m=+0.168488726 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044)
Nov 23 08:12:10 np0005532585.localdomain podman[68854]: unhealthy
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed with result 'exit-code'.
Nov 23 08:12:10 np0005532585.localdomain podman[68843]: 2025-11-23 08:12:10.200227488 +0000 UTC m=+0.211822766 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Nov 23 08:12:10 np0005532585.localdomain podman[68811]: 2025-11-23 08:12:10.210145551 +0000 UTC m=+0.266799482 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:12:10 np0005532585.localdomain podman[68811]: unhealthy
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed with result 'exit-code'.
Nov 23 08:12:10 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Nov 23 08:12:10 np0005532585.localdomain podman[68731]: 2025-11-23 08:12:10.278856614 +0000 UTC m=+0.520848465 container died c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:12:10 np0005532585.localdomain podman[68918]: 2025-11-23 08:12:10.415180579 +0000 UTC m=+0.329676979 container cleanup c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, container_name=nova_libvirt_init_secret, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:35:22Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container)
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: libpod-conmon-c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b.scope: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Nov 23 08:12:10 np0005532585.localdomain podman[69077]: 2025-11-23 08:12:10.514133625 +0000 UTC m=+0.070955023 container create e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=)
Nov 23 08:12:10 np0005532585.localdomain podman[69089]: 2025-11-23 08:12:10.540755257 +0000 UTC m=+0.079281237 container create 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope.
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope.
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:10 np0005532585.localdomain podman[69077]: 2025-11-23 08:12:10.474198488 +0000 UTC m=+0.031019916 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:12:10 np0005532585.localdomain podman[69089]: 2025-11-23 08:12:10.578769905 +0000 UTC m=+0.117295885 container init 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Nov 23 08:12:10 np0005532585.localdomain podman[69089]: 2025-11-23 08:12:10.586748438 +0000 UTC m=+0.125274428 container start 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:12:10 np0005532585.localdomain podman[69089]: 2025-11-23 08:12:10.586965035 +0000 UTC m=+0.125491045 container attach 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, distribution-scope=public, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:12:10 np0005532585.localdomain podman[69077]: 2025-11-23 08:12:10.589286445 +0000 UTC m=+0.146107863 container init e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 23 08:12:10 np0005532585.localdomain podman[69089]: 2025-11-23 08:12:10.492136914 +0000 UTC m=+0.030662944 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 08:12:10 np0005532585.localdomain sudo[69142]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:12:10 np0005532585.localdomain sudo[69142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:12:10 np0005532585.localdomain podman[69077]: 2025-11-23 08:12:10.614006999 +0000 UTC m=+0.170828407 container start e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:12:10 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:12:10 np0005532585.localdomain sudo[69142]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:10 np0005532585.localdomain sshd[69165]: Server listening on 0.0.0.0 port 2022.
Nov 23 08:12:10 np0005532585.localdomain sshd[69165]: Server listening on :: port 2022.
Nov 23 08:12:10 np0005532585.localdomain podman[69145]: 2025-11-23 08:12:10.750042916 +0000 UTC m=+0.129424717 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, container_name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-63ab7b1688305e5f88e4974e557ea0bb87f0e73ce1236c8e61a48437546ff0ac-merged.mount: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1-userdata-shm.mount: Deactivated successfully.
Nov 23 08:12:10 np0005532585.localdomain sudo[69193]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj903lcn0/privsep.sock
Nov 23 08:12:10 np0005532585.localdomain sudo[69193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 23 08:12:11 np0005532585.localdomain podman[69145]: 2025-11-23 08:12:11.094660048 +0000 UTC m=+0.474041819 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:12:11 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:12:11 np0005532585.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 23 08:12:11 np0005532585.localdomain sudo[69193]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:13 np0005532585.localdomain ovs-vsctl[69322]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: libpod-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope: Deactivated successfully.
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: libpod-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope: Consumed 2.918s CPU time.
Nov 23 08:12:13 np0005532585.localdomain podman[69089]: 2025-11-23 08:12:13.586389569 +0000 UTC m=+3.124915609 container died 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, architecture=x86_64, 
managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: tmp-crun.AyrjvV.mount: Deactivated successfully.
Nov 23 08:12:13 np0005532585.localdomain podman[69324]: 2025-11-23 08:12:13.680477437 +0000 UTC m=+0.074225124 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2-userdata-shm.mount: Deactivated successfully.
Nov 23 08:12:13 np0005532585.localdomain podman[69323]: 2025-11-23 08:12:13.727249081 +0000 UTC m=+0.127883187 container cleanup 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, distribution-scope=public, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=setup_ovs_manager, url=https://www.redhat.com)
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: libpod-conmon-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope: Deactivated successfully.
Nov 23 08:12:13 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Nov 23 08:12:13 np0005532585.localdomain podman[69324]: 2025-11-23 08:12:13.839346648 +0000 UTC m=+0.233094315 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:12:13 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:12:14 np0005532585.localdomain podman[69460]: 2025-11-23 08:12:14.154910546 +0000 UTC m=+0.063679772 container create 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started libpod-conmon-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:14 np0005532585.localdomain podman[69460]: 2025-11-23 08:12:14.124640643 +0000 UTC m=+0.033409909 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:12:14 np0005532585.localdomain podman[69460]: 2025-11-23 08:12:14.25182957 +0000 UTC m=+0.160598826 container init 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 23 08:12:14 np0005532585.localdomain podman[69476]: 2025-11-23 08:12:14.279371858 +0000 UTC m=+0.147894467 container create 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, container_name=ovn_metadata_agent)
Nov 23 08:12:14 np0005532585.localdomain podman[69476]: 2025-11-23 08:12:14.1839429 +0000 UTC m=+0.052465569 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:12:14 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:12:14 np0005532585.localdomain podman[69460]: 2025-11-23 08:12:14.289089605 +0000 UTC m=+0.197858861 container start 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:12:14 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started libpod-conmon-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:12:14 np0005532585.localdomain podman[69499]: 2025-11-23 08:12:14.386205095 +0000 UTC m=+0.091167840 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:12:14 np0005532585.localdomain podman[69499]: 2025-11-23 08:12:14.406202415 +0000 UTC m=+0.111165190 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 08:12:14 np0005532585.localdomain podman[69499]: unhealthy
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:12:14 np0005532585.localdomain podman[69476]: 2025-11-23 08:12:14.479986723 +0000 UTC m=+0.348509402 container init 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=)
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Queued start job for default target Main User Target.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Created slice User Application Slice.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Reached target Paths.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Reached target Timers.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Starting D-Bus User Message Bus Socket...
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Starting Create User's Volatile Files and Directories...
Nov 23 08:12:14 np0005532585.localdomain sudo[69563]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:12:14 np0005532585.localdomain sudo[69563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Finished Create User's Volatile Files and Directories.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Listening on D-Bus User Message Bus Socket.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Reached target Sockets.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Reached target Basic System.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Reached target Main User Target.
Nov 23 08:12:14 np0005532585.localdomain systemd[69521]: Startup finished in 155ms.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started User Manager for UID 0.
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: Started Session c9 of User root.
Nov 23 08:12:14 np0005532585.localdomain sudo[69563]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Nov 23 08:12:14 np0005532585.localdomain kernel: device br-int entered promiscuous mode
Nov 23 08:12:14 np0005532585.localdomain NetworkManager[5975]: <info>  [1763885534.6067] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Nov 23 08:12:14 np0005532585.localdomain systemd-udevd[69589]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 08:12:14 np0005532585.localdomain podman[69476]: 2025-11-23 08:12:14.62362208 +0000 UTC m=+0.492144729 container start 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044)
Nov 23 08:12:14 np0005532585.localdomain python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=a43bf0e2ecc9c9d02be7a27eac338b4c --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9cd49e7d070c0c00c09eb4d4067ba3757eea8902a6a55fed079db1124e7c8aad-merged.mount: Deactivated successfully.
Nov 23 08:12:14 np0005532585.localdomain podman[69566]: 2025-11-23 08:12:14.664253999 +0000 UTC m=+0.140524083 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:14:25Z, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Nov 23 08:12:14 np0005532585.localdomain podman[69566]: 2025-11-23 08:12:14.711648934 +0000 UTC m=+0.187919078 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:12:14 np0005532585.localdomain podman[69566]: unhealthy
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:12:14 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:12:14 np0005532585.localdomain sudo[68549]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:15 np0005532585.localdomain sudo[69640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aibrgteovbzmetwdfrpjbbkhqryjvzqi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:15 np0005532585.localdomain sudo[69640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:15 np0005532585.localdomain python3[69643]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:15 np0005532585.localdomain sudo[69640]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:15 np0005532585.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Nov 23 08:12:15 np0005532585.localdomain NetworkManager[5975]: <info>  [1763885535.2858] device (genev_sys_6081): carrier: link connected
Nov 23 08:12:15 np0005532585.localdomain NetworkManager[5975]: <info>  [1763885535.2864] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Nov 23 08:12:15 np0005532585.localdomain sudo[69659]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypomwrolrsnbyhohozwtxriltbwubzhm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:15 np0005532585.localdomain sudo[69659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:15 np0005532585.localdomain python3[69661]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:15 np0005532585.localdomain sudo[69659]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:15 np0005532585.localdomain sudo[69675]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqvnvghvugdljznvvguduffptkfrsrxf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:15 np0005532585.localdomain sudo[69675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:15 np0005532585.localdomain python3[69677]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:15 np0005532585.localdomain sudo[69675]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:15 np0005532585.localdomain sudo[69691]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pppavrfikltmryemvokgolfkklstojmo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:15 np0005532585.localdomain sudo[69691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:16 np0005532585.localdomain python3[69693]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:16 np0005532585.localdomain sudo[69691]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:16 np0005532585.localdomain sudo[69707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjoazrchilaciiigfbmnyisfrbmeigtx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:16 np0005532585.localdomain sudo[69707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:16 np0005532585.localdomain python3[69709]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:16 np0005532585.localdomain sudo[69707]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:16 np0005532585.localdomain sudo[69711]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmp1rqtlzmv/privsep.sock
Nov 23 08:12:16 np0005532585.localdomain sudo[69711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 23 08:12:16 np0005532585.localdomain sudo[69726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqemlcgogdiikvlzsalnvvxlfovaerrk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:16 np0005532585.localdomain sudo[69726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:16 np0005532585.localdomain python3[69728]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:16 np0005532585.localdomain sudo[69726]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:16 np0005532585.localdomain sudo[69743]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zavboxkhpocepolpftxsvipghzocpfqj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:16 np0005532585.localdomain sudo[69743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:16 np0005532585.localdomain python3[69745]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:16 np0005532585.localdomain sudo[69743]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:16 np0005532585.localdomain sudo[69759]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmadwpvbtgbkwmzyimgyrmxkizvomocv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:16 np0005532585.localdomain sudo[69759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:16 np0005532585.localdomain sudo[69711]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:17 np0005532585.localdomain python3[69761]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:17 np0005532585.localdomain sudo[69759]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:17 np0005532585.localdomain sudo[69777]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frqkgcawwroqmxcqjtcbgnxbgebpelax ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:17 np0005532585.localdomain sudo[69777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:17 np0005532585.localdomain python3[69779]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:17 np0005532585.localdomain sudo[69777]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:17 np0005532585.localdomain sudo[69795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-talupnzrkbqtzmahwzgjlpzzevqffwzy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:17 np0005532585.localdomain sudo[69795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:17 np0005532585.localdomain python3[69797]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:17 np0005532585.localdomain sudo[69795]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:17 np0005532585.localdomain sudo[69811]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnizwxkuswapzbiehatkbczkdhrtwzre ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:17 np0005532585.localdomain sudo[69811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:17 np0005532585.localdomain python3[69813]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:17 np0005532585.localdomain sudo[69811]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:17 np0005532585.localdomain sudo[69827]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctinabxsqnudbphsehhlvwihmzrbzgnk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:17 np0005532585.localdomain sudo[69827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:18 np0005532585.localdomain python3[69829]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:12:18 np0005532585.localdomain sudo[69827]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:18 np0005532585.localdomain sudo[69888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihlwdgqdbgbzlngoyyjteazzocabcufh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:18 np0005532585.localdomain sudo[69888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:18 np0005532585.localdomain python3[69890]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:18 np0005532585.localdomain sudo[69888]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:18 np0005532585.localdomain sudo[69917]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpgawgdirjkabihrgomdfqfrvbxdoott ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:18 np0005532585.localdomain sudo[69917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:19 np0005532585.localdomain python3[69919]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:19 np0005532585.localdomain sudo[69917]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:19 np0005532585.localdomain sudo[69946]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktrmqsietrrierxvimvbtwywfphzmkgi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:19 np0005532585.localdomain sudo[69946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:19 np0005532585.localdomain python3[69948]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:19 np0005532585.localdomain sudo[69946]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:19 np0005532585.localdomain sudo[69975]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrymxmhoonvcixmtxamarykjyckbfkwl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:19 np0005532585.localdomain sudo[69975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:20 np0005532585.localdomain python3[69977]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:20 np0005532585.localdomain sudo[69975]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:20 np0005532585.localdomain sudo[70004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndsgcthttqvpgpmixguxexiitvxosxge ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:20 np0005532585.localdomain sudo[70004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:20 np0005532585.localdomain python3[70006]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:20 np0005532585.localdomain sudo[70004]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:20 np0005532585.localdomain sudo[70033]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzryhauoxngoozqpsuxyogdmfipxzgli ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:20 np0005532585.localdomain sudo[70033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:20 np0005532585.localdomain python3[70035]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:20 np0005532585.localdomain sudo[70033]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:21 np0005532585.localdomain sudo[70049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iggzslcfoibfbqasxzkaaaolwmvuqlug ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:21 np0005532585.localdomain sudo[70049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:21 np0005532585.localdomain python3[70051]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 08:12:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:21 np0005532585.localdomain systemd-rc-local-generator[70076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:21 np0005532585.localdomain systemd-sysv-generator[70080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:21 np0005532585.localdomain sudo[70049]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:21 np0005532585.localdomain sudo[70102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzcpzobeghykcwllsbvbkingrzwpbfbd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:21 np0005532585.localdomain sudo[70102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:22 np0005532585.localdomain python3[70104]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:22 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:22 np0005532585.localdomain systemd-sysv-generator[70134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:22 np0005532585.localdomain systemd-rc-local-generator[70130]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:22 np0005532585.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 23 08:12:22 np0005532585.localdomain tripleo-start-podman-container[70144]: Creating additional drop-in dependency for "ceilometer_agent_compute" (6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9)
Nov 23 08:12:22 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:22 np0005532585.localdomain systemd-sysv-generator[70207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:22 np0005532585.localdomain systemd-rc-local-generator[70203]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:23 np0005532585.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 23 08:12:23 np0005532585.localdomain sudo[70102]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:23 np0005532585.localdomain sudo[70227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxnxgwscbecwyfbprcmvasukxkpadmlf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:23 np0005532585.localdomain sudo[70227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:23 np0005532585.localdomain python3[70229]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:23 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:24 np0005532585.localdomain systemd-rc-local-generator[70254]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:24 np0005532585.localdomain systemd-sysv-generator[70261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Nov 23 08:12:24 np0005532585.localdomain sudo[70227]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 23 08:12:24 np0005532585.localdomain sudo[70295]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwaviunjfypyjojwzyyyzmcmmilzfgxr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Activating special unit Exit the Session...
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped target Main User Target.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped target Basic System.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped target Paths.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped target Sockets.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped target Timers.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Closed D-Bus User Message Bus Socket.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Stopped Create User's Volatile Files and Directories.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Removed slice User Application Slice.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Reached target Shutdown.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Finished Exit the Session.
Nov 23 08:12:24 np0005532585.localdomain systemd[69521]: Reached target Exit the Session.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 23 08:12:24 np0005532585.localdomain sudo[70295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 08:12:24 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 23 08:12:24 np0005532585.localdomain python3[70297]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:26 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:26 np0005532585.localdomain systemd-rc-local-generator[70327]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:26 np0005532585.localdomain systemd-sysv-generator[70332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:26 np0005532585.localdomain systemd[1]: Starting logrotate_crond container...
Nov 23 08:12:26 np0005532585.localdomain systemd[1]: Started logrotate_crond container.
Nov 23 08:12:26 np0005532585.localdomain sudo[70295]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:26 np0005532585.localdomain sudo[70364]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjufqabtfcqgbcsflocxtghrscqoqcks ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:26 np0005532585.localdomain sudo[70364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:27 np0005532585.localdomain python3[70366]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:27 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:27 np0005532585.localdomain systemd-rc-local-generator[70391]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:27 np0005532585.localdomain systemd-sysv-generator[70396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:27 np0005532585.localdomain systemd[1]: Starting nova_migration_target container...
Nov 23 08:12:27 np0005532585.localdomain systemd[1]: Started nova_migration_target container.
Nov 23 08:12:27 np0005532585.localdomain sudo[70364]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:28 np0005532585.localdomain sudo[70430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfciizcfwmspbztqstglfyntxxrqqdxb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:28 np0005532585.localdomain sudo[70430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:28 np0005532585.localdomain python3[70432]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:28 np0005532585.localdomain systemd-rc-local-generator[70455]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:28 np0005532585.localdomain systemd-sysv-generator[70459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:28 np0005532585.localdomain systemd[1]: Starting ovn_controller container...
Nov 23 08:12:28 np0005532585.localdomain tripleo-start-podman-container[70471]: Creating additional drop-in dependency for "ovn_controller" (99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23)
Nov 23 08:12:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:28 np0005532585.localdomain systemd-sysv-generator[70535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:28 np0005532585.localdomain systemd-rc-local-generator[70531]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:29 np0005532585.localdomain systemd[1]: Started ovn_controller container.
Nov 23 08:12:29 np0005532585.localdomain sudo[70430]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:29 np0005532585.localdomain sudo[70554]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjnljdrjxcavogiuyjrtqwzaxkatzozs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:12:29 np0005532585.localdomain sudo[70554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:29 np0005532585.localdomain python3[70556]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:12:29 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:12:29 np0005532585.localdomain systemd-sysv-generator[70586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:12:29 np0005532585.localdomain systemd-rc-local-generator[70582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:12:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:12:30 np0005532585.localdomain systemd[1]: Starting ovn_metadata_agent container...
Nov 23 08:12:30 np0005532585.localdomain systemd[1]: Started ovn_metadata_agent container.
Nov 23 08:12:30 np0005532585.localdomain sudo[70554]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:30 np0005532585.localdomain sudo[70635]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wozaaqctolhrpadxrhldrgswtqinqaju ; /usr/bin/python3
Nov 23 08:12:30 np0005532585.localdomain sudo[70635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:30 np0005532585.localdomain python3[70637]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:30 np0005532585.localdomain sudo[70635]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:31 np0005532585.localdomain sudo[70683]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phfyoegzqbopkrpivpiztqibuefmwvxm ; /usr/bin/python3
Nov 23 08:12:31 np0005532585.localdomain sudo[70683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:31 np0005532585.localdomain sudo[70683]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:31 np0005532585.localdomain sudo[70726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpzuchmtivtdsbotshxmkzowflxmzyay ; /usr/bin/python3
Nov 23 08:12:31 np0005532585.localdomain sudo[70726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:31 np0005532585.localdomain sudo[70726]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:31 np0005532585.localdomain sudo[70756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyxymnjdsegwyypyhubpxsvbocpwdplf ; /usr/bin/python3
Nov 23 08:12:31 np0005532585.localdomain sudo[70756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:31 np0005532585.localdomain python3[70758]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005532585 step=4 update_config_hash_only=False
Nov 23 08:12:32 np0005532585.localdomain sudo[70756]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:32 np0005532585.localdomain sudo[70772]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgsdymzvsubqiaibodugdiuregxasonc ; /usr/bin/python3
Nov 23 08:12:32 np0005532585.localdomain sudo[70772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:32 np0005532585.localdomain python3[70774]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:12:32 np0005532585.localdomain sudo[70772]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:32 np0005532585.localdomain sudo[70789]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okrbwofzktbsrwljetwgpggqhjvcicoq ; /usr/bin/python3
Nov 23 08:12:32 np0005532585.localdomain sudo[70789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:12:32 np0005532585.localdomain python3[70791]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 08:12:32 np0005532585.localdomain sudo[70789]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:12:37 np0005532585.localdomain podman[70792]: 2025-11-23 08:12:37.024823496 +0000 UTC m=+0.077413121 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:12:37 np0005532585.localdomain podman[70792]: 2025-11-23 08:12:37.037394359 +0000 UTC m=+0.089983994 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, container_name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 23 08:12:37 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:12:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:12:38 np0005532585.localdomain podman[70812]: 2025-11-23 08:12:38.003016719 +0000 UTC m=+0.067515319 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3)
Nov 23 08:12:38 np0005532585.localdomain podman[70812]: 2025-11-23 08:12:38.039389927 +0000 UTC m=+0.103888467 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:12:38 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:12:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:12:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:12:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: tmp-crun.PKd3dw.mount: Deactivated successfully.
Nov 23 08:12:41 np0005532585.localdomain podman[70835]: 2025-11-23 08:12:41.014922313 +0000 UTC m=+0.067978703 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: tmp-crun.lm4kOZ.mount: Deactivated successfully.
Nov 23 08:12:41 np0005532585.localdomain podman[70834]: 2025-11-23 08:12:41.061654187 +0000 UTC m=+0.113706127 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Nov 23 08:12:41 np0005532585.localdomain podman[70833]: 2025-11-23 08:12:41.061819322 +0000 UTC m=+0.118055939 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 23 08:12:41 np0005532585.localdomain podman[70834]: 2025-11-23 08:12:41.121521101 +0000 UTC m=+0.173573091 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute)
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:12:41 np0005532585.localdomain podman[70833]: 2025-11-23 08:12:41.146356318 +0000 UTC m=+0.202592905 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64)
Nov 23 08:12:41 np0005532585.localdomain podman[70835]: 2025-11-23 08:12:41.144579874 +0000 UTC m=+0.197636294 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:12:41 np0005532585.localdomain podman[70902]: 2025-11-23 08:12:41.241817198 +0000 UTC m=+0.075117511 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:12:41 np0005532585.localdomain podman[70902]: 2025-11-23 08:12:41.616515058 +0000 UTC m=+0.449815391 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Nov 23 08:12:41 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:12:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:12:44 np0005532585.localdomain systemd[1]: tmp-crun.zzxHOY.mount: Deactivated successfully.
Nov 23 08:12:44 np0005532585.localdomain podman[70924]: 2025-11-23 08:12:44.04629369 +0000 UTC m=+0.099628737 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1761123044)
Nov 23 08:12:44 np0005532585.localdomain podman[70924]: 2025-11-23 08:12:44.256355242 +0000 UTC m=+0.309690199 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 
17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible)
Nov 23 08:12:44 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:12:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:12:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:12:45 np0005532585.localdomain systemd[1]: tmp-crun.v8Rvml.mount: Deactivated successfully.
Nov 23 08:12:45 np0005532585.localdomain podman[70952]: 2025-11-23 08:12:45.030547127 +0000 UTC m=+0.088313892 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:12:45 np0005532585.localdomain systemd[1]: tmp-crun.Lz0prH.mount: Deactivated successfully.
Nov 23 08:12:45 np0005532585.localdomain podman[70953]: 2025-11-23 08:12:45.08314326 +0000 UTC m=+0.138788090 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git)
Nov 23 08:12:45 np0005532585.localdomain podman[70952]: 2025-11-23 08:12:45.094381313 +0000 UTC m=+0.152148048 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:12:45 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:12:45 np0005532585.localdomain podman[70953]: 2025-11-23 08:12:45.109561326 +0000 UTC m=+0.165206186 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:12:45 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:12:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 08:12:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 08:12:55 np0005532585.localdomain sudo[71001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:12:55 np0005532585.localdomain sudo[71001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:55 np0005532585.localdomain sudo[71001]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:55 np0005532585.localdomain sudo[71016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 08:12:55 np0005532585.localdomain sudo[71016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:55 np0005532585.localdomain sudo[71016]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:56 np0005532585.localdomain sudo[71053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:12:56 np0005532585.localdomain sudo[71053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:56 np0005532585.localdomain sudo[71053]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:56 np0005532585.localdomain sudo[71068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:12:56 np0005532585.localdomain sudo[71068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:56 np0005532585.localdomain sudo[71068]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:56 np0005532585.localdomain sudo[71116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:12:56 np0005532585.localdomain sudo[71116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:56 np0005532585.localdomain sudo[71116]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:56 np0005532585.localdomain sudo[71131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 08:12:56 np0005532585.localdomain sudo[71131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 2025-11-23 08:12:57.439059184 +0000 UTC m=+0.075650636 container create f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vendor=Red Hat, Inc., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Nov 23 08:12:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75.scope.
Nov 23 08:12:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 2025-11-23 08:12:57.407608836 +0000 UTC m=+0.044200328 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 2025-11-23 08:12:57.548314193 +0000 UTC m=+0.184905645 container init f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, io.openshift.expose-services=, ceph=True, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc.)
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 2025-11-23 08:12:57.55869897 +0000 UTC m=+0.195290432 container start f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55)
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 2025-11-23 08:12:57.558981928 +0000 UTC m=+0.195573440 container attach f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, io.buildah.version=1.33.12, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7)
Nov 23 08:12:57 np0005532585.localdomain elastic_bhaskara[71202]: 167 167
Nov 23 08:12:57 np0005532585.localdomain systemd[1]: libpod-f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75.scope: Deactivated successfully.
Nov 23 08:12:57 np0005532585.localdomain podman[71187]: 2025-11-23 08:12:57.564476446 +0000 UTC m=+0.201067948 container died f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, version=7)
Nov 23 08:12:57 np0005532585.localdomain podman[71207]: 2025-11-23 08:12:57.658655067 +0000 UTC m=+0.084313621 container remove f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, RELEASE=main)
Nov 23 08:12:57 np0005532585.localdomain systemd[1]: libpod-conmon-f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75.scope: Deactivated successfully.
Nov 23 08:12:57 np0005532585.localdomain podman[71229]: 
Nov 23 08:12:57 np0005532585.localdomain podman[71229]: 2025-11-23 08:12:57.864945264 +0000 UTC m=+0.081566177 container create c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_BRANCH=main)
Nov 23 08:12:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c.scope.
Nov 23 08:12:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:12:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 08:12:57 np0005532585.localdomain podman[71229]: 2025-11-23 08:12:57.831473394 +0000 UTC m=+0.048094307 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 08:12:57 np0005532585.localdomain podman[71229]: 2025-11-23 08:12:57.931197924 +0000 UTC m=+0.147818847 container init c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True)
Nov 23 08:12:57 np0005532585.localdomain podman[71229]: 2025-11-23 08:12:57.938053162 +0000 UTC m=+0.154674075 container start c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 08:12:57 np0005532585.localdomain podman[71229]: 2025-11-23 08:12:57.938252708 +0000 UTC m=+0.154873631 container attach c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-type=git)
Nov 23 08:12:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f5c66cdc19ec9d720046bbff2a6ffc9a66c1f02b6abc0c7fcab935f67be64f1a-merged.mount: Deactivated successfully.
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]: [
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:     {
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "available": false,
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "ceph_device": false,
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "lsm_data": {},
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "lvs": [],
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "path": "/dev/sr0",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "rejected_reasons": [
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "Has a FileSystem",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "Insufficient space (<5GB)"
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         ],
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         "sys_api": {
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "actuators": null,
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "device_nodes": "sr0",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "human_readable_size": "482.00 KB",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "id_bus": "ata",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "model": "QEMU DVD-ROM",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "nr_requests": "2",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "partitions": {},
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "path": "/dev/sr0",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "removable": "1",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "rev": "2.5+",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "ro": "0",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "rotational": "1",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "sas_address": "",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "sas_device_handle": "",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "scheduler_mode": "mq-deadline",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "sectors": 0,
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "sectorsize": "2048",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "size": 493568.0,
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "support_discard": "0",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "type": "disk",
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:             "vendor": "QEMU"
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:         }
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]:     }
Nov 23 08:12:58 np0005532585.localdomain boring_mcnulty[71245]: ]
Nov 23 08:12:58 np0005532585.localdomain systemd[1]: libpod-c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c.scope: Deactivated successfully.
Nov 23 08:12:58 np0005532585.localdomain podman[71229]: 2025-11-23 08:12:58.86980309 +0000 UTC m=+1.086424053 container died c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, release=553, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container)
Nov 23 08:12:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd-merged.mount: Deactivated successfully.
Nov 23 08:12:58 np0005532585.localdomain podman[73262]: 2025-11-23 08:12:58.944146425 +0000 UTC m=+0.068972503 container remove c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 23 08:12:58 np0005532585.localdomain systemd[1]: libpod-conmon-c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c.scope: Deactivated successfully.
Nov 23 08:12:58 np0005532585.localdomain sudo[71131]: pam_unix(sudo:session): session closed for user root
Nov 23 08:12:59 np0005532585.localdomain sudo[73277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:12:59 np0005532585.localdomain sudo[73277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:12:59 np0005532585.localdomain sudo[73277]: pam_unix(sudo:session): session closed for user root
Nov 23 08:13:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:13:08 np0005532585.localdomain podman[73292]: 2025-11-23 08:13:08.037149235 +0000 UTC m=+0.088617542 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, architecture=x86_64, container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:13:08 np0005532585.localdomain podman[73292]: 2025-11-23 08:13:08.072825172 +0000 UTC m=+0.124293479 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Nov 23 08:13:08 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:13:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:13:08 np0005532585.localdomain systemd[1]: tmp-crun.EI3ST2.mount: Deactivated successfully.
Nov 23 08:13:08 np0005532585.localdomain podman[73310]: 2025-11-23 08:13:08.202064821 +0000 UTC m=+0.088044575 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, version=17.1.12, architecture=x86_64, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:13:08 np0005532585.localdomain podman[73310]: 2025-11-23 08:13:08.23716244 +0000 UTC m=+0.123142164 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, container_name=iscsid, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1)
Nov 23 08:13:08 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:13:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:13:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:13:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:13:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:13:12 np0005532585.localdomain podman[73330]: 2025-11-23 08:13:12.038243945 +0000 UTC m=+0.089796708 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible)
Nov 23 08:13:12 np0005532585.localdomain podman[73329]: 2025-11-23 08:13:12.092796998 +0000 UTC m=+0.144947479 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Nov 23 08:13:12 np0005532585.localdomain podman[73329]: 2025-11-23 08:13:12.105253657 +0000 UTC m=+0.157404078 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, version=17.1.12, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:13:12 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:13:12 np0005532585.localdomain podman[73331]: 2025-11-23 08:13:12.143209474 +0000 UTC m=+0.189790075 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4)
Nov 23 08:13:12 np0005532585.localdomain podman[73335]: 2025-11-23 08:13:12.200999925 +0000 UTC m=+0.245185503 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4)
Nov 23 08:13:12 np0005532585.localdomain podman[73330]: 2025-11-23 08:13:12.217799817 +0000 UTC m=+0.269352580 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Nov 23 08:13:12 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:13:12 np0005532585.localdomain podman[73331]: 2025-11-23 08:13:12.272537996 +0000 UTC m=+0.319118527 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:13:12 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:13:12 np0005532585.localdomain podman[73335]: 2025-11-23 08:13:12.58735221 +0000 UTC m=+0.631537758 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute)
Nov 23 08:13:12 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:13:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:13:15 np0005532585.localdomain podman[73422]: 2025-11-23 08:13:15.031584374 +0000 UTC m=+0.083546387 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr)
Nov 23 08:13:15 np0005532585.localdomain podman[73422]: 2025-11-23 08:13:15.235273132 +0000 UTC m=+0.287235085 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:13:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:13:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:13:15 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:13:15 np0005532585.localdomain podman[73452]: 2025-11-23 08:13:15.345962415 +0000 UTC m=+0.077638917 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=tripleo_ansible)
Nov 23 08:13:15 np0005532585.localdomain podman[73453]: 2025-11-23 08:13:15.32511949 +0000 UTC m=+0.056880275 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:13:15 np0005532585.localdomain podman[73453]: 2025-11-23 08:13:15.409491262 +0000 UTC m=+0.141252107 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64)
Nov 23 08:13:15 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:13:15 np0005532585.localdomain podman[73452]: 2025-11-23 08:13:15.423413466 +0000 UTC m=+0.155090018 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1761123044, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public)
Nov 23 08:13:15 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:13:16 np0005532585.localdomain systemd[1]: tmp-crun.mE9koz.mount: Deactivated successfully.
Nov 23 08:13:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:13:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:13:39 np0005532585.localdomain systemd[1]: tmp-crun.jPlrRA.mount: Deactivated successfully.
Nov 23 08:13:39 np0005532585.localdomain podman[73499]: 2025-11-23 08:13:39.012789413 +0000 UTC m=+0.075388909 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, release=1761123044, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4)
Nov 23 08:13:39 np0005532585.localdomain systemd[1]: tmp-crun.5u5sJN.mount: Deactivated successfully.
Nov 23 08:13:39 np0005532585.localdomain podman[73500]: 2025-11-23 08:13:39.061270451 +0000 UTC m=+0.121581837 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Nov 23 08:13:39 np0005532585.localdomain podman[73500]: 2025-11-23 08:13:39.067440819 +0000 UTC m=+0.127752205 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:13:39 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:13:39 np0005532585.localdomain podman[73499]: 2025-11-23 08:13:39.079795865 +0000 UTC m=+0.142395381 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:13:39 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:13:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:13:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:13:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:13:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:13:43 np0005532585.localdomain podman[73538]: 2025-11-23 08:13:43.008643906 +0000 UTC m=+0.066386125 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, maintainer=OpenStack TripleO Team, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.openshift.expose-services=)
Nov 23 08:13:43 np0005532585.localdomain podman[73538]: 2025-11-23 08:13:43.044446686 +0000 UTC m=+0.102188975 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, release=1761123044)
Nov 23 08:13:43 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:13:43 np0005532585.localdomain podman[73539]: 2025-11-23 08:13:43.063239059 +0000 UTC m=+0.117382328 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, architecture=x86_64, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:13:43 np0005532585.localdomain systemd[1]: tmp-crun.9SS82T.mount: Deactivated successfully.
Nov 23 08:13:43 np0005532585.localdomain podman[73540]: 2025-11-23 08:13:43.123666121 +0000 UTC m=+0.174537131 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:13:43 np0005532585.localdomain podman[73540]: 2025-11-23 08:13:43.152288093 +0000 UTC m=+0.203159073 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:13:43 np0005532585.localdomain podman[73546]: 2025-11-23 08:13:43.170533169 +0000 UTC m=+0.217427717 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, version=17.1.12, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044)
Nov 23 08:13:43 np0005532585.localdomain podman[73539]: 2025-11-23 08:13:43.193710956 +0000 UTC m=+0.247854285 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:13:43 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:13:43 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:13:43 np0005532585.localdomain podman[73546]: 2025-11-23 08:13:43.514157292 +0000 UTC m=+0.561051810 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, distribution-scope=public, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, 
build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:13:43 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:13:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:13:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:13:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:13:46 np0005532585.localdomain systemd[1]: tmp-crun.VRNBhO.mount: Deactivated successfully.
Nov 23 08:13:46 np0005532585.localdomain podman[73630]: 2025-11-23 08:13:46.028146351 +0000 UTC m=+0.084623700 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc.)
Nov 23 08:13:46 np0005532585.localdomain systemd[1]: tmp-crun.9leR8K.mount: Deactivated successfully.
Nov 23 08:13:46 np0005532585.localdomain podman[73629]: 2025-11-23 08:13:46.081276381 +0000 UTC m=+0.140481973 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 08:13:46 np0005532585.localdomain podman[73631]: 2025-11-23 08:13:46.130153821 +0000 UTC m=+0.183303988 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, version=17.1.12, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:13:46 np0005532585.localdomain podman[73630]: 2025-11-23 08:13:46.158647829 +0000 UTC m=+0.215125167 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1761123044, container_name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:13:46 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:13:46 np0005532585.localdomain podman[73631]: 2025-11-23 08:13:46.213546373 +0000 UTC m=+0.266696600 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Nov 23 08:13:46 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:13:46 np0005532585.localdomain podman[73629]: 2025-11-23 08:13:46.273288953 +0000 UTC m=+0.332494605 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 23 08:13:46 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:14:00 np0005532585.localdomain sudo[73704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:14:00 np0005532585.localdomain sudo[73704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:14:00 np0005532585.localdomain sudo[73704]: pam_unix(sudo:session): session closed for user root
Nov 23 08:14:00 np0005532585.localdomain sudo[73719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:14:00 np0005532585.localdomain sudo[73719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:14:00 np0005532585.localdomain sudo[73719]: pam_unix(sudo:session): session closed for user root
Nov 23 08:14:01 np0005532585.localdomain sudo[73765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:14:01 np0005532585.localdomain sudo[73765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:14:01 np0005532585.localdomain sudo[73765]: pam_unix(sudo:session): session closed for user root
Nov 23 08:14:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:14:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:14:10 np0005532585.localdomain podman[73780]: 2025-11-23 08:14:10.029662266 +0000 UTC m=+0.087982513 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git)
Nov 23 08:14:10 np0005532585.localdomain podman[73780]: 2025-11-23 08:14:10.039104073 +0000 UTC m=+0.097424280 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd)
Nov 23 08:14:10 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:14:10 np0005532585.localdomain systemd[1]: tmp-crun.pZxztP.mount: Deactivated successfully.
Nov 23 08:14:10 np0005532585.localdomain podman[73781]: 2025-11-23 08:14:10.095035862 +0000 UTC m=+0.149918645 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-iscsid, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:14:10 np0005532585.localdomain podman[73781]: 2025-11-23 08:14:10.106142309 +0000 UTC m=+0.161025032 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 23 08:14:10 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:14:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:14:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:14:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:14:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:14:14 np0005532585.localdomain systemd[1]: tmp-crun.imXMqm.mount: Deactivated successfully.
Nov 23 08:14:14 np0005532585.localdomain podman[73824]: 2025-11-23 08:14:14.05308901 +0000 UTC m=+0.103158484 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:14:14 np0005532585.localdomain podman[73820]: 2025-11-23 08:14:14.012770696 +0000 UTC m=+0.069841773 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vcs-type=git)
Nov 23 08:14:14 np0005532585.localdomain podman[73820]: 2025-11-23 08:14:14.095135547 +0000 UTC m=+0.152206584 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible)
Nov 23 08:14:14 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:14:14 np0005532585.localdomain podman[73821]: 2025-11-23 08:14:14.142322821 +0000 UTC m=+0.194308433 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z)
Nov 23 08:14:14 np0005532585.localdomain podman[73822]: 2025-11-23 08:14:14.18607091 +0000 UTC m=+0.236624319 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.)
Nov 23 08:14:14 np0005532585.localdomain podman[73821]: 2025-11-23 08:14:14.201751355 +0000 UTC m=+0.253736957 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Nov 23 08:14:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:14:14 np0005532585.localdomain podman[73822]: 2025-11-23 08:14:14.241016058 +0000 UTC m=+0.291569447 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:14:14 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:14:14 np0005532585.localdomain podman[73824]: 2025-11-23 08:14:14.407678021 +0000 UTC m=+0.457747495 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible)
Nov 23 08:14:14 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:14:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:14:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:14:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:14:17 np0005532585.localdomain systemd[1]: tmp-crun.w1lMZb.mount: Deactivated successfully.
Nov 23 08:14:17 np0005532585.localdomain podman[73913]: 2025-11-23 08:14:17.025971005 +0000 UTC m=+0.085108877 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 23 08:14:17 np0005532585.localdomain podman[73915]: 2025-11-23 08:14:17.070737194 +0000 UTC m=+0.124284576 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 23 08:14:17 np0005532585.localdomain podman[73915]: 2025-11-23 08:14:17.091130224 +0000 UTC m=+0.144677616 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, release=1761123044, tcib_managed=true, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team)
Nov 23 08:14:17 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:14:17 np0005532585.localdomain podman[73914]: 2025-11-23 08:14:17.175193977 +0000 UTC m=+0.230984728 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.12)
Nov 23 08:14:17 np0005532585.localdomain podman[73914]: 2025-11-23 08:14:17.218336817 +0000 UTC m=+0.274127568 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:14:17 np0005532585.localdomain podman[73913]: 2025-11-23 08:14:17.229231918 +0000 UTC m=+0.288369770 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red 
Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4)
Nov 23 08:14:17 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:14:17 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:14:18 np0005532585.localdomain systemd[1]: tmp-crun.HvtjGL.mount: Deactivated successfully.
Nov 23 08:14:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:14:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:14:41 np0005532585.localdomain podman[73987]: 2025-11-23 08:14:41.025924192 +0000 UTC m=+0.077760013 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid)
Nov 23 08:14:41 np0005532585.localdomain podman[73987]: 2025-11-23 08:14:41.03605793 +0000 UTC m=+0.087893741 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:14:41 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:14:41 np0005532585.localdomain systemd[1]: tmp-crun.vnfKNZ.mount: Deactivated successfully.
Nov 23 08:14:41 np0005532585.localdomain podman[73986]: 2025-11-23 08:14:41.084004717 +0000 UTC m=+0.139033515 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z)
Nov 23 08:14:41 np0005532585.localdomain podman[73986]: 2025-11-23 08:14:41.09924836 +0000 UTC m=+0.154277158 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team)
Nov 23 08:14:41 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:14:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:14:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:14:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:14:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:14:45 np0005532585.localdomain systemd[1]: tmp-crun.H5U0qI.mount: Deactivated successfully.
Nov 23 08:14:45 np0005532585.localdomain podman[74028]: 2025-11-23 08:14:45.039539498 +0000 UTC m=+0.090936122 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:14:45 np0005532585.localdomain systemd[1]: tmp-crun.qjPP1y.mount: Deactivated successfully.
Nov 23 08:14:45 np0005532585.localdomain podman[74026]: 2025-11-23 08:14:45.077702548 +0000 UTC m=+0.136165637 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:14:45 np0005532585.localdomain podman[74028]: 2025-11-23 08:14:45.096280293 +0000 UTC m=+0.147676947 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 23 08:14:45 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:14:45 np0005532585.localdomain podman[74026]: 2025-11-23 08:14:45.114415823 +0000 UTC m=+0.172878902 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:14:45 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:14:45 np0005532585.localdomain podman[74027]: 2025-11-23 08:14:45.183178612 +0000 UTC m=+0.239460225 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:14:45 np0005532585.localdomain podman[74030]: 2025-11-23 08:14:45.229354024 +0000 UTC m=+0.278257623 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:14:45 np0005532585.localdomain podman[74027]: 2025-11-23 08:14:45.237198553 +0000 UTC m=+0.293480226 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Nov 23 08:14:45 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:14:45 np0005532585.localdomain podman[74030]: 2025-11-23 08:14:45.617885777 +0000 UTC m=+0.666789376 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:14:45 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:14:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:14:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:14:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:14:48 np0005532585.localdomain systemd[1]: tmp-crun.tCQO8K.mount: Deactivated successfully.
Nov 23 08:14:48 np0005532585.localdomain podman[74118]: 2025-11-23 08:14:48.053673684 +0000 UTC m=+0.110383753 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Nov 23 08:14:48 np0005532585.localdomain podman[74119]: 2025-11-23 08:14:48.003864482 +0000 UTC m=+0.057886539 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64)
Nov 23 08:14:48 np0005532585.localdomain podman[74117]: 2025-11-23 08:14:48.032413359 +0000 UTC m=+0.088360145 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, version=17.1.12)
Nov 23 08:14:48 np0005532585.localdomain podman[74118]: 2025-11-23 08:14:48.089591215 +0000 UTC m=+0.146301284 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:14:48 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:14:48 np0005532585.localdomain podman[74119]: 2025-11-23 08:14:48.138218043 +0000 UTC m=+0.192240100 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64)
Nov 23 08:14:48 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:14:48 np0005532585.localdomain podman[74117]: 2025-11-23 08:14:48.215027276 +0000 UTC m=+0.270974152 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Nov 23 08:14:48 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:15:01 np0005532585.localdomain sudo[74189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:15:01 np0005532585.localdomain sudo[74189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:15:01 np0005532585.localdomain sudo[74189]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:02 np0005532585.localdomain sudo[74204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 08:15:02 np0005532585.localdomain sudo[74204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:15:02 np0005532585.localdomain podman[74287]: 2025-11-23 08:15:02.733780393 +0000 UTC m=+0.086467227 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 08:15:02 np0005532585.localdomain podman[74287]: 2025-11-23 08:15:02.845318902 +0000 UTC m=+0.198005716 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Nov 23 08:15:03 np0005532585.localdomain sudo[74204]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:03 np0005532585.localdomain sudo[74356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:15:03 np0005532585.localdomain sudo[74356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:15:03 np0005532585.localdomain sudo[74356]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:03 np0005532585.localdomain sudo[74371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:15:03 np0005532585.localdomain sudo[74371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:15:03 np0005532585.localdomain sudo[74371]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:04 np0005532585.localdomain sudo[74417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:15:04 np0005532585.localdomain sudo[74417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:15:04 np0005532585.localdomain sudo[74417]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:15:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:15:12 np0005532585.localdomain podman[74433]: 2025-11-23 08:15:12.040236714 +0000 UTC m=+0.092667836 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:15:12 np0005532585.localdomain podman[74433]: 2025-11-23 08:15:12.075739532 +0000 UTC m=+0.128170644 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:15:12 np0005532585.localdomain podman[74432]: 2025-11-23 08:15:12.085175679 +0000 UTC m=+0.137120786 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, container_name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:15:12 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:15:12 np0005532585.localdomain podman[74432]: 2025-11-23 08:15:12.096255326 +0000 UTC m=+0.148200423 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:15:12 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:15:15 np0005532585.localdomain sudo[74513]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glkignuxxmpszihygqqfgochuplrpmmg ; /usr/bin/python3
Nov 23 08:15:15 np0005532585.localdomain sudo[74513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: tmp-crun.PvSLM6.mount: Deactivated successfully.
Nov 23 08:15:15 np0005532585.localdomain podman[74517]: 2025-11-23 08:15:15.480239527 +0000 UTC m=+0.096176223 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:15:15 np0005532585.localdomain podman[74517]: 2025-11-23 08:15:15.518136118 +0000 UTC m=+0.134072834 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:15:15 np0005532585.localdomain podman[74518]: 2025-11-23 08:15:15.532463403 +0000 UTC m=+0.144077307 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:15:15 np0005532585.localdomain python3[74515]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:15 np0005532585.localdomain sudo[74513]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:15 np0005532585.localdomain podman[74516]: 2025-11-23 08:15:15.580620166 +0000 UTC m=+0.196561772 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.expose-services=)
Nov 23 08:15:15 np0005532585.localdomain podman[74516]: 2025-11-23 08:15:15.587875736 +0000 UTC m=+0.203817302 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:15:15 np0005532585.localdomain podman[74518]: 2025-11-23 08:15:15.642616869 +0000 UTC m=+0.254230783 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, version=17.1.12, architecture=x86_64)
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:15:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:15:15 np0005532585.localdomain podman[74611]: 2025-11-23 08:15:15.745752832 +0000 UTC m=+0.066425859 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:15:15 np0005532585.localdomain sudo[74640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uikmmnlewwwlonbuswogpwiqvccgffiz ; /usr/bin/python3
Nov 23 08:15:15 np0005532585.localdomain sudo[74640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:15 np0005532585.localdomain python3[74649]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885715.2035592-113720-120016200137482/source _original_basename=tmpdq5n753y follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:15 np0005532585.localdomain sudo[74640]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:16 np0005532585.localdomain podman[74611]: 2025-11-23 08:15:16.143443583 +0000 UTC m=+0.464116640 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:15:16 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:15:16 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:15:16 np0005532585.localdomain recover_tripleo_nova_virtqemud[74668]: 61756
Nov 23 08:15:16 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:15:16 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:15:16 np0005532585.localdomain sudo[74682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubfcfzfgtpfizmrepinenxnxljzdhduu ; /usr/bin/python3
Nov 23 08:15:16 np0005532585.localdomain sudo[74682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:16 np0005532585.localdomain python3[74684]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:15:16 np0005532585.localdomain sudo[74682]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:17 np0005532585.localdomain sudo[74732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrmnxsrjhmdcixvfpwpwwfogbedddzmu ; /usr/bin/python3
Nov 23 08:15:17 np0005532585.localdomain sudo[74732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:17 np0005532585.localdomain sudo[74732]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:17 np0005532585.localdomain sudo[74750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbsyxzbwmtidhagvbowscrwvevjlvdze ; /usr/bin/python3
Nov 23 08:15:17 np0005532585.localdomain sudo[74750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:17 np0005532585.localdomain sudo[74750]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:18 np0005532585.localdomain sudo[74854]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izseuwqltdbbuuexubmqxyzwcfftxdpx ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885717.8290527-113848-201341338781547/async_wrapper.py 721452772396 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885717.8290527-113848-201341338781547/AnsiballZ_command.py _
Nov 23 08:15:18 np0005532585.localdomain sudo[74854]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 08:15:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:15:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:15:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:15:18 np0005532585.localdomain podman[74858]: 2025-11-23 08:15:18.274964778 +0000 UTC m=+0.078296839 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Nov 23 08:15:18 np0005532585.localdomain podman[74857]: 2025-11-23 08:15:18.256708914 +0000 UTC m=+0.066538362 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:15:18 np0005532585.localdomain podman[74893]: 2025-11-23 08:15:18.339394295 +0000 UTC m=+0.058915830 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:15:18 np0005532585.localdomain podman[74857]: 2025-11-23 08:15:18.340225231 +0000 UTC m=+0.150054679 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12)
Nov 23 08:15:18 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:15:18 np0005532585.localdomain podman[74858]: 2025-11-23 08:15:18.367992815 +0000 UTC m=+0.171324866 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:15:18 np0005532585.localdomain ansible-async_wrapper.py[74856]: Invoked with 721452772396 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885717.8290527-113848-201341338781547/AnsiballZ_command.py _
Nov 23 08:15:18 np0005532585.localdomain ansible-async_wrapper.py[74935]: Starting module and watcher
Nov 23 08:15:18 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:15:18 np0005532585.localdomain ansible-async_wrapper.py[74935]: Start watching 74936 (3600)
Nov 23 08:15:18 np0005532585.localdomain ansible-async_wrapper.py[74936]: Start module (74936)
Nov 23 08:15:18 np0005532585.localdomain ansible-async_wrapper.py[74856]: Return async_wrapper task started.
Nov 23 08:15:18 np0005532585.localdomain sudo[74854]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:18 np0005532585.localdomain podman[74893]: 2025-11-23 08:15:18.543189466 +0000 UTC m=+0.262710991 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git)
Nov 23 08:15:18 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:15:18 np0005532585.localdomain sudo[74954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehdfncecdantgkqldwyuzzbjmwuzzpgz ; /usr/bin/python3
Nov 23 08:15:18 np0005532585.localdomain sudo[74954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:18 np0005532585.localdomain python3[74956]: ansible-ansible.legacy.async_status Invoked with jid=721452772396.74856 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:15:18 np0005532585.localdomain sudo[74954]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]:    (file: /etc/puppet/hiera.yaml)
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]: Warning: Undefined variable '::deploy_config_name';
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]:    (file & line not available)
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]:    (file & line not available)
Nov 23 08:15:21 np0005532585.localdomain puppet-user[74948]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.22 seconds
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Notice: Applied catalog in 0.22 seconds
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Application:
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    Initial environment: production
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    Converged environment: production
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:          Run mode: user
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Changes:
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Events:
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Resources:
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:             Total: 19
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Time:
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:          Schedule: 0.00
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:           Package: 0.00
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:              Exec: 0.00
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:            Augeas: 0.01
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:              File: 0.02
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:           Service: 0.04
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    Transaction evaluation: 0.21
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    Catalog application: 0.22
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:    Config retrieval: 0.29
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:          Last run: 1763885722
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:        Filebucket: 0.00
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:             Total: 0.22
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]: Version:
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:            Config: 1763885721
Nov 23 08:15:22 np0005532585.localdomain puppet-user[74948]:            Puppet: 7.10.0
Nov 23 08:15:22 np0005532585.localdomain ansible-async_wrapper.py[74936]: Module complete (74936)
Nov 23 08:15:23 np0005532585.localdomain ansible-async_wrapper.py[74935]: Done in kid B.
Nov 23 08:15:28 np0005532585.localdomain sudo[75092]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlemitrosopqfufppxsccdvfhjmnmmwp ; /usr/bin/python3
Nov 23 08:15:28 np0005532585.localdomain sudo[75092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:29 np0005532585.localdomain python3[75094]: ansible-ansible.legacy.async_status Invoked with jid=721452772396.74856 mode=status _async_dir=/tmp/.ansible_async
Nov 23 08:15:29 np0005532585.localdomain sudo[75092]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:29 np0005532585.localdomain sudo[75108]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysubsjxswjvgtviibcuxkompwzktujqt ; /usr/bin/python3
Nov 23 08:15:29 np0005532585.localdomain sudo[75108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:29 np0005532585.localdomain python3[75110]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:15:29 np0005532585.localdomain sudo[75108]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:29 np0005532585.localdomain sudo[75124]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-steejkfujbecfjnseisazpoxhnxkaanb ; /usr/bin/python3
Nov 23 08:15:29 np0005532585.localdomain sudo[75124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:30 np0005532585.localdomain python3[75126]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:15:30 np0005532585.localdomain sudo[75124]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:30 np0005532585.localdomain sudo[75174]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzpdujlbquejwaaogcfftfhxdvusjtgi ; /usr/bin/python3
Nov 23 08:15:30 np0005532585.localdomain sudo[75174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:30 np0005532585.localdomain python3[75176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:30 np0005532585.localdomain sudo[75174]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:30 np0005532585.localdomain sudo[75192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dseoznegyzrfejzmvhhaqyzbihtnarij ; /usr/bin/python3
Nov 23 08:15:30 np0005532585.localdomain sudo[75192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:30 np0005532585.localdomain python3[75194]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp98se98qh recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 08:15:30 np0005532585.localdomain sudo[75192]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:31 np0005532585.localdomain sudo[75222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbrampshxcegnueeerhxcsgzajyckevj ; /usr/bin/python3
Nov 23 08:15:31 np0005532585.localdomain sudo[75222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:31 np0005532585.localdomain python3[75224]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:31 np0005532585.localdomain sudo[75222]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:31 np0005532585.localdomain sudo[75238]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyyvczmkbyjptyobrlhqqzphlfsiugbd ; /usr/bin/python3
Nov 23 08:15:31 np0005532585.localdomain sudo[75238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:32 np0005532585.localdomain sudo[75238]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:32 np0005532585.localdomain sudo[75327]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifgcbiufurikrzbmmvjgqwkzenyiuxkq ; /usr/bin/python3
Nov 23 08:15:32 np0005532585.localdomain sudo[75327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:32 np0005532585.localdomain python3[75329]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 08:15:32 np0005532585.localdomain sudo[75327]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:32 np0005532585.localdomain sudo[75346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrcqexjxpvvlsrknmnamxhkwouffqgwp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:32 np0005532585.localdomain sudo[75346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:33 np0005532585.localdomain python3[75348]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:33 np0005532585.localdomain sudo[75346]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:33 np0005532585.localdomain sudo[75362]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htclpjokgksvtyzgduqvoeqsgyyjtozf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:33 np0005532585.localdomain sudo[75362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:33 np0005532585.localdomain sudo[75362]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:33 np0005532585.localdomain sudo[75378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muvkywywcmjwvvtnzjhsvoojexpruhqm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:33 np0005532585.localdomain sudo[75378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:33 np0005532585.localdomain python3[75380]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:15:33 np0005532585.localdomain sudo[75378]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:34 np0005532585.localdomain sudo[75428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvclgkbaqejcacjowwbdmcnrityqftny ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:34 np0005532585.localdomain sudo[75428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:34 np0005532585.localdomain python3[75430]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:34 np0005532585.localdomain sudo[75428]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:34 np0005532585.localdomain sudo[75446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akrqihdkqpwnopvkrkstyrsvtxieqiqq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:34 np0005532585.localdomain sudo[75446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:34 np0005532585.localdomain python3[75448]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:34 np0005532585.localdomain sudo[75446]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:35 np0005532585.localdomain sudo[75508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whheppkevejxgqhqisorvgkhsyfctdjl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:35 np0005532585.localdomain sudo[75508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:35 np0005532585.localdomain python3[75510]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:35 np0005532585.localdomain sudo[75508]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:35 np0005532585.localdomain sudo[75526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uebnbcjetptujnkpbkvhqntbttageqxb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:35 np0005532585.localdomain sudo[75526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:35 np0005532585.localdomain python3[75528]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:35 np0005532585.localdomain sudo[75526]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:36 np0005532585.localdomain sudo[75588]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khxfcwqgpxqganfhejkdxxzwlhraifrl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:36 np0005532585.localdomain sudo[75588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:36 np0005532585.localdomain python3[75590]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:36 np0005532585.localdomain sudo[75588]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:36 np0005532585.localdomain sudo[75606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svwtpkuoqebrxgufbvfhhppsoohpnbgg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:36 np0005532585.localdomain sudo[75606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:36 np0005532585.localdomain python3[75608]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:36 np0005532585.localdomain sudo[75606]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:36 np0005532585.localdomain sudo[75668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnyaiupwcvbpistxzxdtbekgbrglvmgg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:36 np0005532585.localdomain sudo[75668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:37 np0005532585.localdomain python3[75670]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:37 np0005532585.localdomain sudo[75668]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:37 np0005532585.localdomain sudo[75686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xorpiqgzpzfmaiirlzqxndqykagvldbx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:37 np0005532585.localdomain sudo[75686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:37 np0005532585.localdomain python3[75688]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:37 np0005532585.localdomain sudo[75686]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:37 np0005532585.localdomain sudo[75716]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iufgosmluzwvswglcriqrdcmllhhnarw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:37 np0005532585.localdomain sudo[75716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:37 np0005532585.localdomain python3[75718]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:15:37 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:15:37 np0005532585.localdomain systemd-rc-local-generator[75738]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:15:37 np0005532585.localdomain systemd-sysv-generator[75742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:15:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:15:39 np0005532585.localdomain sudo[75716]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:39 np0005532585.localdomain sudo[75802]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdpmkmeapoitlhjwucakfpsaygpwlxlz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:39 np0005532585.localdomain sudo[75802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:39 np0005532585.localdomain python3[75804]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:39 np0005532585.localdomain sudo[75802]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:39 np0005532585.localdomain sudo[75820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcbigjlycmvrjqcpoicfuyfxrpgljcuy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:39 np0005532585.localdomain sudo[75820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:39 np0005532585.localdomain python3[75822]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:39 np0005532585.localdomain sudo[75820]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:40 np0005532585.localdomain sudo[75882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnryqpgaqwmejekgjqpfonocltwtfpgr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:40 np0005532585.localdomain sudo[75882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:40 np0005532585.localdomain python3[75884]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 08:15:40 np0005532585.localdomain sudo[75882]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:40 np0005532585.localdomain sudo[75900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smpqlucdttkxckfncfxkgltqxhjxxllj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:40 np0005532585.localdomain sudo[75900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:40 np0005532585.localdomain python3[75902]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:15:40 np0005532585.localdomain sudo[75900]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:40 np0005532585.localdomain sudo[75930]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbakjpmohvoiemiczlhiloqaavxbntfr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:40 np0005532585.localdomain sudo[75930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:41 np0005532585.localdomain python3[75932]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:15:41 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:15:41 np0005532585.localdomain systemd-sysv-generator[75959]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:15:41 np0005532585.localdomain systemd-rc-local-generator[75954]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:15:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:15:41 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 08:15:41 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 08:15:41 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 08:15:41 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 08:15:41 np0005532585.localdomain sudo[75930]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:41 np0005532585.localdomain sudo[75987]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezmotkmfgkxlzmfgawvsfbpsxcvwhfed ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:41 np0005532585.localdomain sudo[75987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:41 np0005532585.localdomain python3[75989]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 08:15:41 np0005532585.localdomain sudo[75987]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:42 np0005532585.localdomain sudo[76003]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnmfgiuzxdfgxqydptmhqdpggujvgrgx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:42 np0005532585.localdomain sudo[76003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:15:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:15:42 np0005532585.localdomain podman[76006]: 2025-11-23 08:15:42.49005658 +0000 UTC m=+0.077626638 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12)
Nov 23 08:15:42 np0005532585.localdomain podman[76006]: 2025-11-23 08:15:42.506213991 +0000 UTC m=+0.093784069 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, release=1761123044, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible)
Nov 23 08:15:42 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:15:42 np0005532585.localdomain systemd[1]: tmp-crun.62CGE8.mount: Deactivated successfully.
Nov 23 08:15:42 np0005532585.localdomain podman[76007]: 2025-11-23 08:15:42.552944691 +0000 UTC m=+0.138594911 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 23 08:15:42 np0005532585.localdomain podman[76007]: 2025-11-23 08:15:42.563208822 +0000 UTC m=+0.148859042 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12)
Nov 23 08:15:42 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:15:42 np0005532585.localdomain sudo[76003]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:43 np0005532585.localdomain sudo[76081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewpjjnutiuylpauhdxiisdajltkvrbdw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:15:43 np0005532585.localdomain sudo[76081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:15:43 np0005532585.localdomain python3[76083]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 08:15:44 np0005532585.localdomain podman[76122]: 2025-11-23 08:15:44.105774019 +0000 UTC m=+0.089074167 container create e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started libpod-conmon-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope.
Nov 23 08:15:44 np0005532585.localdomain podman[76122]: 2025-11-23 08:15:44.058983227 +0000 UTC m=+0.042283405 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:15:44 np0005532585.localdomain podman[76122]: 2025-11-23 08:15:44.204123656 +0000 UTC m=+0.187423834 container init e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git)
Nov 23 08:15:44 np0005532585.localdomain sudo[76143]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:15:44 np0005532585.localdomain podman[76122]: 2025-11-23 08:15:44.237859991 +0000 UTC m=+0.221160149 container start e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 23 08:15:44 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:15:44 np0005532585.localdomain python3[76083]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:15:44 np0005532585.localdomain podman[76144]: 2025-11-23 08:15:44.334229049 +0000 UTC m=+0.086181259 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc.)
Nov 23 08:15:44 np0005532585.localdomain podman[76144]: 2025-11-23 08:15:44.374215093 +0000 UTC m=+0.126167293 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git)
Nov 23 08:15:44 np0005532585.localdomain podman[76144]: unhealthy
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Queued start job for default target Main User Target.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Created slice User Application Slice.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Reached target Paths.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Reached target Timers.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Starting D-Bus User Message Bus Socket...
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Starting Create User's Volatile Files and Directories...
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Finished Create User's Volatile Files and Directories.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Listening on D-Bus User Message Bus Socket.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Reached target Sockets.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Reached target Basic System.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Reached target Main User Target.
Nov 23 08:15:44 np0005532585.localdomain systemd[76168]: Startup finished in 133ms.
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started User Manager for UID 0.
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started Session c10 of User root.
Nov 23 08:15:44 np0005532585.localdomain sudo[76143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 23 08:15:44 np0005532585.localdomain sudo[76143]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Nov 23 08:15:44 np0005532585.localdomain podman[76245]: 2025-11-23 08:15:44.67172578 +0000 UTC m=+0.070697608 container create 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_wait_for_compute_service)
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started libpod-conmon-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8.scope.
Nov 23 08:15:44 np0005532585.localdomain podman[76245]: 2025-11-23 08:15:44.629786146 +0000 UTC m=+0.028757954 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:15:44 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 08:15:44 np0005532585.localdomain podman[76245]: 2025-11-23 08:15:44.748773471 +0000 UTC m=+0.147745309 container init 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 23 08:15:44 np0005532585.localdomain podman[76245]: 2025-11-23 08:15:44.759536327 +0000 UTC m=+0.158508155 container start 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 23 08:15:44 np0005532585.localdomain podman[76245]: 2025-11-23 08:15:44.759823287 +0000 UTC m=+0.158795155 container attach 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true)
Nov 23 08:15:44 np0005532585.localdomain sudo[76264]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 08:15:44 np0005532585.localdomain sudo[76264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 23 08:15:44 np0005532585.localdomain sudo[76264]: pam_unix(sudo:session): session closed for user root
Nov 23 08:15:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:15:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:15:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: tmp-crun.b5wln3.mount: Deactivated successfully.
Nov 23 08:15:46 np0005532585.localdomain podman[76269]: 2025-11-23 08:15:46.034230367 +0000 UTC m=+0.088550610 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:15:46 np0005532585.localdomain podman[76268]: 2025-11-23 08:15:46.088561608 +0000 UTC m=+0.142921833 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, tcib_managed=true, 
build-date=2025-11-18T22:49:32Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Nov 23 08:15:46 np0005532585.localdomain podman[76268]: 2025-11-23 08:15:46.095063525 +0000 UTC m=+0.149423690 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, 
container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: tmp-crun.14FYvA.mount: Deactivated successfully.
Nov 23 08:15:46 np0005532585.localdomain podman[76270]: 2025-11-23 08:15:46.125300514 +0000 UTC m=+0.177006428 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:15:46 np0005532585.localdomain podman[76269]: 2025-11-23 08:15:46.140745782 +0000 UTC m=+0.195066015 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:15:46 np0005532585.localdomain podman[76270]: 2025-11-23 08:15:46.175280512 +0000 UTC m=+0.226986446 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:15:46 np0005532585.localdomain podman[76337]: 2025-11-23 08:15:46.254432206 +0000 UTC m=+0.060203879 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64)
Nov 23 08:15:46 np0005532585.localdomain podman[76337]: 2025-11-23 08:15:46.64631157 +0000 UTC m=+0.452083273 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:15:46 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:15:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:15:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:15:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:15:49 np0005532585.localdomain systemd[1]: tmp-crun.YPPUfr.mount: Deactivated successfully.
Nov 23 08:15:49 np0005532585.localdomain podman[76360]: 2025-11-23 08:15:49.024084677 +0000 UTC m=+0.084360094 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=metrics_qdr)
Nov 23 08:15:49 np0005532585.localdomain systemd[1]: tmp-crun.wlHVKd.mount: Deactivated successfully.
Nov 23 08:15:49 np0005532585.localdomain podman[76361]: 2025-11-23 08:15:49.083432829 +0000 UTC m=+0.139139668 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:15:49 np0005532585.localdomain podman[76362]: 2025-11-23 08:15:49.125032983 +0000 UTC m=+0.176758170 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 23 08:15:49 np0005532585.localdomain podman[76361]: 2025-11-23 08:15:49.140187963 +0000 UTC m=+0.195894812 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:14:25Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Nov 23 08:15:49 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:15:49 np0005532585.localdomain podman[76362]: 2025-11-23 08:15:49.180659333 +0000 UTC m=+0.232384540 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Nov 23 08:15:49 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:15:49 np0005532585.localdomain podman[76360]: 2025-11-23 08:15:49.232638741 +0000 UTC m=+0.292914118 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 
qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 23 08:15:49 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Activating special unit Exit the Session...
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped target Main User Target.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped target Basic System.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped target Paths.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped target Sockets.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped target Timers.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Closed D-Bus User Message Bus Socket.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Stopped Create User's Volatile Files and Directories.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Removed slice User Application Slice.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Reached target Shutdown.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Finished Exit the Session.
Nov 23 08:15:54 np0005532585.localdomain systemd[76168]: Reached target Exit the Session.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 08:15:54 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 23 08:16:04 np0005532585.localdomain sudo[76433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:16:04 np0005532585.localdomain sudo[76433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:16:04 np0005532585.localdomain sudo[76433]: pam_unix(sudo:session): session closed for user root
Nov 23 08:16:04 np0005532585.localdomain sudo[76448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:16:04 np0005532585.localdomain sudo[76448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:16:05 np0005532585.localdomain sudo[76448]: pam_unix(sudo:session): session closed for user root
Nov 23 08:16:06 np0005532585.localdomain sudo[76494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:16:06 np0005532585.localdomain sudo[76494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:16:06 np0005532585.localdomain sudo[76494]: pam_unix(sudo:session): session closed for user root
Nov 23 08:16:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:16:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:16:13 np0005532585.localdomain podman[76510]: 2025-11-23 08:16:13.011345777 +0000 UTC m=+0.069569893 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:16:13 np0005532585.localdomain podman[76510]: 2025-11-23 08:16:13.017813575 +0000 UTC m=+0.076037701 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:16:13 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:16:13 np0005532585.localdomain podman[76509]: 2025-11-23 08:16:13.064076429 +0000 UTC m=+0.121836781 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:16:13 np0005532585.localdomain podman[76509]: 2025-11-23 08:16:13.10228908 +0000 UTC m=+0.160049382 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 23 08:16:13 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:16:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:16:15 np0005532585.localdomain podman[76548]: 2025-11-23 08:16:15.022046654 +0000 UTC m=+0.075738562 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 23 08:16:15 np0005532585.localdomain systemd[1]: tmp-crun.sbFxrJ.mount: Deactivated successfully.
Nov 23 08:16:15 np0005532585.localdomain podman[76548]: 2025-11-23 08:16:15.054712487 +0000 UTC m=+0.108404355 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:16:15 np0005532585.localdomain podman[76548]: unhealthy
Nov 23 08:16:15 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:16:15 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:16:16 np0005532585.localdomain recover_tripleo_nova_virtqemud[76594]: 61756
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:16:16 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:16:17 np0005532585.localdomain systemd[1]: tmp-crun.QzuSQG.mount: Deactivated successfully.
Nov 23 08:16:17 np0005532585.localdomain podman[76570]: 2025-11-23 08:16:17.026694717 +0000 UTC m=+0.088541261 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 23 08:16:17 np0005532585.localdomain podman[76570]: 2025-11-23 08:16:17.032840744 +0000 UTC m=+0.094687328 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, container_name=logrotate_crond, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible)
Nov 23 08:16:17 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:16:17 np0005532585.localdomain podman[76576]: 2025-11-23 08:16:17.085607197 +0000 UTC m=+0.135908960 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 23 08:16:17 np0005532585.localdomain podman[76571]: 2025-11-23 08:16:17.050731377 +0000 UTC m=+0.105651760 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044)
Nov 23 08:16:17 np0005532585.localdomain podman[76577]: 2025-11-23 08:16:17.132730348 +0000 UTC m=+0.180618508 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:16:17 np0005532585.localdomain podman[76576]: 2025-11-23 08:16:17.164970667 +0000 UTC m=+0.215272430 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:16:17 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:16:17 np0005532585.localdomain podman[76571]: 2025-11-23 08:16:17.180997734 +0000 UTC m=+0.235918147 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4)
Nov 23 08:16:17 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:16:17 np0005532585.localdomain podman[76577]: 2025-11-23 08:16:17.459308288 +0000 UTC m=+0.507196438 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com)
Nov 23 08:16:17 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:16:18 np0005532585.localdomain systemd[1]: tmp-crun.1mZ5ze.mount: Deactivated successfully.
Nov 23 08:16:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:16:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:16:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:16:20 np0005532585.localdomain podman[76671]: 2025-11-23 08:16:20.015110891 +0000 UTC m=+0.077374510 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1)
Nov 23 08:16:20 np0005532585.localdomain podman[76672]: 2025-11-23 08:16:20.030699715 +0000 UTC m=+0.084190128 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044)
Nov 23 08:16:20 np0005532585.localdomain podman[76673]: 2025-11-23 08:16:20.075230768 +0000 UTC m=+0.128286248 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Nov 23 08:16:20 np0005532585.localdomain podman[76672]: 2025-11-23 08:16:20.110071636 +0000 UTC m=+0.163562059 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:16:20 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:16:20 np0005532585.localdomain podman[76673]: 2025-11-23 08:16:20.130247309 +0000 UTC m=+0.183302799 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:16:20 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:16:20 np0005532585.localdomain podman[76671]: 2025-11-23 08:16:20.235465485 +0000 UTC m=+0.297729064 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, tcib_managed=true)
Nov 23 08:16:20 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:16:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:16:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:16:44 np0005532585.localdomain systemd[1]: tmp-crun.a8Td3U.mount: Deactivated successfully.
Nov 23 08:16:44 np0005532585.localdomain podman[76749]: 2025-11-23 08:16:44.030193995 +0000 UTC m=+0.082242379 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git)
Nov 23 08:16:44 np0005532585.localdomain podman[76749]: 2025-11-23 08:16:44.040248411 +0000 UTC m=+0.092296795 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:16:44 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:16:44 np0005532585.localdomain podman[76748]: 2025-11-23 08:16:44.127370017 +0000 UTC m=+0.179837834 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:16:44 np0005532585.localdomain podman[76748]: 2025-11-23 08:16:44.141643121 +0000 UTC m=+0.194110928 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3)
Nov 23 08:16:44 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:16:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:16:46 np0005532585.localdomain podman[76786]: 2025-11-23 08:16:46.019621445 +0000 UTC m=+0.081689102 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12)
Nov 23 08:16:46 np0005532585.localdomain podman[76786]: 2025-11-23 08:16:46.105203635 +0000 UTC m=+0.167271352 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=nova_compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:16:46 np0005532585.localdomain podman[76786]: unhealthy
Nov 23 08:16:46 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:16:46 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:16:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:16:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:16:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:16:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:16:48 np0005532585.localdomain podman[76812]: 2025-11-23 08:16:48.011302864 +0000 UTC m=+0.058099416 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:16:48 np0005532585.localdomain podman[76811]: 2025-11-23 08:16:48.07374727 +0000 UTC m=+0.119438248 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Nov 23 08:16:48 np0005532585.localdomain podman[76810]: 2025-11-23 08:16:48.047294927 +0000 UTC m=+0.093879503 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute)
Nov 23 08:16:48 np0005532585.localdomain podman[76811]: 2025-11-23 08:16:48.102238627 +0000 UTC m=+0.147929565 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Nov 23 08:16:48 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:16:48 np0005532585.localdomain podman[76810]: 2025-11-23 08:16:48.130173355 +0000 UTC m=+0.176757931 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, 
vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12)
Nov 23 08:16:48 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:16:48 np0005532585.localdomain podman[76809]: 2025-11-23 08:16:48.143719676 +0000 UTC m=+0.193853809 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:16:48 np0005532585.localdomain podman[76809]: 2025-11-23 08:16:48.173995586 +0000 UTC m=+0.224129699 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
com.redhat.component=openstack-cron-container, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Nov 23 08:16:48 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:16:48 np0005532585.localdomain podman[76812]: 2025-11-23 08:16:48.371468475 +0000 UTC m=+0.418265017 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:16:48 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:16:49 np0005532585.localdomain systemd[1]: tmp-crun.v0QURz.mount: Deactivated successfully.
Nov 23 08:16:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:16:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:16:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:16:51 np0005532585.localdomain systemd[1]: tmp-crun.6Un2ST.mount: Deactivated successfully.
Nov 23 08:16:51 np0005532585.localdomain podman[76897]: 2025-11-23 08:16:51.032099282 +0000 UTC m=+0.090618883 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true)
Nov 23 08:16:51 np0005532585.localdomain systemd[1]: tmp-crun.wdfeKE.mount: Deactivated successfully.
Nov 23 08:16:51 np0005532585.localdomain podman[76899]: 2025-11-23 08:16:51.083637668 +0000 UTC m=+0.138618412 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1)
Nov 23 08:16:51 np0005532585.localdomain podman[76898]: 2025-11-23 08:16:51.131467981 +0000 UTC m=+0.187162386 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent)
Nov 23 08:16:51 np0005532585.localdomain podman[76899]: 2025-11-23 08:16:51.161296818 +0000 UTC m=+0.216277582 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 23 08:16:51 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:16:51 np0005532585.localdomain podman[76898]: 2025-11-23 08:16:51.177979744 +0000 UTC m=+0.233674219 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044)
Nov 23 08:16:51 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:16:51 np0005532585.localdomain podman[76897]: 2025-11-23 08:16:51.256682725 +0000 UTC m=+0.315202366 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-18T22:49:46Z, tcib_managed=true, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044)
Nov 23 08:16:51 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:16:59 np0005532585.localdomain sshd[76971]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:17:00 np0005532585.localdomain sshd[35840]: Received disconnect from 192.168.122.100 port 34248:11: disconnected by user
Nov 23 08:17:00 np0005532585.localdomain sshd[35840]: Disconnected from user zuul 192.168.122.100 port 34248
Nov 23 08:17:00 np0005532585.localdomain sshd[35837]: pam_unix(sshd:session): session closed for user zuul
Nov 23 08:17:00 np0005532585.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Nov 23 08:17:00 np0005532585.localdomain systemd[1]: session-27.scope: Consumed 2.993s CPU time.
Nov 23 08:17:00 np0005532585.localdomain systemd-logind[761]: Session 27 logged out. Waiting for processes to exit.
Nov 23 08:17:00 np0005532585.localdomain systemd-logind[761]: Removed session 27.
Nov 23 08:17:01 np0005532585.localdomain sshd[76971]: Invalid user user from 80.94.95.115 port 51244
Nov 23 08:17:01 np0005532585.localdomain sshd[76971]: Connection closed by invalid user user 80.94.95.115 port 51244 [preauth]
Nov 23 08:17:06 np0005532585.localdomain sudo[76973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:17:06 np0005532585.localdomain sudo[76973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:17:06 np0005532585.localdomain sudo[76973]: pam_unix(sudo:session): session closed for user root
Nov 23 08:17:06 np0005532585.localdomain sudo[76988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:17:06 np0005532585.localdomain sudo[76988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:17:07 np0005532585.localdomain sudo[76988]: pam_unix(sudo:session): session closed for user root
Nov 23 08:17:07 np0005532585.localdomain sudo[77034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:17:07 np0005532585.localdomain sudo[77034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:17:07 np0005532585.localdomain sudo[77034]: pam_unix(sudo:session): session closed for user root
Nov 23 08:17:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:17:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:17:15 np0005532585.localdomain podman[77049]: 2025-11-23 08:17:15.026492213 +0000 UTC m=+0.081377483 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:17:15 np0005532585.localdomain podman[77049]: 2025-11-23 08:17:15.044436067 +0000 UTC m=+0.099321317 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:17:15 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:17:15 np0005532585.localdomain podman[77050]: 2025-11-23 08:17:15.122167749 +0000 UTC m=+0.177164752 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:17:15 np0005532585.localdomain podman[77050]: 2025-11-23 08:17:15.161266136 +0000 UTC m=+0.216263159 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-iscsid, version=17.1.12)
Nov 23 08:17:15 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:17:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:17:17 np0005532585.localdomain podman[77087]: 2025-11-23 08:17:17.023492012 +0000 UTC m=+0.081503226 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step5, container_name=nova_compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 23 08:17:17 np0005532585.localdomain podman[77087]: 2025-11-23 08:17:17.081281988 +0000 UTC m=+0.139293152 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 23 08:17:17 np0005532585.localdomain podman[77087]: unhealthy
Nov 23 08:17:17 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:17:17 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:17:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:17:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:17:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:17:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:17:19 np0005532585.localdomain podman[77111]: 2025-11-23 08:17:19.040072327 +0000 UTC m=+0.092084358 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:17:19 np0005532585.localdomain podman[77111]: 2025-11-23 08:17:19.074491343 +0000 UTC m=+0.126503364 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute)
Nov 23 08:17:19 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:17:19 np0005532585.localdomain systemd[1]: tmp-crun.gW1i3v.mount: Deactivated successfully.
Nov 23 08:17:19 np0005532585.localdomain podman[77112]: 2025-11-23 08:17:19.118032886 +0000 UTC m=+0.166889900 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 23 08:17:19 np0005532585.localdomain podman[77112]: 2025-11-23 08:17:19.178429681 +0000 UTC m=+0.227286715 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64)
Nov 23 08:17:19 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:17:19 np0005532585.localdomain podman[77118]: 2025-11-23 08:17:19.190553049 +0000 UTC m=+0.236266928 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 23 08:17:19 np0005532585.localdomain podman[77110]: 2025-11-23 08:17:19.239928919 +0000 UTC m=+0.295474757 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:17:19 np0005532585.localdomain podman[77110]: 2025-11-23 08:17:19.246332673 +0000 UTC m=+0.301878511 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 23 08:17:19 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:17:19 np0005532585.localdomain podman[77118]: 2025-11-23 08:17:19.560375432 +0000 UTC m=+0.606089331 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, tcib_managed=true)
Nov 23 08:17:19 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:17:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:17:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:17:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:17:22 np0005532585.localdomain systemd[1]: tmp-crun.jsookL.mount: Deactivated successfully.
Nov 23 08:17:22 np0005532585.localdomain systemd[1]: tmp-crun.FERFqT.mount: Deactivated successfully.
Nov 23 08:17:22 np0005532585.localdomain podman[77205]: 2025-11-23 08:17:22.036028503 +0000 UTC m=+0.095212153 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-type=git, 
container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 23 08:17:22 np0005532585.localdomain podman[77206]: 2025-11-23 08:17:22.004101273 +0000 UTC m=+0.061281173 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true)
Nov 23 08:17:22 np0005532585.localdomain podman[77206]: 2025-11-23 08:17:22.086355822 +0000 UTC m=+0.143535772 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, version=17.1.12, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:17:22 np0005532585.localdomain podman[77207]: 2025-11-23 08:17:22.095983554 +0000 UTC m=+0.148821782 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, batch=17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 23 08:17:22 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:17:22 np0005532585.localdomain podman[77207]: 2025-11-23 08:17:22.116491076 +0000 UTC m=+0.169329294 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:17:22 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:17:22 np0005532585.localdomain podman[77205]: 2025-11-23 08:17:22.253499028 +0000 UTC m=+0.312682728 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:17:22 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:17:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:17:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:17:46 np0005532585.localdomain podman[77287]: 2025-11-23 08:17:46.027304268 +0000 UTC m=+0.088644794 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd)
Nov 23 08:17:46 np0005532585.localdomain podman[77287]: 2025-11-23 08:17:46.040487489 +0000 UTC m=+0.101828005 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z)
Nov 23 08:17:46 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:17:46 np0005532585.localdomain podman[77288]: 2025-11-23 08:17:46.004061022 +0000 UTC m=+0.068440460 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:17:46 np0005532585.localdomain podman[77288]: 2025-11-23 08:17:46.084187665 +0000 UTC m=+0.148567093 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4)
Nov 23 08:17:46 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:17:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:17:48 np0005532585.localdomain podman[77327]: 2025-11-23 08:17:48.026099253 +0000 UTC m=+0.083769856 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute)
Nov 23 08:17:48 np0005532585.localdomain podman[77327]: 2025-11-23 08:17:48.085837588 +0000 UTC m=+0.143508191 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public)
Nov 23 08:17:48 np0005532585.localdomain podman[77327]: unhealthy
Nov 23 08:17:48 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:17:48 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:17:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:17:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:17:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:17:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:17:50 np0005532585.localdomain podman[77348]: 2025-11-23 08:17:50.03714348 +0000 UTC m=+0.090973014 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 23 08:17:50 np0005532585.localdomain podman[77348]: 2025-11-23 08:17:50.047297869 +0000 UTC m=+0.101127483 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, version=17.1.12, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, vcs-type=git, architecture=x86_64)
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: tmp-crun.CoLnVf.mount: Deactivated successfully.
Nov 23 08:17:50 np0005532585.localdomain podman[77350]: 2025-11-23 08:17:50.088247822 +0000 UTC m=+0.136876118 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible)
Nov 23 08:17:50 np0005532585.localdomain podman[77350]: 2025-11-23 08:17:50.109194079 +0000 UTC m=+0.157822365 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:17:50 np0005532585.localdomain podman[77349]: 2025-11-23 08:17:50.193043376 +0000 UTC m=+0.243854408 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
config_id=tripleo_step4, vcs-type=git, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute)
Nov 23 08:17:50 np0005532585.localdomain podman[77349]: 2025-11-23 08:17:50.243047235 +0000 UTC m=+0.293858217 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, name=rhosp17/openstack-ceilometer-compute)
Nov 23 08:17:50 np0005532585.localdomain podman[77351]: 2025-11-23 08:17:50.251046888 +0000 UTC m=+0.291491456 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:17:50 np0005532585.localdomain podman[77351]: 2025-11-23 08:17:50.650716048 +0000 UTC m=+0.691160626 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:17:50 np0005532585.localdomain recover_tripleo_nova_virtqemud[77441]: 61756
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:17:50 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:17:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:17:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:17:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:17:53 np0005532585.localdomain podman[77444]: 2025-11-23 08:17:53.021127491 +0000 UTC m=+0.073143862 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Nov 23 08:17:53 np0005532585.localdomain podman[77444]: 2025-11-23 08:17:53.071248324 +0000 UTC m=+0.123264685 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, url=https://www.redhat.com)
Nov 23 08:17:53 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:17:53 np0005532585.localdomain podman[77443]: 2025-11-23 08:17:53.071876323 +0000 UTC m=+0.125494963 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:17:53 np0005532585.localdomain podman[77442]: 2025-11-23 08:17:53.136684412 +0000 UTC m=+0.190959822 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team)
Nov 23 08:17:53 np0005532585.localdomain podman[77443]: 2025-11-23 08:17:53.157266487 +0000 UTC m=+0.210885077 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Nov 23 08:17:53 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:17:53 np0005532585.localdomain podman[77442]: 2025-11-23 08:17:53.331176629 +0000 UTC m=+0.385452019 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public)
Nov 23 08:17:53 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:17:54 np0005532585.localdomain systemd[1]: tmp-crun.UnHzMz.mount: Deactivated successfully.
Nov 23 08:18:08 np0005532585.localdomain sudo[77519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:18:08 np0005532585.localdomain sudo[77519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:18:08 np0005532585.localdomain sudo[77519]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:08 np0005532585.localdomain sudo[77534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:18:08 np0005532585.localdomain sudo[77534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:18:08 np0005532585.localdomain sudo[77534]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:09 np0005532585.localdomain sudo[77582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:18:09 np0005532585.localdomain sudo[77582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:18:09 np0005532585.localdomain sudo[77582]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:18:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:18:17 np0005532585.localdomain podman[77597]: 2025-11-23 08:18:17.081530963 +0000 UTC m=+0.098942848 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Nov 23 08:18:17 np0005532585.localdomain podman[77598]: 2025-11-23 08:18:17.143394788 +0000 UTC m=+0.158991792 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:18:17 np0005532585.localdomain podman[77597]: 2025-11-23 08:18:17.168430098 +0000 UTC m=+0.185841993 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12)
Nov 23 08:18:17 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:18:17 np0005532585.localdomain podman[77598]: 2025-11-23 08:18:17.183360281 +0000 UTC m=+0.198957285 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 23 08:18:17 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:18:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:18:19 np0005532585.localdomain podman[77633]: 2025-11-23 08:18:19.028920862 +0000 UTC m=+0.082710357 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, container_name=nova_compute, config_id=tripleo_step5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:18:19 np0005532585.localdomain podman[77633]: 2025-11-23 08:18:19.095298906 +0000 UTC m=+0.149088381 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Nov 23 08:18:19 np0005532585.localdomain podman[77633]: unhealthy
Nov 23 08:18:19 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:18:19 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:18:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:18:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:18:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:18:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:18:21 np0005532585.localdomain systemd[1]: tmp-crun.hEBUdc.mount: Deactivated successfully.
Nov 23 08:18:21 np0005532585.localdomain podman[77658]: 2025-11-23 08:18:21.032084493 +0000 UTC m=+0.084342064 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_migration_target, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:18:21 np0005532585.localdomain podman[77656]: 2025-11-23 08:18:21.009452311 +0000 UTC m=+0.067197348 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 23 08:18:21 np0005532585.localdomain podman[77655]: 2025-11-23 08:18:21.111559568 +0000 UTC m=+0.172343551 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:18:21 np0005532585.localdomain podman[77655]: 2025-11-23 08:18:21.118411762 +0000 UTC m=+0.179195805 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team)
Nov 23 08:18:21 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:18:21 np0005532585.localdomain podman[77657]: 2025-11-23 08:18:21.175749099 +0000 UTC m=+0.230128751 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 23 08:18:21 np0005532585.localdomain podman[77656]: 2025-11-23 08:18:21.193990036 +0000 UTC m=+0.251735143 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public)
Nov 23 08:18:21 np0005532585.localdomain podman[77657]: 2025-11-23 08:18:21.203435434 +0000 UTC m=+0.257815096 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 08:18:21 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:18:21 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:18:21 np0005532585.localdomain podman[77658]: 2025-11-23 08:18:21.371868643 +0000 UTC m=+0.424126264 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:18:21 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:18:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:18:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:18:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:18:24 np0005532585.localdomain systemd[1]: tmp-crun.D0HOYu.mount: Deactivated successfully.
Nov 23 08:18:24 np0005532585.localdomain podman[77753]: 2025-11-23 08:18:24.023464462 +0000 UTC m=+0.080031341 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:18:24 np0005532585.localdomain systemd[1]: tmp-crun.JP09Qb.mount: Deactivated successfully.
Nov 23 08:18:24 np0005532585.localdomain podman[77755]: 2025-11-23 08:18:24.074966973 +0000 UTC m=+0.124963696 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:18:24 np0005532585.localdomain podman[77755]: 2025-11-23 08:18:24.119600949 +0000 UTC m=+0.169597652 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller)
Nov 23 08:18:24 np0005532585.localdomain podman[77754]: 2025-11-23 08:18:24.132048092 +0000 UTC m=+0.183972410 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Nov 23 08:18:24 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:18:24 np0005532585.localdomain podman[77754]: 2025-11-23 08:18:24.176538745 +0000 UTC m=+0.228462973 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4)
Nov 23 08:18:24 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:18:24 np0005532585.localdomain podman[77753]: 2025-11-23 08:18:24.21830219 +0000 UTC m=+0.274869089 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr)
Nov 23 08:18:24 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:18:25 np0005532585.localdomain systemd[1]: tmp-crun.z9PkSm.mount: Deactivated successfully.
Nov 23 08:18:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:18:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:18:47 np0005532585.localdomain podman[77858]: 2025-11-23 08:18:47.326408124 +0000 UTC m=+0.077097057 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:18:47 np0005532585.localdomain systemd[1]: tmp-crun.42wpaN.mount: Deactivated successfully.
Nov 23 08:18:47 np0005532585.localdomain podman[77858]: 2025-11-23 08:18:47.369730604 +0000 UTC m=+0.120419517 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:18:47 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:18:47 np0005532585.localdomain podman[77859]: 2025-11-23 08:18:47.3709918 +0000 UTC m=+0.118531664 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, tcib_managed=true, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:18:47 np0005532585.localdomain podman[77859]: 2025-11-23 08:18:47.45736934 +0000 UTC m=+0.204909234 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=)
Nov 23 08:18:47 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:18:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:18:50 np0005532585.localdomain systemd[1]: tmp-crun.A5TCAd.mount: Deactivated successfully.
Nov 23 08:18:50 np0005532585.localdomain podman[77961]: 2025-11-23 08:18:50.025600384 +0000 UTC m=+0.083582013 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5)
Nov 23 08:18:50 np0005532585.localdomain podman[77961]: 2025-11-23 08:18:50.059438804 +0000 UTC m=+0.117420503 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 23 08:18:50 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:18:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:18:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:18:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:18:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:18:52 np0005532585.localdomain podman[77989]: 2025-11-23 08:18:52.035821946 +0000 UTC m=+0.087716211 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4)
Nov 23 08:18:52 np0005532585.localdomain podman[77987]: 2025-11-23 08:18:52.081433599 +0000 UTC m=+0.140919078 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:18:52 np0005532585.localdomain podman[77989]: 2025-11-23 08:18:52.093331987 +0000 UTC m=+0.145226282 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:18:52 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:18:52 np0005532585.localdomain podman[77988]: 2025-11-23 08:18:52.156979843 +0000 UTC m=+0.211944385 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:18:52 np0005532585.localdomain podman[77987]: 2025-11-23 08:18:52.168340035 +0000 UTC m=+0.227825464 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:18:52 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:18:52 np0005532585.localdomain podman[77988]: 2025-11-23 08:18:52.191230804 +0000 UTC m=+0.246195366 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z)
Nov 23 08:18:52 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:18:52 np0005532585.localdomain podman[77995]: 2025-11-23 08:18:52.247545602 +0000 UTC m=+0.297008728 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 23 08:18:52 np0005532585.localdomain podman[77995]: 2025-11-23 08:18:52.610671194 +0000 UTC m=+0.660134330 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:18:52 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:18:53 np0005532585.localdomain systemd[1]: tmp-crun.98bcJT.mount: Deactivated successfully.
Nov 23 08:18:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:18:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:18:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:18:55 np0005532585.localdomain systemd[1]: tmp-crun.UsObRt.mount: Deactivated successfully.
Nov 23 08:18:55 np0005532585.localdomain podman[78085]: 2025-11-23 08:18:55.038054321 +0000 UTC m=+0.094957285 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:18:55 np0005532585.localdomain podman[78087]: 2025-11-23 08:18:55.084108518 +0000 UTC m=+0.136361910 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:18:55 np0005532585.localdomain podman[78087]: 2025-11-23 08:18:55.112155374 +0000 UTC m=+0.164408766 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044)
Nov 23 08:18:55 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:18:55 np0005532585.localdomain podman[78086]: 2025-11-23 08:18:55.191428022 +0000 UTC m=+0.245739413 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 23 08:18:55 np0005532585.localdomain podman[78086]: 2025-11-23 08:18:55.230368037 +0000 UTC m=+0.284679478 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4)
Nov 23 08:18:55 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:18:55 np0005532585.localdomain podman[78085]: 2025-11-23 08:18:55.265306938 +0000 UTC m=+0.322209822 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044)
Nov 23 08:18:55 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:18:56 np0005532585.localdomain systemd[1]: tmp-crun.KA7vkC.mount: Deactivated successfully.
Nov 23 08:18:57 np0005532585.localdomain systemd[1]: libpod-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8.scope: Deactivated successfully.
Nov 23 08:18:57 np0005532585.localdomain podman[78157]: 2025-11-23 08:18:57.49050206 +0000 UTC m=+0.064244424 container died 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_wait_for_compute_service, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:18:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8-userdata-shm.mount: Deactivated successfully.
Nov 23 08:18:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483-merged.mount: Deactivated successfully.
Nov 23 08:18:57 np0005532585.localdomain podman[78157]: 2025-11-23 08:18:57.524085273 +0000 UTC m=+0.097827587 container cleanup 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:18:57 np0005532585.localdomain systemd[1]: libpod-conmon-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8.scope: Deactivated successfully.
Nov 23 08:18:57 np0005532585.localdomain python3[76083]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 08:18:57 np0005532585.localdomain sudo[76081]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:57 np0005532585.localdomain sudo[78208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wclfenfrrumqevvkjbigoemakkupnfdc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:18:57 np0005532585.localdomain sudo[78208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:18:58 np0005532585.localdomain python3[78210]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:18:58 np0005532585.localdomain sudo[78208]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:58 np0005532585.localdomain sudo[78224]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnyvkngmtplieyaijhxwjqusukckutof ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:18:58 np0005532585.localdomain sudo[78224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:18:58 np0005532585.localdomain python3[78226]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 08:18:58 np0005532585.localdomain sudo[78224]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:58 np0005532585.localdomain sudo[78285]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcxbsauwvzaiuhzsjlamwrgvyurekomo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:18:58 np0005532585.localdomain sudo[78285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:18:59 np0005532585.localdomain python3[78287]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885938.4695463-118615-85218611173393/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:18:59 np0005532585.localdomain sudo[78285]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:59 np0005532585.localdomain sudo[78301]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfsnoqjbfaqpxentuvzoyickkgxvkgbg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:18:59 np0005532585.localdomain sudo[78301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:18:59 np0005532585.localdomain python3[78303]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 08:18:59 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:18:59 np0005532585.localdomain systemd-rc-local-generator[78328]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:18:59 np0005532585.localdomain systemd-sysv-generator[78334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:18:59 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:18:59 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:18:59 np0005532585.localdomain sudo[78301]: pam_unix(sudo:session): session closed for user root
Nov 23 08:18:59 np0005532585.localdomain recover_tripleo_nova_virtqemud[78341]: 61756
Nov 23 08:18:59 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:18:59 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:19:00 np0005532585.localdomain sudo[78355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzzbxcneosbttmdpchzhmisjpjghlqrp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 23 08:19:00 np0005532585.localdomain sudo[78355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:00 np0005532585.localdomain python3[78357]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:19:00 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:19:00 np0005532585.localdomain systemd-sysv-generator[78387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:19:00 np0005532585.localdomain systemd-rc-local-generator[78382]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:19:00 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:19:00 np0005532585.localdomain systemd[1]: Starting nova_compute container...
Nov 23 08:19:01 np0005532585.localdomain tripleo-start-podman-container[78397]: Creating additional drop-in dependency for "nova_compute" (e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce)
Nov 23 08:19:01 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:19:01 np0005532585.localdomain systemd-rc-local-generator[78451]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:19:01 np0005532585.localdomain systemd-sysv-generator[78458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:19:01 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:19:01 np0005532585.localdomain systemd[1]: Started nova_compute container.
Nov 23 08:19:01 np0005532585.localdomain systemd[1]: Starting dnf makecache...
Nov 23 08:19:01 np0005532585.localdomain sudo[78355]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:01 np0005532585.localdomain dnf[78466]: Updating Subscription Management repositories.
Nov 23 08:19:01 np0005532585.localdomain sudo[78494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrnvfmxqizuttlkqqvctfbotbjmsqgkk ; /usr/bin/python3
Nov 23 08:19:01 np0005532585.localdomain sudo[78494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:01 np0005532585.localdomain python3[78496]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:19:01 np0005532585.localdomain sudo[78494]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:02 np0005532585.localdomain sudo[78542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhvxxorariqkzryiipvikwndzgeqjdwy ; /usr/bin/python3
Nov 23 08:19:02 np0005532585.localdomain sudo[78542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:02 np0005532585.localdomain sudo[78542]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:02 np0005532585.localdomain sudo[78585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbnkddcreykcwtkwdvqfhtctcrhdhbak ; /usr/bin/python3
Nov 23 08:19:02 np0005532585.localdomain sudo[78585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:02 np0005532585.localdomain sudo[78585]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:03 np0005532585.localdomain sudo[78616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koozdbvblpucqgaeaexjcoifoxpeghpv ; /usr/bin/python3
Nov 23 08:19:03 np0005532585.localdomain sudo[78616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:03 np0005532585.localdomain dnf[78466]: Metadata cache refreshed recently.
Nov 23 08:19:03 np0005532585.localdomain python3[78618]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005532585 step=5 update_config_hash_only=False
Nov 23 08:19:03 np0005532585.localdomain sudo[78616]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:03 np0005532585.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 23 08:19:03 np0005532585.localdomain systemd[1]: Finished dnf makecache.
Nov 23 08:19:03 np0005532585.localdomain systemd[1]: dnf-makecache.service: Consumed 1.953s CPU time.
Nov 23 08:19:03 np0005532585.localdomain sudo[78632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfeyebjxixdysqpaizbppklevfrfaykf ; /usr/bin/python3
Nov 23 08:19:03 np0005532585.localdomain sudo[78632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:03 np0005532585.localdomain python3[78634]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 08:19:03 np0005532585.localdomain sudo[78632]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:04 np0005532585.localdomain sudo[78648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjshktkbiierlnsjbxwpvlerqnzzwijm ; /usr/bin/python3
Nov 23 08:19:04 np0005532585.localdomain sudo[78648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 23 08:19:04 np0005532585.localdomain python3[78650]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 08:19:04 np0005532585.localdomain sudo[78648]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:09 np0005532585.localdomain sudo[78651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:19:09 np0005532585.localdomain sudo[78651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:19:09 np0005532585.localdomain sudo[78651]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:09 np0005532585.localdomain sudo[78666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:19:09 np0005532585.localdomain sudo[78666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:19:10 np0005532585.localdomain sudo[78666]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:11 np0005532585.localdomain sudo[78713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:19:11 np0005532585.localdomain sudo[78713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:19:11 np0005532585.localdomain sudo[78713]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:19:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:19:18 np0005532585.localdomain podman[78729]: 2025-11-23 08:19:18.034654444 +0000 UTC m=+0.086593238 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:19:18 np0005532585.localdomain podman[78729]: 2025-11-23 08:19:18.071222841 +0000 UTC m=+0.123161625 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:19:18 np0005532585.localdomain podman[78728]: 2025-11-23 08:19:18.082885282 +0000 UTC m=+0.135160555 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible)
Nov 23 08:19:18 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:19:18 np0005532585.localdomain podman[78728]: 2025-11-23 08:19:18.098245488 +0000 UTC m=+0.150520751 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, release=1761123044, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:19:18 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:19:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:19:21 np0005532585.localdomain podman[78767]: 2025-11-23 08:19:21.024113967 +0000 UTC m=+0.080122243 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, config_id=tripleo_step5, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044)
Nov 23 08:19:21 np0005532585.localdomain podman[78767]: 2025-11-23 08:19:21.056333742 +0000 UTC m=+0.112342018 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 23 08:19:21 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:19:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:19:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:19:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:19:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:19:23 np0005532585.localdomain podman[78793]: 2025-11-23 08:19:23.030580023 +0000 UTC m=+0.085990821 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=)
Nov 23 08:19:23 np0005532585.localdomain podman[78793]: 2025-11-23 08:19:23.039397953 +0000 UTC m=+0.094808721 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12)
Nov 23 08:19:23 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:19:23 np0005532585.localdomain systemd[1]: tmp-crun.nnnJNR.mount: Deactivated successfully.
Nov 23 08:19:23 np0005532585.localdomain podman[78794]: 2025-11-23 08:19:23.099192639 +0000 UTC m=+0.152982700 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:19:23 np0005532585.localdomain podman[78794]: 2025-11-23 08:19:23.127846352 +0000 UTC m=+0.181636473 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12)
Nov 23 08:19:23 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:19:23 np0005532585.localdomain podman[78795]: 2025-11-23 08:19:23.148863739 +0000 UTC m=+0.198490893 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 23 08:19:23 np0005532585.localdomain podman[78799]: 2025-11-23 08:19:23.198748564 +0000 UTC m=+0.244526319 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, release=1761123044, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:19:23 np0005532585.localdomain podman[78795]: 2025-11-23 08:19:23.207290546 +0000 UTC m=+0.256917750 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:19:23 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:19:23 np0005532585.localdomain podman[78799]: 2025-11-23 08:19:23.581078031 +0000 UTC m=+0.626855776 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Nov 23 08:19:23 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:19:24 np0005532585.localdomain systemd[1]: tmp-crun.qZPxba.mount: Deactivated successfully.
Nov 23 08:19:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:19:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:19:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:19:26 np0005532585.localdomain systemd[1]: tmp-crun.lcSStK.mount: Deactivated successfully.
Nov 23 08:19:26 np0005532585.localdomain podman[78885]: 2025-11-23 08:19:26.037299336 +0000 UTC m=+0.096929701 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, container_name=metrics_qdr, distribution-scope=public)
Nov 23 08:19:26 np0005532585.localdomain podman[78886]: 2025-11-23 08:19:26.002503619 +0000 UTC m=+0.061671670 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z)
Nov 23 08:19:26 np0005532585.localdomain podman[78886]: 2025-11-23 08:19:26.088345254 +0000 UTC m=+0.147513275 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:19:26 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:19:26 np0005532585.localdomain podman[78887]: 2025-11-23 08:19:26.14354711 +0000 UTC m=+0.195027234 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:19:26 np0005532585.localdomain podman[78887]: 2025-11-23 08:19:26.165158394 +0000 UTC m=+0.216638528 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 23 08:19:26 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:19:26 np0005532585.localdomain podman[78885]: 2025-11-23 08:19:26.221733119 +0000 UTC m=+0.281363544 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step1, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:19:26 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:19:32 np0005532585.localdomain sshd[78959]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:19:32 np0005532585.localdomain sshd[78959]: Accepted publickey for zuul from 192.168.122.100 port 53526 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 08:19:32 np0005532585.localdomain systemd-logind[761]: New session 33 of user zuul.
Nov 23 08:19:32 np0005532585.localdomain systemd[1]: Started Session 33 of User zuul.
Nov 23 08:19:32 np0005532585.localdomain sshd[78959]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 08:19:32 np0005532585.localdomain sudo[79066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eytupuolsdzpppibhlpgzfkixkelqvia ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763885972.279355-40234-197009558560642/AnsiballZ_setup.py
Nov 23 08:19:32 np0005532585.localdomain sudo[79066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:19:33 np0005532585.localdomain python3[79068]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 08:19:36 np0005532585.localdomain sudo[79066]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:40 np0005532585.localdomain sudo[79329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yymuvxoalnjzzdlrrkydkbdpashsxewm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763885980.3385315-40323-211974230491438/AnsiballZ_dnf.py
Nov 23 08:19:40 np0005532585.localdomain sudo[79329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:19:40 np0005532585.localdomain python3[79331]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Nov 23 08:19:43 np0005532585.localdomain sudo[79329]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:47 np0005532585.localdomain sudo[79446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cttfcswfqkggiltxmhigpyzltgpcewsz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763885987.200659-40379-210426799912704/AnsiballZ_iptables.py
Nov 23 08:19:47 np0005532585.localdomain sudo[79446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:19:47 np0005532585.localdomain python3[79449]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Nov 23 08:19:47 np0005532585.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 23 08:19:47 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Nov 23 08:19:47 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 08:19:47 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 08:19:47 np0005532585.localdomain sudo[79446]: pam_unix(sudo:session): session closed for user root
Nov 23 08:19:47 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 08:19:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:19:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:19:49 np0005532585.localdomain podman[79494]: 2025-11-23 08:19:49.039995312 +0000 UTC m=+0.092951058 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, tcib_managed=true, container_name=iscsid, distribution-scope=public, version=17.1.12, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-type=git)
Nov 23 08:19:49 np0005532585.localdomain podman[79493]: 2025-11-23 08:19:49.080353298 +0000 UTC m=+0.132971004 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3)
Nov 23 08:19:49 np0005532585.localdomain podman[79494]: 2025-11-23 08:19:49.108681451 +0000 UTC m=+0.161637197 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-type=git, version=17.1.12, tcib_managed=true, distribution-scope=public)
Nov 23 08:19:49 np0005532585.localdomain podman[79493]: 2025-11-23 08:19:49.1181729 +0000 UTC m=+0.170790596 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z)
Nov 23 08:19:49 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:19:49 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:19:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:19:52 np0005532585.localdomain podman[79535]: 2025-11-23 08:19:52.020106551 +0000 UTC m=+0.077658424 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 23 08:19:52 np0005532585.localdomain podman[79535]: 2025-11-23 08:19:52.050261197 +0000 UTC m=+0.107813060 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, distribution-scope=public, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:19:52 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:19:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:19:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:19:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:19:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:19:54 np0005532585.localdomain systemd[1]: tmp-crun.wrUks9.mount: Deactivated successfully.
Nov 23 08:19:54 np0005532585.localdomain podman[79562]: 2025-11-23 08:19:54.02426192 +0000 UTC m=+0.081627687 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, name=rhosp17/openstack-cron)
Nov 23 08:19:54 np0005532585.localdomain podman[79564]: 2025-11-23 08:19:54.055873627 +0000 UTC m=+0.104614649 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 23 08:19:54 np0005532585.localdomain podman[79563]: 2025-11-23 08:19:54.076863712 +0000 UTC m=+0.130038880 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public)
Nov 23 08:19:54 np0005532585.localdomain podman[79570]: 2025-11-23 08:19:54.133283253 +0000 UTC m=+0.179251527 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Nov 23 08:19:54 np0005532585.localdomain podman[79564]: 2025-11-23 08:19:54.139397776 +0000 UTC m=+0.188138818 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:19:54 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:19:54 np0005532585.localdomain podman[79562]: 2025-11-23 08:19:54.157386217 +0000 UTC m=+0.214751984 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:19:54 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:19:54 np0005532585.localdomain podman[79563]: 2025-11-23 08:19:54.209001681 +0000 UTC m=+0.262176849 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:19:54 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:19:54 np0005532585.localdomain podman[79570]: 2025-11-23 08:19:54.487354199 +0000 UTC m=+0.533322483 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:19:54 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:19:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:19:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:19:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:19:57 np0005532585.localdomain podman[79657]: 2025-11-23 08:19:57.034775081 +0000 UTC m=+0.090049755 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1)
Nov 23 08:19:57 np0005532585.localdomain systemd[1]: tmp-crun.047zH9.mount: Deactivated successfully.
Nov 23 08:19:57 np0005532585.localdomain podman[79659]: 2025-11-23 08:19:57.101462053 +0000 UTC m=+0.151052916 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044)
Nov 23 08:19:57 np0005532585.localdomain podman[79659]: 2025-11-23 08:19:57.129214171 +0000 UTC m=+0.178805024 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, release=1761123044, build-date=2025-11-18T23:34:05Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:19:57 np0005532585.localdomain podman[79658]: 2025-11-23 08:19:57.141343394 +0000 UTC m=+0.193864471 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:19:57 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:19:57 np0005532585.localdomain podman[79658]: 2025-11-23 08:19:57.174451794 +0000 UTC m=+0.226972910 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:19:57 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:19:57 np0005532585.localdomain podman[79657]: 2025-11-23 08:19:57.273446282 +0000 UTC m=+0.328720906 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:19:57 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:20:11 np0005532585.localdomain sudo[79734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:20:11 np0005532585.localdomain sudo[79734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:20:11 np0005532585.localdomain sudo[79734]: pam_unix(sudo:session): session closed for user root
Nov 23 08:20:11 np0005532585.localdomain sudo[79749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:20:11 np0005532585.localdomain sudo[79749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:20:12 np0005532585.localdomain sudo[79749]: pam_unix(sudo:session): session closed for user root
Nov 23 08:20:12 np0005532585.localdomain sudo[79795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:20:12 np0005532585.localdomain sudo[79795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:20:12 np0005532585.localdomain sudo[79795]: pam_unix(sudo:session): session closed for user root
Nov 23 08:20:16 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:20:16 np0005532585.localdomain recover_tripleo_nova_virtqemud[79811]: 61756
Nov 23 08:20:16 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:20:16 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:20:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:20:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:20:20 np0005532585.localdomain podman[79812]: 2025-11-23 08:20:20.054186471 +0000 UTC m=+0.103513788 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:20:20 np0005532585.localdomain podman[79812]: 2025-11-23 08:20:20.064153374 +0000 UTC m=+0.113480741 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc.)
Nov 23 08:20:20 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:20:20 np0005532585.localdomain podman[79813]: 2025-11-23 08:20:20.151280796 +0000 UTC m=+0.200593562 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:20:20 np0005532585.localdomain podman[79813]: 2025-11-23 08:20:20.188288956 +0000 UTC m=+0.237601692 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 
17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=iscsid, build-date=2025-11-18T23:44:13Z)
Nov 23 08:20:20 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:20:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:20:23 np0005532585.localdomain podman[79851]: 2025-11-23 08:20:23.049729967 +0000 UTC m=+0.102418116 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:20:23 np0005532585.localdomain podman[79851]: 2025-11-23 08:20:23.107544587 +0000 UTC m=+0.160232746 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:20:23 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:20:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:20:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:20:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:20:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:20:25 np0005532585.localdomain systemd[1]: tmp-crun.5hsHES.mount: Deactivated successfully.
Nov 23 08:20:25 np0005532585.localdomain podman[79877]: 2025-11-23 08:20:25.011743402 +0000 UTC m=+0.070770158 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:20:25 np0005532585.localdomain podman[79877]: 2025-11-23 08:20:25.023183487 +0000 UTC m=+0.082210273 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond)
Nov 23 08:20:25 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:20:25 np0005532585.localdomain systemd[1]: tmp-crun.kGNdKt.mount: Deactivated successfully.
Nov 23 08:20:25 np0005532585.localdomain podman[79878]: 2025-11-23 08:20:25.064808018 +0000 UTC m=+0.122184108 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Nov 23 08:20:25 np0005532585.localdomain podman[79878]: 2025-11-23 08:20:25.092127412 +0000 UTC m=+0.149503432 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:20:25 np0005532585.localdomain podman[79880]: 2025-11-23 08:20:25.098954486 +0000 UTC m=+0.154247527 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 23 08:20:25 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:20:25 np0005532585.localdomain podman[79879]: 2025-11-23 08:20:25.161102639 +0000 UTC m=+0.216613396 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:20:25 np0005532585.localdomain podman[79879]: 2025-11-23 08:20:25.205071537 +0000 UTC m=+0.260582304 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:20:25 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:20:25 np0005532585.localdomain podman[79880]: 2025-11-23 08:20:25.431666256 +0000 UTC m=+0.486959357 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_migration_target)
Nov 23 08:20:25 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:20:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:20:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:20:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:20:28 np0005532585.localdomain systemd[1]: tmp-crun.rbhSeb.mount: Deactivated successfully.
Nov 23 08:20:28 np0005532585.localdomain podman[79969]: 2025-11-23 08:20:28.023648983 +0000 UTC m=+0.079001012 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 23 08:20:28 np0005532585.localdomain podman[79969]: 2025-11-23 08:20:28.118081652 +0000 UTC m=+0.173433641 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:20:28 np0005532585.localdomain podman[79968]: 2025-11-23 08:20:28.124122743 +0000 UTC m=+0.182844788 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_id=tripleo_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:20:28 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:20:28 np0005532585.localdomain podman[79970]: 2025-11-23 08:20:28.169806819 +0000 UTC m=+0.222952385 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:20:28 np0005532585.localdomain podman[79970]: 2025-11-23 08:20:28.186096622 +0000 UTC m=+0.239242118 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:20:28 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:20:28 np0005532585.localdomain podman[79968]: 2025-11-23 08:20:28.315496443 +0000 UTC m=+0.374218498 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vcs-type=git)
Nov 23 08:20:28 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:20:29 np0005532585.localdomain systemd[1]: tmp-crun.CjKvx6.mount: Deactivated successfully.
Nov 23 08:20:47 np0005532585.localdomain sshd[78959]: pam_unix(sshd:session): session closed for user zuul
Nov 23 08:20:47 np0005532585.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Nov 23 08:20:47 np0005532585.localdomain systemd[1]: session-33.scope: Consumed 5.718s CPU time.
Nov 23 08:20:47 np0005532585.localdomain systemd-logind[761]: Session 33 logged out. Waiting for processes to exit.
Nov 23 08:20:47 np0005532585.localdomain systemd-logind[761]: Removed session 33.
Nov 23 08:20:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:20:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:20:51 np0005532585.localdomain systemd[1]: tmp-crun.ACNw0k.mount: Deactivated successfully.
Nov 23 08:20:51 np0005532585.localdomain podman[80089]: 2025-11-23 08:20:51.0081017 +0000 UTC m=+0.071777618 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:51:28Z, container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com)
Nov 23 08:20:51 np0005532585.localdomain podman[80089]: 2025-11-23 08:20:51.014568923 +0000 UTC m=+0.078244861 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd)
Nov 23 08:20:51 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:20:51 np0005532585.localdomain podman[80090]: 2025-11-23 08:20:51.041198278 +0000 UTC m=+0.100961225 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible)
Nov 23 08:20:51 np0005532585.localdomain podman[80090]: 2025-11-23 08:20:51.046935821 +0000 UTC m=+0.106698788 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:20:51 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:20:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:20:54 np0005532585.localdomain podman[80128]: 2025-11-23 08:20:54.024525698 +0000 UTC m=+0.081961656 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step5, release=1761123044, vcs-type=git)
Nov 23 08:20:54 np0005532585.localdomain podman[80128]: 2025-11-23 08:20:54.048713145 +0000 UTC m=+0.106149063 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:20:54 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:20:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:20:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:20:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:20:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:20:56 np0005532585.localdomain systemd[1]: tmp-crun.JMHFHX.mount: Deactivated successfully.
Nov 23 08:20:56 np0005532585.localdomain podman[80156]: 2025-11-23 08:20:56.040195145 +0000 UTC m=+0.093152034 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Nov 23 08:20:56 np0005532585.localdomain systemd[1]: tmp-crun.Z36nxq.mount: Deactivated successfully.
Nov 23 08:20:56 np0005532585.localdomain podman[80155]: 2025-11-23 08:20:56.092508979 +0000 UTC m=+0.145732985 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:20:56 np0005532585.localdomain podman[80154]: 2025-11-23 08:20:56.144177215 +0000 UTC m=+0.199498941 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4)
Nov 23 08:20:56 np0005532585.localdomain podman[80155]: 2025-11-23 08:20:56.150405121 +0000 UTC m=+0.203629157 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:20:56 np0005532585.localdomain podman[80154]: 2025-11-23 08:20:56.150911935 +0000 UTC m=+0.206233641 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack 
Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:20:56 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:20:56 np0005532585.localdomain podman[80157]: 2025-11-23 08:20:56.190074436 +0000 UTC m=+0.235674477 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, version=17.1.12)
Nov 23 08:20:56 np0005532585.localdomain podman[80156]: 2025-11-23 08:20:56.201008427 +0000 UTC m=+0.253965396 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true)
Nov 23 08:20:56 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:20:56 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:20:56 np0005532585.localdomain podman[80157]: 2025-11-23 08:20:56.550387499 +0000 UTC m=+0.595987520 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.41.4)
Nov 23 08:20:56 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:20:58 np0005532585.localdomain sshd[80246]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:20:58 np0005532585.localdomain sshd[80246]: Accepted publickey for zuul from 38.102.83.114 port 37676 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 08:20:58 np0005532585.localdomain systemd-logind[761]: New session 34 of user zuul.
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: Started Session 34 of User zuul.
Nov 23 08:20:58 np0005532585.localdomain sshd[80246]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: tmp-crun.WCEdOU.mount: Deactivated successfully.
Nov 23 08:20:58 np0005532585.localdomain podman[80255]: 2025-11-23 08:20:58.477577195 +0000 UTC m=+0.132548071 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:20:58 np0005532585.localdomain podman[80248]: 2025-11-23 08:20:58.532776142 +0000 UTC m=+0.196793485 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:20:58 np0005532585.localdomain sudo[80322]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bozshgsaldovajylemiezhziaefrfmno ; /usr/bin/python3
Nov 23 08:20:58 np0005532585.localdomain sudo[80322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 08:20:58 np0005532585.localdomain podman[80249]: 2025-11-23 08:20:58.449488138 +0000 UTC m=+0.108153909 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.12, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4)
Nov 23 08:20:58 np0005532585.localdomain podman[80248]: 2025-11-23 08:20:58.566255831 +0000 UTC m=+0.230273154 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:20:58 np0005532585.localdomain podman[80249]: 2025-11-23 08:20:58.58418488 +0000 UTC m=+0.242850661 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, managed_by=tripleo_ansible)
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:20:58 np0005532585.localdomain podman[80255]: 2025-11-23 08:20:58.671155567 +0000 UTC m=+0.326126383 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, architecture=x86_64, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:20:58 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:20:58 np0005532585.localdomain python3[80335]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 08:21:01 np0005532585.localdomain sudo[80322]: pam_unix(sudo:session): session closed for user root
Nov 23 08:21:13 np0005532585.localdomain sudo[80341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:21:13 np0005532585.localdomain sudo[80341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:21:13 np0005532585.localdomain sudo[80341]: pam_unix(sudo:session): session closed for user root
Nov 23 08:21:13 np0005532585.localdomain sudo[80356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:21:13 np0005532585.localdomain sudo[80356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:21:13 np0005532585.localdomain sudo[80356]: pam_unix(sudo:session): session closed for user root
Nov 23 08:21:16 np0005532585.localdomain sudo[80404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:21:16 np0005532585.localdomain sudo[80404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:21:16 np0005532585.localdomain sudo[80404]: pam_unix(sudo:session): session closed for user root
Nov 23 08:21:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:21:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:21:22 np0005532585.localdomain podman[80420]: 2025-11-23 08:21:22.025463429 +0000 UTC m=+0.082460141 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:21:22 np0005532585.localdomain podman[80420]: 2025-11-23 08:21:22.061165251 +0000 UTC m=+0.118161973 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, version=17.1.12, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64)
Nov 23 08:21:22 np0005532585.localdomain systemd[1]: tmp-crun.33AQVw.mount: Deactivated successfully.
Nov 23 08:21:22 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:21:22 np0005532585.localdomain podman[80419]: 2025-11-23 08:21:22.080961644 +0000 UTC m=+0.139262983 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 23 08:21:22 np0005532585.localdomain podman[80419]: 2025-11-23 08:21:22.090249977 +0000 UTC m=+0.148551306 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:21:22 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:21:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:21:25 np0005532585.localdomain podman[80459]: 2025-11-23 08:21:25.031026872 +0000 UTC m=+0.086284930 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Nov 23 08:21:25 np0005532585.localdomain podman[80459]: 2025-11-23 08:21:25.060116696 +0000 UTC m=+0.115374754 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step5, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:21:25 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:21:26 np0005532585.localdomain sudo[80497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqakprixgvavpvthehcyuekgchndljzt ; /usr/bin/python3
Nov 23 08:21:26 np0005532585.localdomain sudo[80497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: tmp-crun.wEZHos.mount: Deactivated successfully.
Nov 23 08:21:26 np0005532585.localdomain podman[80499]: 2025-11-23 08:21:26.494522161 +0000 UTC m=+0.061179287 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z)
Nov 23 08:21:26 np0005532585.localdomain podman[80501]: 2025-11-23 08:21:26.567544773 +0000 UTC m=+0.130769742 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:21:26 np0005532585.localdomain podman[80499]: 2025-11-23 08:21:26.57628099 +0000 UTC m=+0.142938096 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com)
Nov 23 08:21:26 np0005532585.localdomain podman[80502]: 2025-11-23 08:21:26.573361018 +0000 UTC m=+0.131880183 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, 
release=1761123044, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:21:26 np0005532585.localdomain podman[80501]: 2025-11-23 08:21:26.619267661 +0000 UTC m=+0.182492680 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:21:26 np0005532585.localdomain python3[80500]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 08:21:26 np0005532585.localdomain podman[80502]: 2025-11-23 08:21:26.660225992 +0000 UTC m=+0.218745227 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:21:26 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:21:26 np0005532585.localdomain podman[80557]: 2025-11-23 08:21:26.624384375 +0000 UTC m=+0.046380256 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git)
Nov 23 08:21:27 np0005532585.localdomain podman[80557]: 2025-11-23 08:21:27.010743476 +0000 UTC m=+0.432739347 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_migration_target, distribution-scope=public, tcib_managed=true, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:21:27 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:21:27 np0005532585.localdomain systemd[1]: tmp-crun.uhFfKp.mount: Deactivated successfully.
Nov 23 08:21:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:21:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:21:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:21:29 np0005532585.localdomain podman[80599]: 2025-11-23 08:21:29.006700183 +0000 UTC m=+0.067171647 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12)
Nov 23 08:21:29 np0005532585.localdomain systemd[1]: tmp-crun.TVDqaD.mount: Deactivated successfully.
Nov 23 08:21:29 np0005532585.localdomain podman[80600]: 2025-11-23 08:21:29.0559754 +0000 UTC m=+0.112732099 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller, vcs-type=git)
Nov 23 08:21:29 np0005532585.localdomain podman[80599]: 2025-11-23 08:21:29.068361122 +0000 UTC m=+0.128832566 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:21:29 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:21:29 np0005532585.localdomain systemd[1]: tmp-crun.w8ux2N.mount: Deactivated successfully.
Nov 23 08:21:29 np0005532585.localdomain podman[80598]: 2025-11-23 08:21:29.119268636 +0000 UTC m=+0.179670978 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 23 08:21:29 np0005532585.localdomain podman[80600]: 2025-11-23 08:21:29.171724165 +0000 UTC m=+0.228480894 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Nov 23 08:21:29 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:21:29 np0005532585.localdomain podman[80598]: 2025-11-23 08:21:29.345302739 +0000 UTC m=+0.405705091 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:21:29 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: run-ra5997fd30680444abb12eee4ba2ee058.service: Deactivated successfully.
Nov 23 08:21:30 np0005532585.localdomain systemd[1]: run-r343d4ce72bbd4bcb8d220b30f52876b7.service: Deactivated successfully.
Nov 23 08:21:31 np0005532585.localdomain sudo[80497]: pam_unix(sudo:session): session closed for user root
Nov 23 08:21:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:21:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:21:53 np0005532585.localdomain podman[80870]: 2025-11-23 08:21:53.022099409 +0000 UTC m=+0.079825615 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=)
Nov 23 08:21:53 np0005532585.localdomain podman[80870]: 2025-11-23 08:21:53.055675482 +0000 UTC m=+0.113401668 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, 
Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:21:53 np0005532585.localdomain podman[80869]: 2025-11-23 08:21:53.078523721 +0000 UTC m=+0.135424594 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:21:53 np0005532585.localdomain podman[80869]: 2025-11-23 08:21:53.112187096 +0000 UTC m=+0.169087979 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:21:53 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:21:53 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:21:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:21:56 np0005532585.localdomain systemd[1]: tmp-crun.iyf1wz.mount: Deactivated successfully.
Nov 23 08:21:56 np0005532585.localdomain podman[80910]: 2025-11-23 08:21:56.012001376 +0000 UTC m=+0.069629566 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:21:56 np0005532585.localdomain podman[80910]: 2025-11-23 08:21:56.040660359 +0000 UTC m=+0.098288559 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible)
Nov 23 08:21:56 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:21:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:21:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:21:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: tmp-crun.DhoIIM.mount: Deactivated successfully.
Nov 23 08:21:57 np0005532585.localdomain podman[80937]: 2025-11-23 08:21:57.021053823 +0000 UTC m=+0.078603590 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:21:57 np0005532585.localdomain podman[80937]: 2025-11-23 08:21:57.057873718 +0000 UTC m=+0.115423475 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, release=1761123044, architecture=x86_64, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: tmp-crun.cVek3J.mount: Deactivated successfully.
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:21:57 np0005532585.localdomain podman[80938]: 2025-11-23 08:21:57.070123255 +0000 UTC m=+0.124760000 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 23 08:21:57 np0005532585.localdomain podman[80939]: 2025-11-23 08:21:57.115407171 +0000 UTC m=+0.167141984 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=)
Nov 23 08:21:57 np0005532585.localdomain podman[80938]: 2025-11-23 08:21:57.127285177 +0000 UTC m=+0.181921912 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:21:57 np0005532585.localdomain podman[80939]: 2025-11-23 08:21:57.139989527 +0000 UTC m=+0.191724340 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true)
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:21:57 np0005532585.localdomain podman[80976]: 2025-11-23 08:21:57.18625929 +0000 UTC m=+0.141714811 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 23 08:21:57 np0005532585.localdomain podman[80976]: 2025-11-23 08:21:57.547709505 +0000 UTC m=+0.503164986 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044)
Nov 23 08:21:57 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:21:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:21:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:21:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:21:59 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:21:59 np0005532585.localdomain recover_tripleo_nova_virtqemud[81048]: 61756
Nov 23 08:21:59 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:21:59 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:22:00 np0005532585.localdomain systemd[1]: tmp-crun.24w6qu.mount: Deactivated successfully.
Nov 23 08:22:00 np0005532585.localdomain podman[81030]: 2025-11-23 08:22:00.039220061 +0000 UTC m=+0.094934634 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com)
Nov 23 08:22:00 np0005532585.localdomain systemd[1]: tmp-crun.bue58g.mount: Deactivated successfully.
Nov 23 08:22:00 np0005532585.localdomain podman[81032]: 2025-11-23 08:22:00.14035189 +0000 UTC m=+0.192448250 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git)
Nov 23 08:22:00 np0005532585.localdomain podman[81032]: 2025-11-23 08:22:00.172337408 +0000 UTC m=+0.224433778 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:22:00 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:22:00 np0005532585.localdomain podman[81031]: 2025-11-23 08:22:00.229108009 +0000 UTC m=+0.283572006 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, release=1761123044)
Nov 23 08:22:00 np0005532585.localdomain podman[81031]: 2025-11-23 08:22:00.296574712 +0000 UTC m=+0.351038679 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:22:00 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:22:00 np0005532585.localdomain podman[81030]: 2025-11-23 08:22:00.347310692 +0000 UTC m=+0.403025275 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=)
Nov 23 08:22:00 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:22:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:22:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4446 writes, 20K keys, 4446 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4446 writes, 451 syncs, 9.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:22:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:22:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 5196 writes, 22K keys, 5196 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5196 writes, 612 syncs, 8.49 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:22:09 np0005532585.localdomain sshd[81105]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:22:10 np0005532585.localdomain sshd[81105]: Invalid user support from 78.128.112.74 port 57964
Nov 23 08:22:10 np0005532585.localdomain sshd[81105]: Connection closed by invalid user support 78.128.112.74 port 57964 [preauth]
Nov 23 08:22:10 np0005532585.localdomain sudo[81120]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrcloqnoshszqqcptkuocchuexjfvbvr ; /usr/bin/python3
Nov 23 08:22:10 np0005532585.localdomain sudo[81120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 08:22:10 np0005532585.localdomain python3[81122]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 08:22:13 np0005532585.localdomain rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 08:22:13 np0005532585.localdomain rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 08:22:16 np0005532585.localdomain sudo[81253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:22:16 np0005532585.localdomain sudo[81253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:22:16 np0005532585.localdomain sudo[81253]: pam_unix(sudo:session): session closed for user root
Nov 23 08:22:16 np0005532585.localdomain sudo[81268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:22:16 np0005532585.localdomain sudo[81268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:22:17 np0005532585.localdomain sudo[81268]: pam_unix(sudo:session): session closed for user root
Nov 23 08:22:18 np0005532585.localdomain sudo[81120]: pam_unix(sudo:session): session closed for user root
Nov 23 08:22:18 np0005532585.localdomain sudo[81373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:22:18 np0005532585.localdomain sudo[81373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:22:18 np0005532585.localdomain sudo[81373]: pam_unix(sudo:session): session closed for user root
Nov 23 08:22:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:22:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:22:24 np0005532585.localdomain podman[81388]: 2025-11-23 08:22:24.00209739 +0000 UTC m=+0.058101649 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.12, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:22:24 np0005532585.localdomain podman[81388]: 2025-11-23 08:22:24.013231585 +0000 UTC m=+0.069235864 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 23 08:22:24 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:22:24 np0005532585.localdomain podman[81389]: 2025-11-23 08:22:24.04827564 +0000 UTC m=+0.098923128 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 
17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:22:24 np0005532585.localdomain podman[81389]: 2025-11-23 08:22:24.08634876 +0000 UTC m=+0.136996298 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, 
com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044)
Nov 23 08:22:24 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:22:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:22:27 np0005532585.localdomain podman[81428]: 2025-11-23 08:22:27.016821499 +0000 UTC m=+0.073104905 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1)
Nov 23 08:22:27 np0005532585.localdomain podman[81428]: 2025-11-23 08:22:27.045212185 +0000 UTC m=+0.101495541 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:22:27 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:22:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:22:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:22:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:22:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:22:28 np0005532585.localdomain systemd[1]: tmp-crun.BMTKyZ.mount: Deactivated successfully.
Nov 23 08:22:28 np0005532585.localdomain podman[81455]: 2025-11-23 08:22:28.041732177 +0000 UTC m=+0.093658147 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1)
Nov 23 08:22:28 np0005532585.localdomain systemd[1]: tmp-crun.nYjnMs.mount: Deactivated successfully.
Nov 23 08:22:28 np0005532585.localdomain podman[81454]: 2025-11-23 08:22:28.090944124 +0000 UTC m=+0.146639611 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4)
Nov 23 08:22:28 np0005532585.localdomain podman[81454]: 2025-11-23 08:22:28.098609872 +0000 UTC m=+0.154305369 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 08:22:28 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:22:28 np0005532585.localdomain podman[81456]: 2025-11-23 08:22:28.147088077 +0000 UTC m=+0.195075986 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:22:28 np0005532585.localdomain podman[81456]: 2025-11-23 08:22:28.170156691 +0000 UTC m=+0.218144650 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc.)
Nov 23 08:22:28 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:22:28 np0005532585.localdomain podman[81455]: 2025-11-23 08:22:28.221511017 +0000 UTC m=+0.273437007 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:22:28 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:22:28 np0005532585.localdomain podman[81462]: 2025-11-23 08:22:28.07316591 +0000 UTC m=+0.114664574 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team)
Nov 23 08:22:28 np0005532585.localdomain podman[81462]: 2025-11-23 08:22:28.42071628 +0000 UTC m=+0.462214974 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:22:28 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:22:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:22:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:22:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:22:31 np0005532585.localdomain podman[81548]: 2025-11-23 08:22:31.020787826 +0000 UTC m=+0.079481367 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:22:31 np0005532585.localdomain systemd[1]: tmp-crun.SZarGf.mount: Deactivated successfully.
Nov 23 08:22:31 np0005532585.localdomain podman[81550]: 2025-11-23 08:22:31.038131607 +0000 UTC m=+0.088967465 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller)
Nov 23 08:22:31 np0005532585.localdomain podman[81549]: 2025-11-23 08:22:31.081108756 +0000 UTC m=+0.133296012 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, release=1761123044)
Nov 23 08:22:31 np0005532585.localdomain podman[81550]: 2025-11-23 08:22:31.133118862 +0000 UTC m=+0.183954750 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 23 08:22:31 np0005532585.localdomain podman[81549]: 2025-11-23 08:22:31.140354847 +0000 UTC m=+0.192542143 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:22:31 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:22:31 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:22:31 np0005532585.localdomain podman[81548]: 2025-11-23 08:22:31.198206859 +0000 UTC m=+0.256900430 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:22:31 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:22:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:22:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:22:55 np0005532585.localdomain systemd[1]: tmp-crun.6eTDD8.mount: Deactivated successfully.
Nov 23 08:22:55 np0005532585.localdomain podman[81668]: 2025-11-23 08:22:55.036420334 +0000 UTC m=+0.087476004 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:22:55 np0005532585.localdomain podman[81668]: 2025-11-23 08:22:55.048203896 +0000 UTC m=+0.099259546 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 08:22:55 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:22:55 np0005532585.localdomain podman[81669]: 2025-11-23 08:22:55.134684219 +0000 UTC m=+0.181605765 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z)
Nov 23 08:22:55 np0005532585.localdomain podman[81669]: 2025-11-23 08:22:55.174300773 +0000 UTC m=+0.221222329 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:22:55 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:22:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:22:58 np0005532585.localdomain systemd[1]: tmp-crun.mLA1dl.mount: Deactivated successfully.
Nov 23 08:22:58 np0005532585.localdomain podman[81709]: 2025-11-23 08:22:58.016424907 +0000 UTC m=+0.070982431 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.expose-services=)
Nov 23 08:22:58 np0005532585.localdomain podman[81709]: 2025-11-23 08:22:58.046176536 +0000 UTC m=+0.100734050 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step5, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 23 08:22:58 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:22:58 np0005532585.localdomain sshd[81735]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:22:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:22:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:22:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:22:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:22:59 np0005532585.localdomain systemd[1]: tmp-crun.027OmC.mount: Deactivated successfully.
Nov 23 08:22:59 np0005532585.localdomain podman[81738]: 2025-11-23 08:22:59.06517186 +0000 UTC m=+0.113863912 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 23 08:22:59 np0005532585.localdomain podman[81739]: 2025-11-23 08:22:59.032100952 +0000 UTC m=+0.078162745 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:22:59 np0005532585.localdomain podman[81736]: 2025-11-23 08:22:59.083082605 +0000 UTC m=+0.138000092 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.component=openstack-cron-container, container_name=logrotate_crond, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z)
Nov 23 08:22:59 np0005532585.localdomain podman[81738]: 2025-11-23 08:22:59.116263426 +0000 UTC m=+0.164955468 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Nov 23 08:22:59 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:22:59 np0005532585.localdomain podman[81737]: 2025-11-23 08:22:59.128236243 +0000 UTC m=+0.178848082 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:22:59 np0005532585.localdomain podman[81736]: 2025-11-23 08:22:59.167218438 +0000 UTC m=+0.222135855 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:22:59 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:22:59 np0005532585.localdomain podman[81737]: 2025-11-23 08:22:59.178309179 +0000 UTC m=+0.228920998 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:22:59 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:22:59 np0005532585.localdomain podman[81739]: 2025-11-23 08:22:59.402395582 +0000 UTC m=+0.448457405 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_migration_target, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 23 08:22:59 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:23:00 np0005532585.localdomain systemd[1]: tmp-crun.zM9UEk.mount: Deactivated successfully.
Nov 23 08:23:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:23:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:23:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:23:02 np0005532585.localdomain podman[81831]: 2025-11-23 08:23:02.028431463 +0000 UTC m=+0.083624258 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Nov 23 08:23:02 np0005532585.localdomain podman[81830]: 2025-11-23 08:23:02.007474697 +0000 UTC m=+0.066708143 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:23:02 np0005532585.localdomain podman[81832]: 2025-11-23 08:23:02.073477348 +0000 UTC m=+0.126923111 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:23:02 np0005532585.localdomain podman[81831]: 2025-11-23 08:23:02.08522867 +0000 UTC m=+0.140421485 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044)
Nov 23 08:23:02 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:23:02 np0005532585.localdomain podman[81832]: 2025-11-23 08:23:02.098243208 +0000 UTC m=+0.151688971 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:23:02 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:23:02 np0005532585.localdomain podman[81830]: 2025-11-23 08:23:02.189156304 +0000 UTC m=+0.248389730 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:23:02 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:23:03 np0005532585.localdomain sshd[81735]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 08:23:03 np0005532585.localdomain sshd[81735]: Connection closed by 111.39.167.59 port 47974
Nov 23 08:23:05 np0005532585.localdomain sudo[81916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fusmsioiomstfonqwrbzmcfpgtyuuimg ; /usr/bin/python3
Nov 23 08:23:05 np0005532585.localdomain sudo[81916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 08:23:05 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:23:05 np0005532585.localdomain recover_tripleo_nova_virtqemud[81920]: 61756
Nov 23 08:23:05 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:23:05 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:23:05 np0005532585.localdomain python3[81918]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 08:23:08 np0005532585.localdomain rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 08:23:09 np0005532585.localdomain rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 08:23:12 np0005532585.localdomain sudo[81916]: pam_unix(sudo:session): session closed for user root
Nov 23 08:23:18 np0005532585.localdomain sudo[82108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:23:18 np0005532585.localdomain sudo[82108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:23:18 np0005532585.localdomain sudo[82108]: pam_unix(sudo:session): session closed for user root
Nov 23 08:23:19 np0005532585.localdomain sudo[82123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 08:23:19 np0005532585.localdomain sudo[82123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:23:19 np0005532585.localdomain sudo[82123]: pam_unix(sudo:session): session closed for user root
Nov 23 08:23:19 np0005532585.localdomain sudo[82158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:23:19 np0005532585.localdomain sudo[82158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:23:19 np0005532585.localdomain sudo[82158]: pam_unix(sudo:session): session closed for user root
Nov 23 08:23:19 np0005532585.localdomain sudo[82173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:23:19 np0005532585.localdomain sudo[82173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:23:20 np0005532585.localdomain sudo[82173]: pam_unix(sudo:session): session closed for user root
Nov 23 08:23:20 np0005532585.localdomain sudo[82221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:23:20 np0005532585.localdomain sudo[82221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:23:20 np0005532585.localdomain sudo[82221]: pam_unix(sudo:session): session closed for user root
Nov 23 08:23:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:23:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:23:26 np0005532585.localdomain podman[82236]: 2025-11-23 08:23:26.011836279 +0000 UTC m=+0.065981542 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:51:28Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:23:26 np0005532585.localdomain podman[82236]: 2025-11-23 08:23:26.022189417 +0000 UTC m=+0.076334660 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:23:26 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:23:26 np0005532585.localdomain podman[82237]: 2025-11-23 08:23:26.025049913 +0000 UTC m=+0.076735563 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:23:26 np0005532585.localdomain podman[82237]: 2025-11-23 08:23:26.104734523 +0000 UTC m=+0.156420193 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, container_name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:23:26 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:23:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:23:29 np0005532585.localdomain podman[82274]: 2025-11-23 08:23:29.032051603 +0000 UTC m=+0.087689951 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:23:29 np0005532585.localdomain podman[82274]: 2025-11-23 08:23:29.063327907 +0000 UTC m=+0.118966265 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, config_id=tripleo_step5, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:23:29 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:23:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:23:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:23:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:23:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:23:29 np0005532585.localdomain podman[82302]: 2025-11-23 08:23:29.99492641 +0000 UTC m=+0.055213310 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:23:30 np0005532585.localdomain podman[82302]: 2025-11-23 08:23:30.016867895 +0000 UTC m=+0.077154875 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:12:45Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:23:30 np0005532585.localdomain systemd[1]: tmp-crun.cJzA1c.mount: Deactivated successfully.
Nov 23 08:23:30 np0005532585.localdomain podman[82303]: 2025-11-23 08:23:30.030243675 +0000 UTC m=+0.084564877 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Nov 23 08:23:30 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:23:30 np0005532585.localdomain podman[82301]: 2025-11-23 08:23:30.072073854 +0000 UTC m=+0.128860089 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:23:30 np0005532585.localdomain podman[82301]: 2025-11-23 08:23:30.121204352 +0000 UTC m=+0.177990567 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12)
Nov 23 08:23:30 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:23:30 np0005532585.localdomain podman[82300]: 2025-11-23 08:23:30.139024574 +0000 UTC m=+0.198065687 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:23:30 np0005532585.localdomain podman[82300]: 2025-11-23 08:23:30.148202238 +0000 UTC m=+0.207243391 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 23 08:23:30 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:23:30 np0005532585.localdomain podman[82303]: 2025-11-23 08:23:30.39336177 +0000 UTC m=+0.447682962 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public)
Nov 23 08:23:30 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:23:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:23:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:23:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:23:32 np0005532585.localdomain systemd[1]: tmp-crun.LmGFK0.mount: Deactivated successfully.
Nov 23 08:23:33 np0005532585.localdomain systemd[1]: tmp-crun.PxXjXV.mount: Deactivated successfully.
Nov 23 08:23:33 np0005532585.localdomain podman[82391]: 2025-11-23 08:23:33.054734517 +0000 UTC m=+0.110649607 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:23:33 np0005532585.localdomain podman[82393]: 2025-11-23 08:23:33.00564985 +0000 UTC m=+0.063819237 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:23:33 np0005532585.localdomain podman[82392]: 2025-11-23 08:23:33.139352374 +0000 UTC m=+0.193265154 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:23:33 np0005532585.localdomain podman[82392]: 2025-11-23 08:23:33.169576417 +0000 UTC m=+0.223489177 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team)
Nov 23 08:23:33 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:23:33 np0005532585.localdomain podman[82393]: 2025-11-23 08:23:33.186975776 +0000 UTC m=+0.245145163 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com)
Nov 23 08:23:33 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:23:33 np0005532585.localdomain podman[82391]: 2025-11-23 08:23:33.252497553 +0000 UTC m=+0.308412653 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1761123044, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:23:33 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:23:34 np0005532585.localdomain python3[82480]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 23 08:23:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:23:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:23:57 np0005532585.localdomain podman[82527]: 2025-11-23 08:23:57.031099102 +0000 UTC m=+0.081997620 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:23:57 np0005532585.localdomain podman[82527]: 2025-11-23 08:23:57.042434651 +0000 UTC m=+0.093333109 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, 
build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container)
Nov 23 08:23:57 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:23:57 np0005532585.localdomain podman[82526]: 2025-11-23 08:23:57.13045796 +0000 UTC m=+0.185727138 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 23 08:23:57 np0005532585.localdomain podman[82526]: 2025-11-23 08:23:57.139973514 +0000 UTC m=+0.195242702 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:23:57 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:23:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:24:00 np0005532585.localdomain podman[82564]: 2025-11-23 08:24:00.015920519 +0000 UTC m=+0.073045613 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:24:00 np0005532585.localdomain podman[82564]: 2025-11-23 08:24:00.041728359 +0000 UTC m=+0.098853403 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.12)
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:24:00 np0005532585.localdomain podman[82590]: 2025-11-23 08:24:00.151961592 +0000 UTC m=+0.077011491 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:24:00 np0005532585.localdomain podman[82590]: 2025-11-23 08:24:00.210258603 +0000 UTC m=+0.135308472 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:24:00 np0005532585.localdomain podman[82624]: 2025-11-23 08:24:00.276944555 +0000 UTC m=+0.081310960 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, 
managed_by=tripleo_ansible, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12)
Nov 23 08:24:00 np0005532585.localdomain podman[82624]: 2025-11-23 08:24:00.314525507 +0000 UTC m=+0.118891942 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., 
config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:24:00 np0005532585.localdomain podman[82610]: 2025-11-23 08:24:00.357696207 +0000 UTC m=+0.188202952 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, 
name=rhosp17/openstack-ceilometer-compute, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:24:00 np0005532585.localdomain podman[82610]: 2025-11-23 08:24:00.390185977 +0000 UTC m=+0.220692722 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:24:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:24:01 np0005532585.localdomain podman[82662]: 2025-11-23 08:24:01.019242625 +0000 UTC m=+0.075906958 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, tcib_managed=true)
Nov 23 08:24:01 np0005532585.localdomain podman[82662]: 2025-11-23 08:24:01.403280905 +0000 UTC m=+0.459945248 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Nov 23 08:24:01 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:24:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:24:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:24:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:24:04 np0005532585.localdomain podman[82685]: 2025-11-23 08:24:04.014427919 +0000 UTC m=+0.076927028 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:24:04 np0005532585.localdomain systemd[1]: tmp-crun.2x1J0B.mount: Deactivated successfully.
Nov 23 08:24:04 np0005532585.localdomain podman[82686]: 2025-11-23 08:24:04.071958208 +0000 UTC m=+0.130107688 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Nov 23 08:24:04 np0005532585.localdomain podman[82687]: 2025-11-23 08:24:04.09781961 +0000 UTC m=+0.150535437 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:24:04 np0005532585.localdomain podman[82686]: 2025-11-23 08:24:04.104075557 +0000 UTC m=+0.162225037 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:24:04 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:24:04 np0005532585.localdomain podman[82687]: 2025-11-23 08:24:04.152284016 +0000 UTC m=+0.204999853 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, container_name=ovn_controller, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Nov 23 08:24:04 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:24:04 np0005532585.localdomain podman[82685]: 2025-11-23 08:24:04.192273681 +0000 UTC m=+0.254772840 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true)
Nov 23 08:24:04 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:24:05 np0005532585.localdomain systemd[1]: tmp-crun.mNEBVf.mount: Deactivated successfully.
Nov 23 08:24:20 np0005532585.localdomain sudo[82757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:24:21 np0005532585.localdomain sudo[82757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:24:21 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:24:21 np0005532585.localdomain sudo[82757]: pam_unix(sudo:session): session closed for user root
Nov 23 08:24:21 np0005532585.localdomain recover_tripleo_nova_virtqemud[82773]: 61756
Nov 23 08:24:21 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:24:21 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:24:21 np0005532585.localdomain sudo[82774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:24:21 np0005532585.localdomain sudo[82774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:24:21 np0005532585.localdomain sudo[82774]: pam_unix(sudo:session): session closed for user root
Nov 23 08:24:22 np0005532585.localdomain sudo[82822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:24:22 np0005532585.localdomain sudo[82822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:24:22 np0005532585.localdomain sudo[82822]: pam_unix(sudo:session): session closed for user root
Nov 23 08:24:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:24:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:24:28 np0005532585.localdomain podman[82838]: 2025-11-23 08:24:28.034694673 +0000 UTC m=+0.087470634 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:24:28 np0005532585.localdomain podman[82838]: 2025-11-23 08:24:28.045287509 +0000 UTC m=+0.098063450 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:24:28 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:24:28 np0005532585.localdomain systemd[1]: tmp-crun.hIH4GE.mount: Deactivated successfully.
Nov 23 08:24:28 np0005532585.localdomain podman[82837]: 2025-11-23 08:24:28.128181465 +0000 UTC m=+0.181171172 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 
collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, release=1761123044)
Nov 23 08:24:28 np0005532585.localdomain podman[82837]: 2025-11-23 08:24:28.136174614 +0000 UTC m=+0.189164251 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container)
Nov 23 08:24:28 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:24:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:24:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:24:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:24:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:24:31 np0005532585.localdomain systemd[1]: tmp-crun.b6iQna.mount: Deactivated successfully.
Nov 23 08:24:31 np0005532585.localdomain podman[82887]: 2025-11-23 08:24:31.070550163 +0000 UTC m=+0.115950444 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12)
Nov 23 08:24:31 np0005532585.localdomain podman[82878]: 2025-11-23 08:24:31.113472745 +0000 UTC m=+0.171608717 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Nov 23 08:24:31 np0005532585.localdomain podman[82879]: 2025-11-23 08:24:31.04199116 +0000 UTC m=+0.093365349 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 23 08:24:31 np0005532585.localdomain podman[82880]: 2025-11-23 08:24:31.100553669 +0000 UTC m=+0.150442605 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:24:31 np0005532585.localdomain podman[82879]: 2025-11-23 08:24:31.176119036 +0000 UTC m=+0.227493235 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 23 08:24:31 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:24:31 np0005532585.localdomain podman[82878]: 2025-11-23 08:24:31.197169395 +0000 UTC m=+0.255305367 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, managed_by=tripleo_ansible)
Nov 23 08:24:31 np0005532585.localdomain podman[82887]: 2025-11-23 08:24:31.2043927 +0000 UTC m=+0.249792961 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:24:31 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:24:31 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:24:31 np0005532585.localdomain podman[82880]: 2025-11-23 08:24:31.23083261 +0000 UTC m=+0.280721576 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:24:31 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:24:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:24:31 np0005532585.localdomain podman[82978]: 2025-11-23 08:24:31.990435467 +0000 UTC m=+0.051978814 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1)
Nov 23 08:24:32 np0005532585.localdomain podman[82978]: 2025-11-23 08:24:32.365376075 +0000 UTC m=+0.426919382 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_migration_target, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 23 08:24:32 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:24:34 np0005532585.localdomain sshd[80282]: Received disconnect from 38.102.83.114 port 37676:11: disconnected by user
Nov 23 08:24:34 np0005532585.localdomain sshd[80282]: Disconnected from user zuul 38.102.83.114 port 37676
Nov 23 08:24:34 np0005532585.localdomain sshd[80246]: pam_unix(sshd:session): session closed for user zuul
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: session-34.scope: Consumed 19.353s CPU time.
Nov 23 08:24:34 np0005532585.localdomain systemd-logind[761]: Session 34 logged out. Waiting for processes to exit.
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:24:34 np0005532585.localdomain systemd-logind[761]: Removed session 34.
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: tmp-crun.WT0TN7.mount: Deactivated successfully.
Nov 23 08:24:34 np0005532585.localdomain podman[83002]: 2025-11-23 08:24:34.70796627 +0000 UTC m=+0.085917337 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: tmp-crun.FFAgr5.mount: Deactivated successfully.
Nov 23 08:24:34 np0005532585.localdomain podman[83001]: 2025-11-23 08:24:34.767003744 +0000 UTC m=+0.147356763 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc.)
Nov 23 08:24:34 np0005532585.localdomain podman[83003]: 2025-11-23 08:24:34.809114451 +0000 UTC m=+0.183498132 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12)
Nov 23 08:24:34 np0005532585.localdomain podman[83002]: 2025-11-23 08:24:34.837919421 +0000 UTC m=+0.215870448 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:24:34 np0005532585.localdomain podman[83003]: 2025-11-23 08:24:34.859222578 +0000 UTC m=+0.233606179 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, container_name=ovn_controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, 
name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:24:34 np0005532585.localdomain podman[83001]: 2025-11-23 08:24:34.950233596 +0000 UTC m=+0.330586565 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 23 08:24:34 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:24:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:24:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:24:59 np0005532585.localdomain podman[83125]: 2025-11-23 08:24:59.02831506 +0000 UTC m=+0.080693371 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 23 08:24:59 np0005532585.localdomain podman[83125]: 2025-11-23 08:24:59.041272508 +0000 UTC m=+0.093650809 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com)
Nov 23 08:24:59 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:24:59 np0005532585.localdomain podman[83126]: 2025-11-23 08:24:59.132176522 +0000 UTC m=+0.184202292 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, release=1761123044, vcs-type=git, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:24:59 np0005532585.localdomain podman[83126]: 2025-11-23 08:24:59.169358693 +0000 UTC m=+0.221384443 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:24:59 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:25:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:25:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:25:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:25:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:25:02 np0005532585.localdomain podman[83168]: 2025-11-23 08:25:02.04968954 +0000 UTC m=+0.095003299 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4)
Nov 23 08:25:02 np0005532585.localdomain podman[83165]: 2025-11-23 08:25:02.086904801 +0000 UTC m=+0.141731335 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:25:02 np0005532585.localdomain podman[83165]: 2025-11-23 08:25:02.094493178 +0000 UTC m=+0.149319762 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, vcs-type=git, 
architecture=x86_64, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:25:02 np0005532585.localdomain podman[83168]: 2025-11-23 08:25:02.105020952 +0000 UTC m=+0.150334671 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_id=tripleo_step5, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64)
Nov 23 08:25:02 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:25:02 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:25:02 np0005532585.localdomain podman[83167]: 2025-11-23 08:25:02.139402129 +0000 UTC m=+0.187866322 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:12:45Z, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi)
Nov 23 08:25:02 np0005532585.localdomain podman[83166]: 2025-11-23 08:25:02.19871303 +0000 UTC m=+0.249887144 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Nov 23 08:25:02 np0005532585.localdomain podman[83167]: 2025-11-23 08:25:02.21679856 +0000 UTC m=+0.265262783 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 23 08:25:02 np0005532585.localdomain podman[83166]: 2025-11-23 08:25:02.228730277 +0000 UTC m=+0.279904361 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:25:02 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:25:02 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:25:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:25:03 np0005532585.localdomain podman[83269]: 2025-11-23 08:25:03.021258457 +0000 UTC m=+0.079889607 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute)
Nov 23 08:25:03 np0005532585.localdomain podman[83269]: 2025-11-23 08:25:03.413963186 +0000 UTC m=+0.472594376 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, tcib_managed=true, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:25:03 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:25:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:25:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:25:05 np0005532585.localdomain podman[83293]: 2025-11-23 08:25:05.006294612 +0000 UTC m=+0.068043353 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:25:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:25:05 np0005532585.localdomain systemd[1]: tmp-crun.jFzhTM.mount: Deactivated successfully.
Nov 23 08:25:05 np0005532585.localdomain podman[83323]: 2025-11-23 08:25:05.094530558 +0000 UTC m=+0.065054115 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 23 08:25:05 np0005532585.localdomain podman[83294]: 2025-11-23 08:25:05.062415238 +0000 UTC m=+0.119320184 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Nov 23 08:25:05 np0005532585.localdomain podman[83293]: 2025-11-23 08:25:05.119515054 +0000 UTC m=+0.181263735 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:25:05 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:25:05 np0005532585.localdomain podman[83294]: 2025-11-23 08:25:05.147265423 +0000 UTC m=+0.204170409 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:25:05 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:25:05 np0005532585.localdomain podman[83323]: 2025-11-23 08:25:05.290188712 +0000 UTC m=+0.260712229 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true)
Nov 23 08:25:05 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:25:22 np0005532585.localdomain sudo[83367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:25:22 np0005532585.localdomain sudo[83367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:25:22 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:25:22 np0005532585.localdomain sudo[83367]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:22 np0005532585.localdomain recover_tripleo_nova_virtqemud[83383]: 61756
Nov 23 08:25:22 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:25:22 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:25:22 np0005532585.localdomain sudo[83384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 08:25:22 np0005532585.localdomain sudo[83384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:25:23 np0005532585.localdomain podman[83471]: 2025-11-23 08:25:23.410498733 +0000 UTC m=+0.080060312 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 08:25:23 np0005532585.localdomain podman[83471]: 2025-11-23 08:25:23.515631053 +0000 UTC m=+0.185192642 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7)
Nov 23 08:25:23 np0005532585.localdomain sudo[83384]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:23 np0005532585.localdomain sudo[83538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:25:23 np0005532585.localdomain sudo[83538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:25:23 np0005532585.localdomain sudo[83538]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:23 np0005532585.localdomain sudo[83553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:25:23 np0005532585.localdomain sudo[83553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:25:24 np0005532585.localdomain sudo[83553]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:25 np0005532585.localdomain sudo[83601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:25:25 np0005532585.localdomain sudo[83601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:25:25 np0005532585.localdomain sudo[83601]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:25:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:25:30 np0005532585.localdomain podman[83616]: 2025-11-23 08:25:30.018923086 +0000 UTC m=+0.075128055 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, url=https://www.redhat.com)
Nov 23 08:25:30 np0005532585.localdomain podman[83616]: 2025-11-23 08:25:30.059294771 +0000 UTC m=+0.115499740 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:25:30 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:25:30 np0005532585.localdomain podman[83617]: 2025-11-23 08:25:30.075969769 +0000 UTC m=+0.132132447 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:25:30 np0005532585.localdomain podman[83617]: 2025-11-23 08:25:30.084231726 +0000 UTC m=+0.140394404 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.4)
Nov 23 08:25:30 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:25:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:25:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:25:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:25:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:25:33 np0005532585.localdomain systemd[1]: tmp-crun.QX6aIr.mount: Deactivated successfully.
Nov 23 08:25:33 np0005532585.localdomain podman[83656]: 2025-11-23 08:25:33.044930622 +0000 UTC m=+0.095884584 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Nov 23 08:25:33 np0005532585.localdomain podman[83656]: 2025-11-23 08:25:33.055052295 +0000 UTC m=+0.106006257 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
com.redhat.component=openstack-cron-container, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-cron, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:25:33 np0005532585.localdomain podman[83662]: 2025-11-23 08:25:33.065085874 +0000 UTC m=+0.106923144 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, version=17.1.12, config_id=tripleo_step5, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044)
Nov 23 08:25:33 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:25:33 np0005532585.localdomain podman[83662]: 2025-11-23 08:25:33.096199703 +0000 UTC m=+0.138037003 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, version=17.1.12)
Nov 23 08:25:33 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:25:33 np0005532585.localdomain podman[83658]: 2025-11-23 08:25:33.111172221 +0000 UTC m=+0.153932248 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Nov 23 08:25:33 np0005532585.localdomain podman[83657]: 2025-11-23 08:25:33.161088511 +0000 UTC m=+0.207643492 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, distribution-scope=public)
Nov 23 08:25:33 np0005532585.localdomain podman[83657]: 2025-11-23 08:25:33.214663092 +0000 UTC m=+0.261218073 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:25:33 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:25:33 np0005532585.localdomain podman[83658]: 2025-11-23 08:25:33.26615729 +0000 UTC m=+0.308917357 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., architecture=x86_64)
Nov 23 08:25:33 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:25:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:25:34 np0005532585.localdomain podman[83753]: 2025-11-23 08:25:34.004324336 +0000 UTC m=+0.067548279 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 23 08:25:34 np0005532585.localdomain podman[83753]: 2025-11-23 08:25:34.380239094 +0000 UTC m=+0.443462987 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:25:34 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:25:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:25:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:25:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:25:36 np0005532585.localdomain podman[83854]: 2025-11-23 08:25:36.010916307 +0000 UTC m=+0.066776666 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 23 08:25:36 np0005532585.localdomain systemd[1]: tmp-crun.zrOZ8K.mount: Deactivated successfully.
Nov 23 08:25:36 np0005532585.localdomain podman[83855]: 2025-11-23 08:25:36.092472342 +0000 UTC m=+0.144499247 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible)
Nov 23 08:25:36 np0005532585.localdomain podman[83856]: 2025-11-23 08:25:36.170359929 +0000 UTC m=+0.218861548 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step4, tcib_managed=true, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:25:36 np0005532585.localdomain podman[83855]: 2025-11-23 08:25:36.202716125 +0000 UTC m=+0.254743010 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, io.openshift.expose-services=)
Nov 23 08:25:36 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:25:36 np0005532585.localdomain podman[83856]: 2025-11-23 08:25:36.214684893 +0000 UTC m=+0.263186542 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=)
Nov 23 08:25:36 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:25:36 np0005532585.localdomain podman[83854]: 2025-11-23 08:25:36.270812889 +0000 UTC m=+0.326673308 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible)
Nov 23 08:25:36 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:25:41 np0005532585.localdomain sudo[84230]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpzf3aybgk/privsep.sock
Nov 23 08:25:41 np0005532585.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 23 08:25:41 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 23 08:25:41 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 08:25:41 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 08:25:41 np0005532585.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Queued start job for default target Main User Target.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Created slice User Application Slice.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Reached target Paths.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Reached target Timers.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Starting D-Bus User Message Bus Socket...
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Starting Create User's Volatile Files and Directories...
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Finished Create User's Volatile Files and Directories.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Listening on D-Bus User Message Bus Socket.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Reached target Sockets.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Reached target Basic System.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Reached target Main User Target.
Nov 23 08:25:41 np0005532585.localdomain systemd[84232]: Startup finished in 156ms.
Nov 23 08:25:41 np0005532585.localdomain systemd[1]: Started User Manager for UID 0.
Nov 23 08:25:41 np0005532585.localdomain systemd[1]: Started Session c11 of User root.
Nov 23 08:25:41 np0005532585.localdomain sudo[84230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 23 08:25:42 np0005532585.localdomain sudo[84230]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:43 np0005532585.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 23 08:25:43 np0005532585.localdomain kernel: device tapd3912d14-a3 entered promiscuous mode
Nov 23 08:25:43 np0005532585.localdomain NetworkManager[5975]: <info>  [1763886343.0359] manager: (tapd3912d14-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Nov 23 08:25:43 np0005532585.localdomain systemd-udevd[84267]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 08:25:43 np0005532585.localdomain NetworkManager[5975]: <info>  [1763886343.0556] device (tapd3912d14-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 08:25:43 np0005532585.localdomain NetworkManager[5975]: <info>  [1763886343.0564] device (tapd3912d14-a3): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 23 08:25:43 np0005532585.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 23 08:25:43 np0005532585.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 23 08:25:43 np0005532585.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 23 08:25:43 np0005532585.localdomain systemd-machined[84275]: New machine qemu-1-instance-00000002.
Nov 23 08:25:43 np0005532585.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Nov 23 08:25:43 np0005532585.localdomain NetworkManager[5975]: <info>  [1763886343.2965] manager: (tapbcac49fc-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Nov 23 08:25:43 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c1: link becomes ready
Nov 23 08:25:43 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c0: link becomes ready
Nov 23 08:25:43 np0005532585.localdomain NetworkManager[5975]: <info>  [1763886343.3419] device (tapbcac49fc-c0): carrier: link connected
Nov 23 08:25:43 np0005532585.localdomain kernel: device tapbcac49fc-c0 entered promiscuous mode
Nov 23 08:25:44 np0005532585.localdomain sudo[84370]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 haproxy -f /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf
Nov 23 08:25:44 np0005532585.localdomain sudo[84370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 23 08:25:45 np0005532585.localdomain podman[84392]: 2025-11-23 08:25:45.049232611 +0000 UTC m=+0.089528275 container create 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:25:45 np0005532585.localdomain systemd[1]: Started libpod-conmon-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799.scope.
Nov 23 08:25:45 np0005532585.localdomain podman[84392]: 2025-11-23 08:25:45.008451933 +0000 UTC m=+0.048747667 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 08:25:45 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:25:45 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/086324888e3aef5fa52615eb760fec66e6f8cc0743a0416438a6f968d544d4e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 08:25:45 np0005532585.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 23 08:25:45 np0005532585.localdomain podman[84392]: 2025-11-23 08:25:45.129681304 +0000 UTC m=+0.169976988 container init 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:25:45 np0005532585.localdomain podman[84392]: 2025-11-23 08:25:45.139884628 +0000 UTC m=+0.180180292 container start 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1)
Nov 23 08:25:45 np0005532585.localdomain sudo[84370]: pam_unix(sudo:session): session closed for user root
Nov 23 08:25:45 np0005532585.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 23 08:25:45 np0005532585.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 23 08:25:45 np0005532585.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 23 08:25:46 np0005532585.localdomain setroubleshoot[84409]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l a701375f-2314-41e7-b9ff-3bc5dfd0e157
Nov 23 08:25:46 np0005532585.localdomain setroubleshoot[84409]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                                
                                                                *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                                
                                                                If max_map_count is a virtualization target
                                                                Then you need to change the label on max_map_count'
                                                                Do
                                                                # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                                # restorecon -v 'max_map_count'
                                                                
                                                                *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                                
                                                                If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                                Then you should report this as a bug.
                                                                You can generate a local policy module to allow this access.
                                                                Do
                                                                allow this access for now by executing:
                                                                # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                                # semodule -X 300 -i my-qemukvm.pp
                                                                
Nov 23 08:25:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 08:25:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 08:25:55 np0005532585.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 23 08:25:56 np0005532585.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 23 08:26:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:26:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:26:01 np0005532585.localdomain systemd[1]: tmp-crun.NFmG9s.mount: Deactivated successfully.
Nov 23 08:26:01 np0005532585.localdomain podman[84475]: 2025-11-23 08:26:01.046982862 +0000 UTC m=+0.094931197 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, version=17.1.12, 
vcs-type=git, io.buildah.version=1.41.4, container_name=collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc.)
Nov 23 08:26:01 np0005532585.localdomain podman[84475]: 2025-11-23 08:26:01.06333273 +0000 UTC m=+0.111281145 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team)
Nov 23 08:26:01 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:26:01 np0005532585.localdomain podman[84476]: 2025-11-23 08:26:01.138794294 +0000 UTC m=+0.185916284 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z)
Nov 23 08:26:01 np0005532585.localdomain podman[84476]: 2025-11-23 08:26:01.179686085 +0000 UTC m=+0.226808135 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044)
Nov 23 08:26:01 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48596 [23/Nov/2025:08:26:01.431] listener listener/metadata 0/0/0/1122/1122 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48600 [23/Nov/2025:08:26:02.645] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48602 [23/Nov/2025:08:26:02.708] listener listener/metadata 0/0/0/10/10 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48606 [23/Nov/2025:08:26:02.760] listener listener/metadata 0/0/0/9/9 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48612 [23/Nov/2025:08:26:02.812] listener listener/metadata 0/0/0/8/8 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48620 [23/Nov/2025:08:26:02.865] listener listener/metadata 0/0/0/11/11 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48622 [23/Nov/2025:08:26:02.919] listener listener/metadata 0/0/0/10/10 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Nov 23 08:26:02 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48628 [23/Nov/2025:08:26:02.971] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48638 [23/Nov/2025:08:26:03.026] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48642 [23/Nov/2025:08:26:03.084] listener listener/metadata 0/0/0/11/11 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48658 [23/Nov/2025:08:26:03.142] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48664 [23/Nov/2025:08:26:03.185] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48680 [23/Nov/2025:08:26:03.229] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48692 [23/Nov/2025:08:26:03.268] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48704 [23/Nov/2025:08:26:03.320] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48714 [23/Nov/2025:08:26:03.377] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Nov 23 08:26:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:26:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:26:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:26:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: tmp-crun.4Bv17O.mount: Deactivated successfully.
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: tmp-crun.B2KBtL.mount: Deactivated successfully.
Nov 23 08:26:04 np0005532585.localdomain podman[84514]: 2025-11-23 08:26:04.018185452 +0000 UTC m=+0.068634601 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 23 08:26:04 np0005532585.localdomain podman[84512]: 2025-11-23 08:26:04.045293262 +0000 UTC m=+0.098065230 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Nov 23 08:26:04 np0005532585.localdomain podman[84514]: 2025-11-23 08:26:04.102324575 +0000 UTC m=+0.152773744 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_ipmi, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public)
Nov 23 08:26:04 np0005532585.localdomain podman[84515]: 2025-11-23 08:26:04.111067397 +0000 UTC m=+0.151761635 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_compute, release=1761123044)
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:26:04 np0005532585.localdomain podman[84512]: 2025-11-23 08:26:04.131435915 +0000 UTC m=+0.184207883 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond)
Nov 23 08:26:04 np0005532585.localdomain podman[84515]: 2025-11-23 08:26:04.133420434 +0000 UTC m=+0.174114682 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:26:04 np0005532585.localdomain podman[84513]: 2025-11-23 08:26:04.082925716 +0000 UTC m=+0.134213789 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com)
Nov 23 08:26:04 np0005532585.localdomain podman[84513]: 2025-11-23 08:26:04.213131244 +0000 UTC m=+0.264419357 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, 
container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:26:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:26:05 np0005532585.localdomain podman[84606]: 2025-11-23 08:26:05.012486638 +0000 UTC m=+0.073859656 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_migration_target, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 23 08:26:05 np0005532585.localdomain podman[84606]: 2025-11-23 08:26:05.408428064 +0000 UTC m=+0.469801042 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:26:05 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:26:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:26:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:26:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:26:07 np0005532585.localdomain systemd[1]: tmp-crun.m6kbau.mount: Deactivated successfully.
Nov 23 08:26:07 np0005532585.localdomain podman[84630]: 2025-11-23 08:26:07.009126712 +0000 UTC m=+0.070108175 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
url=https://www.redhat.com, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Nov 23 08:26:07 np0005532585.localdomain podman[84630]: 2025-11-23 08:26:07.053151707 +0000 UTC m=+0.114133150 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 23 08:26:07 np0005532585.localdomain systemd[1]: tmp-crun.6eHnDr.mount: Deactivated successfully.
Nov 23 08:26:07 np0005532585.localdomain podman[84631]: 2025-11-23 08:26:07.065114614 +0000 UTC m=+0.120001575 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller)
Nov 23 08:26:07 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:26:07 np0005532585.localdomain podman[84631]: 2025-11-23 08:26:07.11149826 +0000 UTC m=+0.166385031 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_controller)
Nov 23 08:26:07 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:26:07 np0005532585.localdomain podman[84629]: 2025-11-23 08:26:07.115442547 +0000 UTC m=+0.177588944 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:26:07 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 08:26:07 np0005532585.localdomain podman[84629]: 2025-11-23 08:26:07.304282467 +0000 UTC m=+0.366428864 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 23 08:26:07 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:26:25 np0005532585.localdomain sudo[84707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:26:25 np0005532585.localdomain sudo[84707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:26:25 np0005532585.localdomain sudo[84707]: pam_unix(sudo:session): session closed for user root
Nov 23 08:26:25 np0005532585.localdomain sudo[84722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:26:25 np0005532585.localdomain sudo[84722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:26:26 np0005532585.localdomain sudo[84722]: pam_unix(sudo:session): session closed for user root
Nov 23 08:26:26 np0005532585.localdomain sudo[84768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:26:26 np0005532585.localdomain sudo[84768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:26:26 np0005532585.localdomain sudo[84768]: pam_unix(sudo:session): session closed for user root
Nov 23 08:26:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:26:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:26:32 np0005532585.localdomain podman[84783]: 2025-11-23 08:26:32.043296088 +0000 UTC m=+0.061840007 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, 
com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-type=git, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:26:32 np0005532585.localdomain podman[84784]: 2025-11-23 08:26:32.100300182 +0000 UTC m=+0.117139610 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., container_name=iscsid, batch=17.1_20251118.1)
Nov 23 08:26:32 np0005532585.localdomain podman[84784]: 2025-11-23 08:26:32.112190026 +0000 UTC m=+0.129029464 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible)
Nov 23 08:26:32 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:26:32 np0005532585.localdomain podman[84783]: 2025-11-23 08:26:32.132177713 +0000 UTC m=+0.150721712 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-collectd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4)
Nov 23 08:26:32 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:26:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:26:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:26:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:26:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:26:35 np0005532585.localdomain podman[84825]: 2025-11-23 08:26:35.022602051 +0000 UTC m=+0.073892178 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:26:35 np0005532585.localdomain podman[84823]: 2025-11-23 08:26:35.003172481 +0000 UTC m=+0.061055635 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:26:35 np0005532585.localdomain podman[84824]: 2025-11-23 08:26:35.064255485 +0000 UTC m=+0.118101368 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 23 08:26:35 np0005532585.localdomain podman[84825]: 2025-11-23 08:26:35.067870053 +0000 UTC m=+0.119160220 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true)
Nov 23 08:26:35 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:26:35 np0005532585.localdomain podman[84831]: 2025-11-23 08:26:35.115789864 +0000 UTC m=+0.162855555 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, vcs-type=git, container_name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:26:35 np0005532585.localdomain podman[84824]: 2025-11-23 08:26:35.135361398 +0000 UTC m=+0.189207281 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:26:35 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:26:35 np0005532585.localdomain podman[84831]: 2025-11-23 08:26:35.167439727 +0000 UTC m=+0.214505398 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z)
Nov 23 08:26:35 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:26:35 np0005532585.localdomain podman[84823]: 2025-11-23 08:26:35.186477885 +0000 UTC m=+0.244361089 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible)
Nov 23 08:26:35 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:26:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:26:36 np0005532585.localdomain systemd[1]: tmp-crun.lUHERN.mount: Deactivated successfully.
Nov 23 08:26:36 np0005532585.localdomain podman[84924]: 2025-11-23 08:26:36.020217167 +0000 UTC m=+0.076771824 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, version=17.1.12, container_name=nova_migration_target, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4)
Nov 23 08:26:36 np0005532585.localdomain podman[84924]: 2025-11-23 08:26:36.41440729 +0000 UTC m=+0.470961937 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=)
Nov 23 08:26:36 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:26:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:26:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:26:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:26:38 np0005532585.localdomain systemd[1]: tmp-crun.rZVE0d.mount: Deactivated successfully.
Nov 23 08:26:38 np0005532585.localdomain podman[84947]: 2025-11-23 08:26:38.010627475 +0000 UTC m=+0.072355793 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 08:26:38 np0005532585.localdomain podman[84949]: 2025-11-23 08:26:38.019775928 +0000 UTC m=+0.071425725 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller)
Nov 23 08:26:38 np0005532585.localdomain podman[84949]: 2025-11-23 08:26:38.094627954 +0000 UTC m=+0.146277731 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, tcib_managed=true)
Nov 23 08:26:38 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:26:38 np0005532585.localdomain podman[84948]: 2025-11-23 08:26:38.104715124 +0000 UTC m=+0.163466153 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Nov 23 08:26:38 np0005532585.localdomain podman[84948]: 2025-11-23 08:26:38.137184964 +0000 UTC m=+0.195935933 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Nov 23 08:26:38 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:26:38 np0005532585.localdomain podman[84947]: 2025-11-23 08:26:38.211208245 +0000 UTC m=+0.272936633 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:26:38 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:26:46 np0005532585.localdomain sshd[85021]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:26:49 np0005532585.localdomain sshd[85021]: Connection closed by authenticating user nobody 183.237.166.2 port 55448 [preauth]
Nov 23 08:27:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:27:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:27:03 np0005532585.localdomain podman[85070]: 2025-11-23 08:27:03.023039818 +0000 UTC m=+0.075679418 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, version=17.1.12)
Nov 23 08:27:03 np0005532585.localdomain podman[85069]: 2025-11-23 08:27:03.090962858 +0000 UTC m=+0.143029780 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:27:03 np0005532585.localdomain podman[85069]: 2025-11-23 08:27:03.103508267 +0000 UTC m=+0.155575199 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:27:03 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:27:03 np0005532585.localdomain podman[85070]: 2025-11-23 08:27:03.158223737 +0000 UTC m=+0.210863337 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, container_name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:27:03 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:27:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:27:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:27:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:27:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:27:06 np0005532585.localdomain systemd[1]: tmp-crun.NkWxa4.mount: Deactivated successfully.
Nov 23 08:27:06 np0005532585.localdomain podman[85109]: 2025-11-23 08:27:06.035224782 +0000 UTC m=+0.090651294 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, url=https://www.redhat.com, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, vcs-type=git)
Nov 23 08:27:06 np0005532585.localdomain podman[85116]: 2025-11-23 08:27:06.046184011 +0000 UTC m=+0.091091408 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64)
Nov 23 08:27:06 np0005532585.localdomain podman[85116]: 2025-11-23 08:27:06.071348351 +0000 UTC m=+0.116255798 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=)
Nov 23 08:27:06 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:27:06 np0005532585.localdomain podman[85111]: 2025-11-23 08:27:06.088112904 +0000 UTC m=+0.136860244 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044)
Nov 23 08:27:06 np0005532585.localdomain podman[85110]: 2025-11-23 08:27:06.125164403 +0000 UTC m=+0.176971680 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:27:06 np0005532585.localdomain podman[85111]: 2025-11-23 08:27:06.140221762 +0000 UTC m=+0.188969112 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 23 08:27:06 np0005532585.localdomain podman[85109]: 2025-11-23 08:27:06.14863013 +0000 UTC m=+0.204056592 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 23 08:27:06 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:27:06 np0005532585.localdomain podman[85110]: 2025-11-23 08:27:06.156326805 +0000 UTC m=+0.208134052 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:27:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:27:06 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:27:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:27:07 np0005532585.localdomain podman[85200]: 2025-11-23 08:27:07.01829668 +0000 UTC m=+0.079839860 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:27:07 np0005532585.localdomain podman[85200]: 2025-11-23 08:27:07.387261525 +0000 UTC m=+0.448804605 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4)
Nov 23 08:27:07 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:27:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:27:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:27:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:27:09 np0005532585.localdomain podman[85222]: 2025-11-23 08:27:09.033163165 +0000 UTC m=+0.087453873 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:27:09 np0005532585.localdomain podman[85223]: 2025-11-23 08:27:09.078672653 +0000 UTC m=+0.129702017 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:27:09 np0005532585.localdomain podman[85223]: 2025-11-23 08:27:09.122215947 +0000 UTC m=+0.173245331 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 23 08:27:09 np0005532585.localdomain podman[85224]: 2025-11-23 08:27:09.132132003 +0000 UTC m=+0.180581855 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:27:09 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:27:09 np0005532585.localdomain podman[85224]: 2025-11-23 08:27:09.178157627 +0000 UTC m=+0.226607449 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Nov 23 08:27:09 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:27:09 np0005532585.localdomain podman[85222]: 2025-11-23 08:27:09.235221252 +0000 UTC m=+0.289511900 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team)
Nov 23 08:27:09 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:27:26 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:27:26 np0005532585.localdomain recover_tripleo_nova_virtqemud[85300]: 61756
Nov 23 08:27:26 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:27:26 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:27:27 np0005532585.localdomain sudo[85301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:27:27 np0005532585.localdomain sudo[85301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:27:27 np0005532585.localdomain sudo[85301]: pam_unix(sudo:session): session closed for user root
Nov 23 08:27:27 np0005532585.localdomain sudo[85316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:27:27 np0005532585.localdomain sudo[85316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:27:27 np0005532585.localdomain sudo[85316]: pam_unix(sudo:session): session closed for user root
Nov 23 08:27:28 np0005532585.localdomain sudo[85363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:27:28 np0005532585.localdomain sudo[85363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:27:28 np0005532585.localdomain sudo[85363]: pam_unix(sudo:session): session closed for user root
Nov 23 08:27:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:27:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:27:34 np0005532585.localdomain systemd[1]: tmp-crun.FD9ryO.mount: Deactivated successfully.
Nov 23 08:27:34 np0005532585.localdomain podman[85378]: 2025-11-23 08:27:34.044776944 +0000 UTC m=+0.096086138 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container)
Nov 23 08:27:34 np0005532585.localdomain podman[85379]: 2025-11-23 08:27:34.086171501 +0000 UTC m=+0.135420379 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, build-date=2025-11-18T23:44:13Z, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 23 08:27:34 np0005532585.localdomain podman[85379]: 2025-11-23 08:27:34.096878571 +0000 UTC m=+0.146127429 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Nov 23 08:27:34 np0005532585.localdomain podman[85378]: 2025-11-23 08:27:34.105853877 +0000 UTC m=+0.157163081 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container)
Nov 23 08:27:34 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:27:34 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:27:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:27:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:27:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:27:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:27:37 np0005532585.localdomain podman[85418]: 2025-11-23 08:27:37.041348971 +0000 UTC m=+0.081756871 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4)
Nov 23 08:27:37 np0005532585.localdomain podman[85418]: 2025-11-23 08:27:37.076593793 +0000 UTC m=+0.117001713 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4)
Nov 23 08:27:37 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:27:37 np0005532585.localdomain podman[85417]: 2025-11-23 08:27:37.100011217 +0000 UTC m=+0.145351834 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute)
Nov 23 08:27:37 np0005532585.localdomain podman[85417]: 2025-11-23 08:27:37.141752945 +0000 UTC m=+0.187093592 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:27:37 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:27:37 np0005532585.localdomain podman[85416]: 2025-11-23 08:27:37.160199672 +0000 UTC m=+0.208574866 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, release=1761123044)
Nov 23 08:27:37 np0005532585.localdomain podman[85416]: 2025-11-23 08:27:37.179155005 +0000 UTC m=+0.227530199 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, 
distribution-scope=public, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 23 08:27:37 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:27:37 np0005532585.localdomain podman[85422]: 2025-11-23 08:27:37.25132253 +0000 UTC m=+0.289610252 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:27:37 np0005532585.localdomain podman[85422]: 2025-11-23 08:27:37.284290658 +0000 UTC m=+0.322578400 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 23 08:27:37 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:27:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:27:38 np0005532585.localdomain podman[85515]: 2025-11-23 08:27:38.018557222 +0000 UTC m=+0.076476833 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, version=17.1.12, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:27:38 np0005532585.localdomain podman[85515]: 2025-11-23 08:27:38.417324776 +0000 UTC m=+0.475244327 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1)
Nov 23 08:27:38 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:27:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:27:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:27:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:27:40 np0005532585.localdomain systemd[1]: tmp-crun.kqyZSe.mount: Deactivated successfully.
Nov 23 08:27:40 np0005532585.localdomain podman[85540]: 2025-11-23 08:27:40.047594428 +0000 UTC m=+0.088195627 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:27:40 np0005532585.localdomain podman[85538]: 2025-11-23 08:27:40.085953967 +0000 UTC m=+0.132650359 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:27:40 np0005532585.localdomain podman[85540]: 2025-11-23 08:27:40.099291462 +0000 UTC m=+0.139892631 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12)
Nov 23 08:27:40 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:27:40 np0005532585.localdomain podman[85539]: 2025-11-23 08:27:40.143362184 +0000 UTC m=+0.184947014 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:27:40 np0005532585.localdomain podman[85539]: 2025-11-23 08:27:40.223229533 +0000 UTC m=+0.264814363 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044)
Nov 23 08:27:40 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:27:40 np0005532585.localdomain podman[85538]: 2025-11-23 08:27:40.300551593 +0000 UTC m=+0.347247965 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 23 08:27:40 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:27:59 np0005532585.localdomain sshd[85660]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:28:01 np0005532585.localdomain sshd[85660]: Connection closed by authenticating user root 80.94.95.116 port 47596 [preauth]
Nov 23 08:28:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:28:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:28:05 np0005532585.localdomain podman[85662]: 2025-11-23 08:28:05.025832278 +0000 UTC m=+0.084222269 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1)
Nov 23 08:28:05 np0005532585.localdomain podman[85662]: 2025-11-23 08:28:05.035452774 +0000 UTC m=+0.093842785 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:28:05 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:28:05 np0005532585.localdomain systemd[1]: tmp-crun.PSRpGX.mount: Deactivated successfully.
Nov 23 08:28:05 np0005532585.localdomain podman[85663]: 2025-11-23 08:28:05.141098024 +0000 UTC m=+0.197543143 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=)
Nov 23 08:28:05 np0005532585.localdomain podman[85663]: 2025-11-23 08:28:05.152259019 +0000 UTC m=+0.208704098 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container)
Nov 23 08:28:05 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:28:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:28:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:28:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:28:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:28:08 np0005532585.localdomain podman[85700]: 2025-11-23 08:28:08.030707751 +0000 UTC m=+0.080963617 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:28:08 np0005532585.localdomain podman[85700]: 2025-11-23 08:28:08.043424905 +0000 UTC m=+0.093680751 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12)
Nov 23 08:28:08 np0005532585.localdomain podman[85706]: 2025-11-23 08:28:08.090303546 +0000 UTC m=+0.132162704 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:28:08 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:28:08 np0005532585.localdomain podman[85706]: 2025-11-23 08:28:08.122176719 +0000 UTC m=+0.164035877 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 23 08:28:08 np0005532585.localdomain podman[85701]: 2025-11-23 08:28:08.073608085 +0000 UTC m=+0.126156764 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git)
Nov 23 08:28:08 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:28:08 np0005532585.localdomain podman[85702]: 2025-11-23 08:28:08.190053048 +0000 UTC m=+0.235516922 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 23 08:28:08 np0005532585.localdomain podman[85701]: 2025-11-23 08:28:08.209077573 +0000 UTC m=+0.261626252 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:28:08 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:28:08 np0005532585.localdomain podman[85702]: 2025-11-23 08:28:08.228209612 +0000 UTC m=+0.273673476 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4)
Nov 23 08:28:08 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:28:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:28:09 np0005532585.localdomain podman[85795]: 2025-11-23 08:28:09.027091091 +0000 UTC m=+0.084670434 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 23 08:28:09 np0005532585.localdomain podman[85795]: 2025-11-23 08:28:09.39334484 +0000 UTC m=+0.450924233 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, 
container_name=nova_migration_target, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Nov 23 08:28:09 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:28:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:28:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:28:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:28:11 np0005532585.localdomain systemd[1]: tmp-crun.ftwYuc.mount: Deactivated successfully.
Nov 23 08:28:11 np0005532585.localdomain podman[85819]: 2025-11-23 08:28:11.028133665 +0000 UTC m=+0.081973858 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1)
Nov 23 08:28:11 np0005532585.localdomain systemd[1]: tmp-crun.tyO4QI.mount: Deactivated successfully.
Nov 23 08:28:11 np0005532585.localdomain podman[85820]: 2025-11-23 08:28:11.079426247 +0000 UTC m=+0.128238070 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git)
Nov 23 08:28:11 np0005532585.localdomain podman[85818]: 2025-11-23 08:28:11.132063562 +0000 UTC m=+0.184948214 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:28:11 np0005532585.localdomain podman[85819]: 2025-11-23 08:28:11.153612026 +0000 UTC m=+0.207452339 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, release=1761123044, architecture=x86_64)
Nov 23 08:28:11 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:28:11 np0005532585.localdomain podman[85820]: 2025-11-23 08:28:11.184586641 +0000 UTC m=+0.233398524 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z)
Nov 23 08:28:11 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:28:11 np0005532585.localdomain podman[85818]: 2025-11-23 08:28:11.333341763 +0000 UTC m=+0.386226445 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:28:11 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:28:28 np0005532585.localdomain sudo[85895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:28:28 np0005532585.localdomain sudo[85895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:28:28 np0005532585.localdomain sudo[85895]: pam_unix(sudo:session): session closed for user root
Nov 23 08:28:28 np0005532585.localdomain sudo[85910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:28:28 np0005532585.localdomain sudo[85910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:28:29 np0005532585.localdomain sudo[85910]: pam_unix(sudo:session): session closed for user root
Nov 23 08:28:30 np0005532585.localdomain sudo[85956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:28:30 np0005532585.localdomain sudo[85956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:28:30 np0005532585.localdomain sudo[85956]: pam_unix(sudo:session): session closed for user root
Nov 23 08:28:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:28:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:28:36 np0005532585.localdomain systemd[1]: tmp-crun.57wHdX.mount: Deactivated successfully.
Nov 23 08:28:36 np0005532585.localdomain podman[85971]: 2025-11-23 08:28:36.052960686 +0000 UTC m=+0.103904865 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, 
distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:28:36 np0005532585.localdomain podman[85972]: 2025-11-23 08:28:36.093718853 +0000 UTC m=+0.142954078 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Nov 23 08:28:36 np0005532585.localdomain podman[85971]: 2025-11-23 08:28:36.115690162 +0000 UTC m=+0.166634391 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, container_name=collectd, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git)
Nov 23 08:28:36 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:28:36 np0005532585.localdomain podman[85972]: 2025-11-23 08:28:36.129474451 +0000 UTC m=+0.178709716 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, tcib_managed=true, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:28:36 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:28:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:28:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:28:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:28:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: tmp-crun.kcDgjR.mount: Deactivated successfully.
Nov 23 08:28:39 np0005532585.localdomain podman[86010]: 2025-11-23 08:28:39.021475492 +0000 UTC m=+0.078728345 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:28:39 np0005532585.localdomain podman[86015]: 2025-11-23 08:28:39.04120064 +0000 UTC m=+0.085984076 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public)
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: tmp-crun.i2TqBp.mount: Deactivated successfully.
Nov 23 08:28:39 np0005532585.localdomain podman[86018]: 2025-11-23 08:28:39.086405807 +0000 UTC m=+0.132220656 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 23 08:28:39 np0005532585.localdomain podman[86011]: 2025-11-23 08:28:39.099538255 +0000 UTC m=+0.147255105 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:28:39 np0005532585.localdomain podman[86018]: 2025-11-23 08:28:39.109184732 +0000 UTC m=+0.154999551 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:28:39 np0005532585.localdomain podman[86015]: 2025-11-23 08:28:39.128214737 +0000 UTC m=+0.172998193 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:12:45Z, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:28:39 np0005532585.localdomain podman[86011]: 2025-11-23 08:28:39.151226269 +0000 UTC m=+0.198943129 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Nov 23 08:28:39 np0005532585.localdomain podman[86010]: 2025-11-23 08:28:39.163360645 +0000 UTC m=+0.220613508 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, container_name=logrotate_crond, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:28:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:28:40 np0005532585.localdomain podman[86105]: 2025-11-23 08:28:40.031529508 +0000 UTC m=+0.087775173 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 23 08:28:40 np0005532585.localdomain podman[86105]: 2025-11-23 08:28:40.404332956 +0000 UTC m=+0.460578601 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 
nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 08:28:40 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:28:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:28:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:28:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:28:42 np0005532585.localdomain systemd[1]: tmp-crun.uzmJtn.mount: Deactivated successfully.
Nov 23 08:28:42 np0005532585.localdomain podman[86129]: 2025-11-23 08:28:42.027059618 +0000 UTC m=+0.080842573 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible)
Nov 23 08:28:42 np0005532585.localdomain podman[86128]: 2025-11-23 08:28:42.075359024 +0000 UTC m=+0.130552053 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, build-date=2025-11-18T22:49:46Z, 
name=rhosp17/openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:28:42 np0005532585.localdomain podman[86129]: 2025-11-23 08:28:42.095142673 +0000 UTC m=+0.148925638 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:28:42 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:28:42 np0005532585.localdomain podman[86130]: 2025-11-23 08:28:42.180686184 +0000 UTC m=+0.228574181 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 23 08:28:42 np0005532585.localdomain podman[86130]: 2025-11-23 08:28:42.195690921 +0000 UTC m=+0.243578888 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:28:42 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:28:42 np0005532585.localdomain podman[86128]: 2025-11-23 08:28:42.319969424 +0000 UTC m=+0.375162393 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:28:42 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:29:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:29:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:29:06 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:29:06 np0005532585.localdomain recover_tripleo_nova_virtqemud[86262]: 61756
Nov 23 08:29:06 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:29:06 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:29:07 np0005532585.localdomain systemd[1]: tmp-crun.Z8Ddif.mount: Deactivated successfully.
Nov 23 08:29:07 np0005532585.localdomain podman[86249]: 2025-11-23 08:29:07.02353872 +0000 UTC m=+0.079554222 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Nov 23 08:29:07 np0005532585.localdomain podman[86249]: 2025-11-23 08:29:07.036205082 +0000 UTC m=+0.092220564 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64)
Nov 23 08:29:07 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:29:07 np0005532585.localdomain podman[86250]: 2025-11-23 08:29:07.120331358 +0000 UTC m=+0.172491357 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.openshift.expose-services=)
Nov 23 08:29:07 np0005532585.localdomain podman[86250]: 2025-11-23 08:29:07.13013808 +0000 UTC m=+0.182298029 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=iscsid, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 
17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12)
Nov 23 08:29:07 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:29:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:29:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:29:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:29:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: tmp-crun.SW359e.mount: Deactivated successfully.
Nov 23 08:29:10 np0005532585.localdomain podman[86291]: 2025-11-23 08:29:10.04972829 +0000 UTC m=+0.104090982 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: tmp-crun.J5vcdh.mount: Deactivated successfully.
Nov 23 08:29:10 np0005532585.localdomain podman[86291]: 2025-11-23 08:29:10.134430454 +0000 UTC m=+0.188793186 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., distribution-scope=public, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 23 08:29:10 np0005532585.localdomain podman[86294]: 2025-11-23 08:29:10.083449522 +0000 UTC m=+0.129669785 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Nov 23 08:29:10 np0005532585.localdomain podman[86292]: 2025-11-23 08:29:10.145510786 +0000 UTC m=+0.197398009 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 
17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:29:10 np0005532585.localdomain podman[86293]: 2025-11-23 08:29:10.112782336 +0000 UTC m=+0.161429486 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 23 08:29:10 np0005532585.localdomain podman[86292]: 2025-11-23 08:29:10.182673539 +0000 UTC m=+0.234560772 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, version=17.1.12, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:29:10 np0005532585.localdomain podman[86293]: 2025-11-23 08:29:10.192572853 +0000 UTC m=+0.241219973 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4)
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:29:10 np0005532585.localdomain podman[86294]: 2025-11-23 08:29:10.265119631 +0000 UTC m=+0.311339974 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:29:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:29:11 np0005532585.localdomain podman[86386]: 2025-11-23 08:29:11.063291567 +0000 UTC m=+0.083974202 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z)
Nov 23 08:29:11 np0005532585.localdomain podman[86386]: 2025-11-23 08:29:11.434432181 +0000 UTC m=+0.455114836 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:29:11 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:29:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:29:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:29:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:29:13 np0005532585.localdomain podman[86409]: 2025-11-23 08:29:13.018441132 +0000 UTC m=+0.071495375 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:29:13 np0005532585.localdomain podman[86407]: 2025-11-23 08:29:13.070799268 +0000 UTC m=+0.127268440 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:29:13 np0005532585.localdomain podman[86409]: 2025-11-23 08:29:13.091185576 +0000 UTC m=+0.144239879 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 23 08:29:13 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:29:13 np0005532585.localdomain podman[86408]: 2025-11-23 08:29:13.167328047 +0000 UTC m=+0.220368280 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044)
Nov 23 08:29:13 np0005532585.localdomain podman[86408]: 2025-11-23 08:29:13.203154037 +0000 UTC m=+0.256194260 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:29:13 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:29:13 np0005532585.localdomain podman[86407]: 2025-11-23 08:29:13.284399311 +0000 UTC m=+0.340868513 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, 
vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:29:13 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:29:30 np0005532585.localdomain sudo[86481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:29:30 np0005532585.localdomain sudo[86481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:29:30 np0005532585.localdomain sudo[86481]: pam_unix(sudo:session): session closed for user root
Nov 23 08:29:30 np0005532585.localdomain sudo[86496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:29:30 np0005532585.localdomain sudo[86496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:29:31 np0005532585.localdomain sudo[86496]: pam_unix(sudo:session): session closed for user root
Nov 23 08:29:31 np0005532585.localdomain sudo[86543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:29:31 np0005532585.localdomain sudo[86543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:29:31 np0005532585.localdomain sudo[86543]: pam_unix(sudo:session): session closed for user root
Nov 23 08:29:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:29:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:29:38 np0005532585.localdomain podman[86558]: 2025-11-23 08:29:38.034753402 +0000 UTC m=+0.087890407 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Nov 23 08:29:38 np0005532585.localdomain podman[86558]: 2025-11-23 08:29:38.044989207 +0000 UTC m=+0.098126222 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, name=rhosp17/openstack-collectd, version=17.1.12, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, architecture=x86_64)
Nov 23 08:29:38 np0005532585.localdomain systemd[1]: tmp-crun.OcE2eg.mount: Deactivated successfully.
Nov 23 08:29:38 np0005532585.localdomain podman[86559]: 2025-11-23 08:29:38.087148408 +0000 UTC m=+0.138216327 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:29:38 np0005532585.localdomain podman[86559]: 2025-11-23 08:29:38.096257119 +0000 UTC m=+0.147324978 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc.)
Nov 23 08:29:38 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:29:38 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:29:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:29:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:29:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:29:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:29:41 np0005532585.localdomain systemd[1]: tmp-crun.ICvGG9.mount: Deactivated successfully.
Nov 23 08:29:41 np0005532585.localdomain podman[86599]: 2025-11-23 08:29:41.03224067 +0000 UTC m=+0.084135217 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 23 08:29:41 np0005532585.localdomain podman[86598]: 2025-11-23 08:29:41.075698902 +0000 UTC m=+0.129498810 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044)
Nov 23 08:29:41 np0005532585.localdomain podman[86599]: 2025-11-23 08:29:41.081218687 +0000 UTC m=+0.133113214 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, release=1761123044, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z)
Nov 23 08:29:41 np0005532585.localdomain podman[86598]: 2025-11-23 08:29:41.088994225 +0000 UTC m=+0.142794133 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=)
Nov 23 08:29:41 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:29:41 np0005532585.localdomain podman[86606]: 2025-11-23 08:29:41.048063333 +0000 UTC m=+0.090439227 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Nov 23 08:29:41 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:29:41 np0005532585.localdomain podman[86606]: 2025-11-23 08:29:41.132356924 +0000 UTC m=+0.174732858 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Nov 23 08:29:41 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:29:41 np0005532585.localdomain podman[86600]: 2025-11-23 08:29:41.188955425 +0000 UTC m=+0.237889838 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 23 08:29:41 np0005532585.localdomain podman[86600]: 2025-11-23 08:29:41.218397131 +0000 UTC m=+0.267331624 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 23 08:29:41 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:29:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:29:42 np0005532585.localdomain systemd[1]: tmp-crun.WK1jwp.mount: Deactivated successfully.
Nov 23 08:29:42 np0005532585.localdomain podman[86693]: 2025-11-23 08:29:42.024497209 +0000 UTC m=+0.082257126 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true)
Nov 23 08:29:42 np0005532585.localdomain podman[86693]: 2025-11-23 08:29:42.405645672 +0000 UTC m=+0.463405599 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com)
Nov 23 08:29:42 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:29:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:29:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:29:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:29:44 np0005532585.localdomain podman[86713]: 2025-11-23 08:29:44.033804337 +0000 UTC m=+0.087677059 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 23 08:29:44 np0005532585.localdomain systemd[1]: tmp-crun.BmG7sU.mount: Deactivated successfully.
Nov 23 08:29:44 np0005532585.localdomain podman[86714]: 2025-11-23 08:29:44.104167765 +0000 UTC m=+0.153770622 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=)
Nov 23 08:29:44 np0005532585.localdomain podman[86714]: 2025-11-23 08:29:44.149247278 +0000 UTC m=+0.198850005 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Nov 23 08:29:44 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:29:44 np0005532585.localdomain podman[86715]: 2025-11-23 08:29:44.203954109 +0000 UTC m=+0.252476781 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:29:44 np0005532585.localdomain podman[86713]: 2025-11-23 08:29:44.225232096 +0000 UTC m=+0.279104818 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:29:44 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:29:44 np0005532585.localdomain podman[86715]: 2025-11-23 08:29:44.279680477 +0000 UTC m=+0.328203189 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z)
Nov 23 08:29:44 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:29:45 np0005532585.localdomain systemd[1]: tmp-crun.o6A1LZ.mount: Deactivated successfully.
Nov 23 08:30:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:30:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:30:09 np0005532585.localdomain podman[86833]: 2025-11-23 08:30:09.024827212 +0000 UTC m=+0.080684887 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:30:09 np0005532585.localdomain podman[86833]: 2025-11-23 08:30:09.035195302 +0000 UTC m=+0.091052947 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, 
architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Nov 23 08:30:09 np0005532585.localdomain podman[86834]: 2025-11-23 08:30:08.992044899 +0000 UTC m=+0.052871291 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.12)
Nov 23 08:30:09 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:30:09 np0005532585.localdomain podman[86834]: 2025-11-23 08:30:09.074228804 +0000 UTC m=+0.135055196 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z)
Nov 23 08:30:09 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:30:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:30:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:30:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:30:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:30:12 np0005532585.localdomain systemd[1]: tmp-crun.RhdQvq.mount: Deactivated successfully.
Nov 23 08:30:12 np0005532585.localdomain podman[86873]: 2025-11-23 08:30:12.029830808 +0000 UTC m=+0.078400194 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
release=1761123044, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:30:12 np0005532585.localdomain podman[86883]: 2025-11-23 08:30:12.058030805 +0000 UTC m=+0.104936909 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:30:12 np0005532585.localdomain podman[86883]: 2025-11-23 08:30:12.078170766 +0000 UTC m=+0.125076880 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 23 08:30:12 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:30:12 np0005532585.localdomain podman[86871]: 2025-11-23 08:30:12.009640506 +0000 UTC m=+0.067299731 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, version=17.1.12, vcs-type=git)
Nov 23 08:30:12 np0005532585.localdomain podman[86873]: 2025-11-23 08:30:12.128642441 +0000 UTC m=+0.177211807 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Nov 23 08:30:12 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:30:12 np0005532585.localdomain podman[86871]: 2025-11-23 08:30:12.138277168 +0000 UTC m=+0.195936363 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-cron-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 23 08:30:12 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:30:12 np0005532585.localdomain podman[86872]: 2025-11-23 08:30:12.201372064 +0000 UTC m=+0.255790726 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, vcs-type=git)
Nov 23 08:30:12 np0005532585.localdomain podman[86872]: 2025-11-23 08:30:12.224170969 +0000 UTC m=+0.278589651 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:30:12 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:30:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:30:13 np0005532585.localdomain podman[86968]: 2025-11-23 08:30:13.017416509 +0000 UTC m=+0.078870430 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 23 08:30:13 np0005532585.localdomain podman[86968]: 2025-11-23 08:30:13.362358 +0000 UTC m=+0.423811911 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target)
Nov 23 08:30:13 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:30:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:30:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:30:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:30:15 np0005532585.localdomain systemd[1]: tmp-crun.4vfuL6.mount: Deactivated successfully.
Nov 23 08:30:15 np0005532585.localdomain systemd[1]: tmp-crun.U3xnrA.mount: Deactivated successfully.
Nov 23 08:30:15 np0005532585.localdomain podman[86989]: 2025-11-23 08:30:15.082205661 +0000 UTC m=+0.139583770 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12)
Nov 23 08:30:15 np0005532585.localdomain podman[86991]: 2025-11-23 08:30:15.129280648 +0000 UTC m=+0.180552433 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, release=1761123044, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ovn-controller, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:30:15 np0005532585.localdomain podman[86990]: 2025-11-23 08:30:15.047952841 +0000 UTC m=+0.102691636 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:30:15 np0005532585.localdomain podman[86991]: 2025-11-23 08:30:15.177362707 +0000 UTC m=+0.228634492 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, 
name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044)
Nov 23 08:30:15 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:30:15 np0005532585.localdomain podman[86990]: 2025-11-23 08:30:15.22929927 +0000 UTC m=+0.284038055 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:30:15 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:30:15 np0005532585.localdomain podman[86989]: 2025-11-23 08:30:15.28686064 +0000 UTC m=+0.344238669 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:30:15 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:30:32 np0005532585.localdomain sudo[87064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:30:32 np0005532585.localdomain sudo[87064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:30:32 np0005532585.localdomain sudo[87064]: pam_unix(sudo:session): session closed for user root
Nov 23 08:30:32 np0005532585.localdomain sudo[87079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:30:32 np0005532585.localdomain sudo[87079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:30:32 np0005532585.localdomain sudo[87079]: pam_unix(sudo:session): session closed for user root
Nov 23 08:30:33 np0005532585.localdomain sudo[87127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:30:33 np0005532585.localdomain sudo[87127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:30:33 np0005532585.localdomain sudo[87127]: pam_unix(sudo:session): session closed for user root
Nov 23 08:30:36 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:30:36 np0005532585.localdomain recover_tripleo_nova_virtqemud[87143]: 61756
Nov 23 08:30:36 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:30:36 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:30:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:30:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:30:40 np0005532585.localdomain systemd[1]: tmp-crun.4E3HNn.mount: Deactivated successfully.
Nov 23 08:30:40 np0005532585.localdomain podman[87145]: 2025-11-23 08:30:40.083287495 +0000 UTC m=+0.137964038 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, version=17.1.12, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:30:40 np0005532585.localdomain podman[87145]: 2025-11-23 08:30:40.097314942 +0000 UTC m=+0.151991555 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:30:40 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:30:40 np0005532585.localdomain podman[87144]: 2025-11-23 08:30:40.084349699 +0000 UTC m=+0.138648631 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git)
Nov 23 08:30:40 np0005532585.localdomain podman[87144]: 2025-11-23 08:30:40.164250681 +0000 UTC m=+0.218549653 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 08:30:40 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:30:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:30:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:30:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:30:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:30:43 np0005532585.localdomain systemd[84232]: Created slice User Background Tasks Slice.
Nov 23 08:30:43 np0005532585.localdomain systemd[84232]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 08:30:43 np0005532585.localdomain systemd[1]: tmp-crun.wBmeyO.mount: Deactivated successfully.
Nov 23 08:30:43 np0005532585.localdomain systemd[84232]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 08:30:43 np0005532585.localdomain podman[87183]: 2025-11-23 08:30:43.017131478 +0000 UTC m=+0.073467747 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
name=rhosp17/openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:30:43 np0005532585.localdomain podman[87185]: 2025-11-23 08:30:43.046870994 +0000 UTC m=+0.099250078 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, 
batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi)
Nov 23 08:30:43 np0005532585.localdomain podman[87183]: 2025-11-23 08:30:43.052216295 +0000 UTC m=+0.108552584 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, release=1761123044, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4)
Nov 23 08:30:43 np0005532585.localdomain podman[87184]: 2025-11-23 08:30:43.059163375 +0000 UTC m=+0.113460529 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Nov 23 08:30:43 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:30:43 np0005532585.localdomain podman[87185]: 2025-11-23 08:30:43.066571141 +0000 UTC m=+0.118950335 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 08:30:43 np0005532585.localdomain podman[87191]: 2025-11-23 08:30:43.026828597 +0000 UTC m=+0.072137636 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:30:43 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:30:43 np0005532585.localdomain podman[87184]: 2025-11-23 08:30:43.110153147 +0000 UTC m=+0.164450291 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute)
Nov 23 08:30:43 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:30:43 np0005532585.localdomain podman[87191]: 2025-11-23 08:30:43.160515738 +0000 UTC m=+0.205824787 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:30:43 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:30:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:30:44 np0005532585.localdomain systemd[1]: tmp-crun.qXnwkn.mount: Deactivated successfully.
Nov 23 08:30:44 np0005532585.localdomain podman[87278]: 2025-11-23 08:30:44.035627003 +0000 UTC m=+0.093331090 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:30:44 np0005532585.localdomain podman[87278]: 2025-11-23 08:30:44.429455528 +0000 UTC m=+0.487159605 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1)
Nov 23 08:30:44 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:30:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:30:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:30:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:30:46 np0005532585.localdomain podman[87302]: 2025-11-23 08:30:46.010159335 +0000 UTC m=+0.069492992 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 23 08:30:46 np0005532585.localdomain systemd[1]: tmp-crun.I7CIu0.mount: Deactivated successfully.
Nov 23 08:30:46 np0005532585.localdomain podman[87303]: 2025-11-23 08:30:46.06757929 +0000 UTC m=+0.124428028 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:30:46 np0005532585.localdomain podman[87301]: 2025-11-23 08:30:46.0817032 +0000 UTC m=+0.138518187 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044)
Nov 23 08:30:46 np0005532585.localdomain podman[87302]: 2025-11-23 08:30:46.089906731 +0000 UTC m=+0.149240408 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12)
Nov 23 08:30:46 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:30:46 np0005532585.localdomain podman[87303]: 2025-11-23 08:30:46.109346389 +0000 UTC m=+0.166195187 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, distribution-scope=public, container_name=ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.)
Nov 23 08:30:46 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:30:46 np0005532585.localdomain podman[87301]: 2025-11-23 08:30:46.251829291 +0000 UTC m=+0.308644358 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:30:46 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:31:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:31:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:31:11 np0005532585.localdomain podman[87419]: 2025-11-23 08:31:11.023794847 +0000 UTC m=+0.082745443 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 23 08:31:11 np0005532585.localdomain podman[87419]: 2025-11-23 08:31:11.063513515 +0000 UTC m=+0.122464111 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Nov 23 08:31:11 np0005532585.localdomain podman[87420]: 2025-11-23 08:31:11.078004481 +0000 UTC m=+0.134371914 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:31:11 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:31:11 np0005532585.localdomain podman[87420]: 2025-11-23 08:31:11.090388249 +0000 UTC m=+0.146755672 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, maintainer=OpenStack TripleO Team)
Nov 23 08:31:11 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:31:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:31:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:31:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:31:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: tmp-crun.lYiY8x.mount: Deactivated successfully.
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: tmp-crun.hP44WH.mount: Deactivated successfully.
Nov 23 08:31:14 np0005532585.localdomain podman[87459]: 2025-11-23 08:31:14.010248854 +0000 UTC m=+0.060543509 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, tcib_managed=true, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1)
Nov 23 08:31:14 np0005532585.localdomain podman[87460]: 2025-11-23 08:31:14.070400249 +0000 UTC m=+0.120325582 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:31:14 np0005532585.localdomain podman[87458]: 2025-11-23 08:31:14.038642758 +0000 UTC m=+0.085214903 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:31:14 np0005532585.localdomain podman[87458]: 2025-11-23 08:31:14.120265703 +0000 UTC m=+0.166837838 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, 
container_name=ceilometer_agent_compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:31:14 np0005532585.localdomain podman[87460]: 2025-11-23 08:31:14.141359641 +0000 UTC m=+0.191284984 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:31:14 np0005532585.localdomain podman[87457]: 2025-11-23 08:31:14.124084126 +0000 UTC m=+0.176487739 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron)
Nov 23 08:31:14 np0005532585.localdomain podman[87459]: 2025-11-23 08:31:14.195360709 +0000 UTC m=+0.245655454 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 23 08:31:14 np0005532585.localdomain podman[87457]: 2025-11-23 08:31:14.208302425 +0000 UTC m=+0.260706048 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, tcib_managed=true, version=17.1.12)
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:31:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:31:15 np0005532585.localdomain podman[87552]: 2025-11-23 08:31:15.00896817 +0000 UTC m=+0.070514980 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_migration_target, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 23 08:31:15 np0005532585.localdomain podman[87552]: 2025-11-23 08:31:15.385927746 +0000 UTC m=+0.447474616 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:31:15 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:31:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:31:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:31:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:31:17 np0005532585.localdomain podman[87575]: 2025-11-23 08:31:17.021675334 +0000 UTC m=+0.079132427 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 23 08:31:17 np0005532585.localdomain podman[87577]: 2025-11-23 08:31:17.047106022 +0000 UTC m=+0.096121324 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 23 08:31:17 np0005532585.localdomain podman[87577]: 2025-11-23 08:31:17.068228121 +0000 UTC m=+0.117243393 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=)
Nov 23 08:31:17 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:31:17 np0005532585.localdomain podman[87576]: 2025-11-23 08:31:17.134869395 +0000 UTC m=+0.186450679 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, release=1761123044, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:31:17 np0005532585.localdomain podman[87576]: 2025-11-23 08:31:17.179255193 +0000 UTC m=+0.230836497 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:31:17 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:31:17 np0005532585.localdomain podman[87575]: 2025-11-23 08:31:17.209161025 +0000 UTC m=+0.266618098 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z)
Nov 23 08:31:17 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:31:33 np0005532585.localdomain sudo[87651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:31:33 np0005532585.localdomain sudo[87651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:31:33 np0005532585.localdomain sudo[87651]: pam_unix(sudo:session): session closed for user root
Nov 23 08:31:33 np0005532585.localdomain sudo[87666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:31:33 np0005532585.localdomain sudo[87666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:31:34 np0005532585.localdomain sudo[87666]: pam_unix(sudo:session): session closed for user root
Nov 23 08:31:36 np0005532585.localdomain sudo[87715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:31:36 np0005532585.localdomain sudo[87715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:31:36 np0005532585.localdomain sudo[87715]: pam_unix(sudo:session): session closed for user root
Nov 23 08:31:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:31:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:31:42 np0005532585.localdomain podman[87730]: 2025-11-23 08:31:42.033918684 +0000 UTC m=+0.085987348 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:31:42 np0005532585.localdomain podman[87730]: 2025-11-23 08:31:42.048449971 +0000 UTC m=+0.100518635 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:31:42 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:31:42 np0005532585.localdomain systemd[1]: tmp-crun.0xRh27.mount: Deactivated successfully.
Nov 23 08:31:42 np0005532585.localdomain podman[87731]: 2025-11-23 08:31:42.165013871 +0000 UTC m=+0.215789233 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:31:42 np0005532585.localdomain podman[87731]: 2025-11-23 08:31:42.203290772 +0000 UTC m=+0.254066104 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:31:42 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:31:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:31:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:31:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:31:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:31:45 np0005532585.localdomain systemd[1]: tmp-crun.NWKL3x.mount: Deactivated successfully.
Nov 23 08:31:45 np0005532585.localdomain podman[87768]: 2025-11-23 08:31:45.009199391 +0000 UTC m=+0.062343876 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-cron, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:31:45 np0005532585.localdomain podman[87770]: 2025-11-23 08:31:45.05922843 +0000 UTC m=+0.107276301 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc.)
Nov 23 08:31:45 np0005532585.localdomain podman[87768]: 2025-11-23 08:31:45.092676736 +0000 UTC m=+0.145821271 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Nov 23 08:31:45 np0005532585.localdomain podman[87770]: 2025-11-23 08:31:45.100380374 +0000 UTC m=+0.148428165 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git)
Nov 23 08:31:45 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:31:45 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:31:45 np0005532585.localdomain podman[87776]: 2025-11-23 08:31:45.141869309 +0000 UTC m=+0.183298968 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:31:45 np0005532585.localdomain podman[87769]: 2025-11-23 08:31:45.168358791 +0000 UTC m=+0.219262284 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:31:45 np0005532585.localdomain podman[87769]: 2025-11-23 08:31:45.221243202 +0000 UTC m=+0.272146735 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Nov 23 08:31:45 np0005532585.localdomain podman[87776]: 2025-11-23 08:31:45.227339988 +0000 UTC m=+0.268769607 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, name=rhosp17/openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 23 08:31:45 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:31:45 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:31:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:31:46 np0005532585.localdomain systemd[1]: tmp-crun.dxy7YN.mount: Deactivated successfully.
Nov 23 08:31:46 np0005532585.localdomain podman[87869]: 2025-11-23 08:31:46.023325813 +0000 UTC m=+0.079398335 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:31:46 np0005532585.localdomain podman[87869]: 2025-11-23 08:31:46.39629879 +0000 UTC m=+0.452371282 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com)
Nov 23 08:31:46 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:31:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:31:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:31:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:31:48 np0005532585.localdomain systemd[1]: tmp-crun.YIp3G2.mount: Deactivated successfully.
Nov 23 08:31:48 np0005532585.localdomain podman[87894]: 2025-11-23 08:31:48.043794827 +0000 UTC m=+0.082290378 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 23 08:31:48 np0005532585.localdomain podman[87894]: 2025-11-23 08:31:48.092355589 +0000 UTC m=+0.130851200 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:31:48 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:31:48 np0005532585.localdomain podman[87892]: 2025-11-23 08:31:48.096162581 +0000 UTC m=+0.144656674 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, batch=17.1_20251118.1, container_name=metrics_qdr, 
version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:31:48 np0005532585.localdomain podman[87893]: 2025-11-23 08:31:48.155663885 +0000 UTC m=+0.199069494 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 23 08:31:48 np0005532585.localdomain podman[87893]: 2025-11-23 08:31:48.191046364 +0000 UTC m=+0.234451963 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:31:48 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:31:48 np0005532585.localdomain podman[87892]: 2025-11-23 08:31:48.267174312 +0000 UTC m=+0.315668355 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:31:48 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:32:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:32:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 398 writes, 1428 keys, 398 commit groups, 1.0 writes per commit group, ingest: 1.75 MB, 0.00 MB/s
                                                          Interval WAL: 398 writes, 167 syncs, 2.38 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:32:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:32:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 540 writes, 2288 keys, 540 commit groups, 1.0 writes per commit group, ingest: 2.77 MB, 0.00 MB/s
                                                          Interval WAL: 540 writes, 176 syncs, 3.07 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:32:06 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:32:06 np0005532585.localdomain recover_tripleo_nova_virtqemud[88011]: 61756
Nov 23 08:32:06 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:32:06 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:32:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:32:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:32:13 np0005532585.localdomain podman[88013]: 2025-11-23 08:32:13.026027834 +0000 UTC m=+0.075663875 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true)
Nov 23 08:32:13 np0005532585.localdomain podman[88013]: 2025-11-23 08:32:13.062503577 +0000 UTC m=+0.112139648 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4)
Nov 23 08:32:13 np0005532585.localdomain podman[88012]: 2025-11-23 08:32:13.090098425 +0000 UTC m=+0.139709036 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:32:13 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:32:13 np0005532585.localdomain podman[88012]: 2025-11-23 08:32:13.12848061 +0000 UTC m=+0.178091201 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, container_name=collectd, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible)
Nov 23 08:32:13 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:32:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:32:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:32:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:32:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:32:16 np0005532585.localdomain systemd[1]: tmp-crun.fEoeEy.mount: Deactivated successfully.
Nov 23 08:32:16 np0005532585.localdomain podman[88055]: 2025-11-23 08:32:16.032856106 +0000 UTC m=+0.078904600 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Nov 23 08:32:16 np0005532585.localdomain podman[88054]: 2025-11-23 08:32:16.078308738 +0000 UTC m=+0.128228756 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 23 08:32:16 np0005532585.localdomain podman[88055]: 2025-11-23 08:32:16.085292203 +0000 UTC m=+0.131340657 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:32:16 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:32:16 np0005532585.localdomain podman[88053]: 2025-11-23 08:32:16.139957432 +0000 UTC m=+0.190447918 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:32:16 np0005532585.localdomain podman[88054]: 2025-11-23 08:32:16.152187015 +0000 UTC m=+0.202107093 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 08:32:16 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:32:16 np0005532585.localdomain podman[88053]: 2025-11-23 08:32:16.178309685 +0000 UTC m=+0.228800161 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z)
Nov 23 08:32:16 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:32:16 np0005532585.localdomain podman[88052]: 2025-11-23 08:32:16.237994115 +0000 UTC m=+0.290118214 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:32:16 np0005532585.localdomain podman[88052]: 2025-11-23 08:32:16.275363677 +0000 UTC m=+0.327487776 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 23 08:32:16 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:32:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:32:16 np0005532585.localdomain podman[88152]: 2025-11-23 08:32:16.991468693 +0000 UTC m=+0.052872202 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:32:17 np0005532585.localdomain podman[88152]: 2025-11-23 08:32:17.386256802 +0000 UTC m=+0.447660271 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 23 08:32:17 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:32:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:32:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:32:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:32:19 np0005532585.localdomain podman[88175]: 2025-11-23 08:32:19.017260597 +0000 UTC m=+0.074502418 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.)
Nov 23 08:32:19 np0005532585.localdomain podman[88177]: 2025-11-23 08:32:19.063372239 +0000 UTC m=+0.112959984 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z)
Nov 23 08:32:19 np0005532585.localdomain systemd[1]: tmp-crun.15pflp.mount: Deactivated successfully.
Nov 23 08:32:19 np0005532585.localdomain podman[88176]: 2025-11-23 08:32:19.136548903 +0000 UTC m=+0.188391611 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true)
Nov 23 08:32:19 np0005532585.localdomain podman[88177]: 2025-11-23 08:32:19.158646034 +0000 UTC m=+0.208233809 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4)
Nov 23 08:32:19 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:32:19 np0005532585.localdomain podman[88176]: 2025-11-23 08:32:19.178328847 +0000 UTC m=+0.230171625 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, version=17.1.12)
Nov 23 08:32:19 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:32:19 np0005532585.localdomain podman[88175]: 2025-11-23 08:32:19.254428825 +0000 UTC m=+0.311670716 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:32:19 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:32:36 np0005532585.localdomain sudo[88247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:32:36 np0005532585.localdomain sudo[88247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:32:36 np0005532585.localdomain sudo[88247]: pam_unix(sudo:session): session closed for user root
Nov 23 08:32:36 np0005532585.localdomain sudo[88262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:32:36 np0005532585.localdomain sudo[88262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:32:37 np0005532585.localdomain sudo[88262]: pam_unix(sudo:session): session closed for user root
Nov 23 08:32:38 np0005532585.localdomain sudo[88309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:32:38 np0005532585.localdomain sudo[88309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:32:38 np0005532585.localdomain sudo[88309]: pam_unix(sudo:session): session closed for user root
Nov 23 08:32:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:32:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:32:44 np0005532585.localdomain systemd[1]: tmp-crun.AaS38S.mount: Deactivated successfully.
Nov 23 08:32:44 np0005532585.localdomain podman[88325]: 2025-11-23 08:32:44.021065743 +0000 UTC m=+0.080491600 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 23 08:32:44 np0005532585.localdomain podman[88325]: 2025-11-23 08:32:44.038365829 +0000 UTC m=+0.097791706 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid)
Nov 23 08:32:44 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:32:44 np0005532585.localdomain podman[88324]: 2025-11-23 08:32:44.040400455 +0000 UTC m=+0.097653112 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true)
Nov 23 08:32:44 np0005532585.localdomain podman[88324]: 2025-11-23 08:32:44.12479948 +0000 UTC m=+0.182052107 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Nov 23 08:32:44 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:32:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:32:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:32:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:32:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: tmp-crun.9nhUUL.mount: Deactivated successfully.
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: tmp-crun.IPBnPM.mount: Deactivated successfully.
Nov 23 08:32:47 np0005532585.localdomain podman[88365]: 2025-11-23 08:32:47.098164037 +0000 UTC m=+0.148674083 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:32:47 np0005532585.localdomain podman[88365]: 2025-11-23 08:32:47.127280333 +0000 UTC m=+0.177790379 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:32:47 np0005532585.localdomain podman[88367]: 2025-11-23 08:32:47.06004222 +0000 UTC m=+0.101485755 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, tcib_managed=true, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:32:47 np0005532585.localdomain podman[88364]: 2025-11-23 08:32:47.131563692 +0000 UTC m=+0.182752311 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.openshift.expose-services=, version=17.1.12, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z)
Nov 23 08:32:47 np0005532585.localdomain podman[88366]: 2025-11-23 08:32:47.190818967 +0000 UTC m=+0.238717639 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, vcs-type=git, 
batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:32:47 np0005532585.localdomain podman[88364]: 2025-11-23 08:32:47.210454609 +0000 UTC m=+0.261643258 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, release=1761123044, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:32:47 np0005532585.localdomain podman[88366]: 2025-11-23 08:32:47.239217984 +0000 UTC m=+0.287116646 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:32:47 np0005532585.localdomain podman[88367]: 2025-11-23 08:32:47.294363568 +0000 UTC m=+0.335807123 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:32:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:32:48 np0005532585.localdomain podman[88461]: 2025-11-23 08:32:48.019850615 +0000 UTC m=+0.077623988 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:32:48 np0005532585.localdomain podman[88461]: 2025-11-23 08:32:48.397041528 +0000 UTC m=+0.454814941 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 23 08:32:48 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:32:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:32:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:32:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:32:49 np0005532585.localdomain podman[88483]: 2025-11-23 08:32:49.993051668 +0000 UTC m=+0.055087203 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:32:50 np0005532585.localdomain systemd[1]: tmp-crun.X5V4nq.mount: Deactivated successfully.
Nov 23 08:32:50 np0005532585.localdomain podman[88484]: 2025-11-23 08:32:50.029976856 +0000 UTC m=+0.084098186 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:32:50 np0005532585.localdomain podman[88485]: 2025-11-23 08:32:50.090831264 +0000 UTC m=+0.139141348 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T23:34:05Z)
Nov 23 08:32:50 np0005532585.localdomain podman[88485]: 2025-11-23 08:32:50.114795605 +0000 UTC m=+0.163105779 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64)
Nov 23 08:32:50 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:32:50 np0005532585.localdomain podman[88484]: 2025-11-23 08:32:50.14201488 +0000 UTC m=+0.196136200 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:32:50 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:32:50 np0005532585.localdomain podman[88483]: 2025-11-23 08:32:50.207021721 +0000 UTC m=+0.269057246 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:32:50 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:32:50 np0005532585.localdomain systemd[1]: tmp-crun.vgEYd0.mount: Deactivated successfully.
Nov 23 08:33:08 np0005532585.localdomain sshd[88602]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:33:09 np0005532585.localdomain sshd[88602]: Invalid user blank from 97.70.129.101 port 51332
Nov 23 08:33:10 np0005532585.localdomain sshd[88602]: Connection closed by invalid user blank 97.70.129.101 port 51332 [preauth]
Nov 23 08:33:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:33:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:33:15 np0005532585.localdomain podman[88604]: 2025-11-23 08:33:15.032854667 +0000 UTC m=+0.087616419 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true)
Nov 23 08:33:15 np0005532585.localdomain podman[88604]: 2025-11-23 08:33:15.042018802 +0000 UTC m=+0.096780574 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, 
tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:33:15 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:33:15 np0005532585.localdomain systemd[1]: tmp-crun.qoVCcp.mount: Deactivated successfully.
Nov 23 08:33:15 np0005532585.localdomain podman[88605]: 2025-11-23 08:33:15.134080194 +0000 UTC m=+0.186350796 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://www.redhat.com, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true)
Nov 23 08:33:15 np0005532585.localdomain podman[88605]: 2025-11-23 08:33:15.167591911 +0000 UTC m=+0.219862533 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid)
Nov 23 08:33:15 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:33:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:33:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:33:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:33:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:33:18 np0005532585.localdomain podman[88644]: 2025-11-23 08:33:18.017914729 +0000 UTC m=+0.071920365 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, version=17.1.12)
Nov 23 08:33:18 np0005532585.localdomain systemd[1]: tmp-crun.O1jLOz.mount: Deactivated successfully.
Nov 23 08:33:18 np0005532585.localdomain podman[88644]: 2025-11-23 08:33:18.073107965 +0000 UTC m=+0.127113581 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, release=1761123044)
Nov 23 08:33:18 np0005532585.localdomain podman[88643]: 2025-11-23 08:33:18.080871614 +0000 UTC m=+0.136044707 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.)
Nov 23 08:33:18 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:33:18 np0005532585.localdomain podman[88643]: 2025-11-23 08:33:18.110196168 +0000 UTC m=+0.165369261 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:33:18 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:33:18 np0005532585.localdomain podman[88642]: 2025-11-23 08:33:18.129165908 +0000 UTC m=+0.186853632 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:33:18 np0005532585.localdomain podman[88642]: 2025-11-23 08:33:18.134754128 +0000 UTC m=+0.192441812 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4)
Nov 23 08:33:18 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:33:18 np0005532585.localdomain podman[88645]: 2025-11-23 08:33:18.188354112 +0000 UTC m=+0.238534894 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Nov 23 08:33:18 np0005532585.localdomain podman[88645]: 2025-11-23 08:33:18.215269867 +0000 UTC m=+0.265450649 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5)
Nov 23 08:33:18 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:33:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:33:19 np0005532585.localdomain podman[88743]: 2025-11-23 08:33:19.084233931 +0000 UTC m=+0.062605756 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp17/openstack-nova-compute)
Nov 23 08:33:19 np0005532585.localdomain podman[88743]: 2025-11-23 08:33:19.469948308 +0000 UTC m=+0.448320093 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Nov 23 08:33:19 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:33:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:33:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:33:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:33:21 np0005532585.localdomain systemd[1]: tmp-crun.wSjpeH.mount: Deactivated successfully.
Nov 23 08:33:21 np0005532585.localdomain podman[88766]: 2025-11-23 08:33:21.009027786 +0000 UTC m=+0.070283131 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public)
Nov 23 08:33:21 np0005532585.localdomain podman[88767]: 2025-11-23 08:33:21.020520906 +0000 UTC m=+0.076465171 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:33:21 np0005532585.localdomain podman[88768]: 2025-11-23 08:33:21.087246792 +0000 UTC m=+0.140013104 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 23 08:33:21 np0005532585.localdomain podman[88767]: 2025-11-23 08:33:21.10768501 +0000 UTC m=+0.163629305 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team)
Nov 23 08:33:21 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:33:21 np0005532585.localdomain podman[88768]: 2025-11-23 08:33:21.159435774 +0000 UTC m=+0.212202146 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:33:21 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:33:21 np0005532585.localdomain podman[88766]: 2025-11-23 08:33:21.186292378 +0000 UTC m=+0.247547743 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:33:21 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:33:38 np0005532585.localdomain sudo[88841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:33:38 np0005532585.localdomain sudo[88841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:33:38 np0005532585.localdomain sudo[88841]: pam_unix(sudo:session): session closed for user root
Nov 23 08:33:38 np0005532585.localdomain sudo[88856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 08:33:38 np0005532585.localdomain sudo[88856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:33:39 np0005532585.localdomain sudo[88856]: pam_unix(sudo:session): session closed for user root
Nov 23 08:33:39 np0005532585.localdomain sudo[88892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:33:39 np0005532585.localdomain sudo[88892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:33:39 np0005532585.localdomain sudo[88892]: pam_unix(sudo:session): session closed for user root
Nov 23 08:33:39 np0005532585.localdomain sudo[88907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:33:39 np0005532585.localdomain sudo[88907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:33:40 np0005532585.localdomain sudo[88907]: pam_unix(sudo:session): session closed for user root
Nov 23 08:33:40 np0005532585.localdomain sudo[88955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:33:40 np0005532585.localdomain sudo[88955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:33:40 np0005532585.localdomain sudo[88955]: pam_unix(sudo:session): session closed for user root
Nov 23 08:33:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:33:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:33:46 np0005532585.localdomain podman[88971]: 2025-11-23 08:33:46.033672358 +0000 UTC m=+0.077960109 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Nov 23 08:33:46 np0005532585.localdomain podman[88971]: 2025-11-23 08:33:46.069047235 +0000 UTC m=+0.113334966 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-iscsid-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public)
Nov 23 08:33:46 np0005532585.localdomain podman[88970]: 2025-11-23 08:33:46.086841948 +0000 UTC m=+0.134225189 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Nov 23 08:33:46 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:33:46 np0005532585.localdomain podman[88970]: 2025-11-23 08:33:46.094149012 +0000 UTC m=+0.141532253 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 23 08:33:46 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:33:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:33:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:33:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:33:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:33:49 np0005532585.localdomain systemd[1]: tmp-crun.vHSG9w.mount: Deactivated successfully.
Nov 23 08:33:49 np0005532585.localdomain podman[89017]: 2025-11-23 08:33:49.086734466 +0000 UTC m=+0.131643906 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, distribution-scope=public)
Nov 23 08:33:49 np0005532585.localdomain podman[89009]: 2025-11-23 08:33:49.053906941 +0000 UTC m=+0.111675574 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:33:49 np0005532585.localdomain podman[89009]: 2025-11-23 08:33:49.137271412 +0000 UTC m=+0.195040045 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:33:49 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:33:49 np0005532585.localdomain podman[89011]: 2025-11-23 08:33:49.138385238 +0000 UTC m=+0.188945740 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git)
Nov 23 08:33:49 np0005532585.localdomain podman[89010]: 2025-11-23 08:33:49.194938387 +0000 UTC m=+0.246604044 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vcs-type=git)
Nov 23 08:33:49 np0005532585.localdomain podman[89017]: 2025-11-23 08:33:49.213597168 +0000 UTC m=+0.258506608 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:33:49 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:33:49 np0005532585.localdomain podman[89010]: 2025-11-23 08:33:49.269723773 +0000 UTC m=+0.321389420 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044)
Nov 23 08:33:49 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:33:49 np0005532585.localdomain podman[89011]: 2025-11-23 08:33:49.320955581 +0000 UTC m=+0.371516123 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z)
Nov 23 08:33:49 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:33:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:33:50 np0005532585.localdomain podman[89104]: 2025-11-23 08:33:50.023149009 +0000 UTC m=+0.075606553 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:33:50 np0005532585.localdomain podman[89104]: 2025-11-23 08:33:50.407239064 +0000 UTC m=+0.459696568 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true)
Nov 23 08:33:50 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:33:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:33:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:33:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:33:52 np0005532585.localdomain systemd[1]: tmp-crun.KkkSE0.mount: Deactivated successfully.
Nov 23 08:33:52 np0005532585.localdomain podman[89127]: 2025-11-23 08:33:52.027010038 +0000 UTC m=+0.086855385 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible)
Nov 23 08:33:52 np0005532585.localdomain podman[89128]: 2025-11-23 08:33:52.037508616 +0000 UTC m=+0.092654422 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible)
Nov 23 08:33:52 np0005532585.localdomain podman[89129]: 2025-11-23 08:33:52.091042008 +0000 UTC m=+0.145248694 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:33:52 np0005532585.localdomain podman[89128]: 2025-11-23 08:33:52.102257268 +0000 UTC m=+0.157403074 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git)
Nov 23 08:33:52 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:33:52 np0005532585.localdomain podman[89129]: 2025-11-23 08:33:52.116368192 +0000 UTC m=+0.170574908 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4)
Nov 23 08:33:52 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:33:52 np0005532585.localdomain podman[89127]: 2025-11-23 08:33:52.237077745 +0000 UTC m=+0.296923062 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.)
Nov 23 08:33:52 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:33:58 np0005532585.localdomain sshd[89201]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:34:00 np0005532585.localdomain sshd[89201]: Received disconnect from 117.5.148.56 port 52376:11:  [preauth]
Nov 23 08:34:00 np0005532585.localdomain sshd[89201]: Disconnected from authenticating user root 117.5.148.56 port 52376 [preauth]
Nov 23 08:34:00 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:34:00 np0005532585.localdomain recover_tripleo_nova_virtqemud[89227]: 61756
Nov 23 08:34:00 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:34:00 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:34:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:34:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:34:17 np0005532585.localdomain systemd[1]: tmp-crun.xirYir.mount: Deactivated successfully.
Nov 23 08:34:17 np0005532585.localdomain podman[89228]: 2025-11-23 08:34:17.025454607 +0000 UTC m=+0.083836708 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:34:17 np0005532585.localdomain podman[89228]: 2025-11-23 08:34:17.059662467 +0000 UTC m=+0.118044508 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3)
Nov 23 08:34:17 np0005532585.localdomain podman[89229]: 2025-11-23 08:34:17.071056903 +0000 UTC m=+0.123680479 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:34:17 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:34:17 np0005532585.localdomain podman[89229]: 2025-11-23 08:34:17.103382623 +0000 UTC m=+0.156006259 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com)
Nov 23 08:34:17 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:34:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:34:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:34:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:34:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:34:20 np0005532585.localdomain systemd[1]: tmp-crun.X9pnuX.mount: Deactivated successfully.
Nov 23 08:34:20 np0005532585.localdomain podman[89268]: 2025-11-23 08:34:20.025341923 +0000 UTC m=+0.079302172 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:34:20 np0005532585.localdomain podman[89267]: 2025-11-23 08:34:20.067931064 +0000 UTC m=+0.121502520 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:34:20 np0005532585.localdomain podman[89268]: 2025-11-23 08:34:20.072064056 +0000 UTC m=+0.126024325 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public)
Nov 23 08:34:20 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:34:20 np0005532585.localdomain podman[89267]: 2025-11-23 08:34:20.104334405 +0000 UTC m=+0.157905881 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, 
distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container)
Nov 23 08:34:20 np0005532585.localdomain podman[89271]: 2025-11-23 08:34:20.110821353 +0000 UTC m=+0.153818769 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute)
Nov 23 08:34:20 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:34:20 np0005532585.localdomain podman[89271]: 2025-11-23 08:34:20.138261666 +0000 UTC m=+0.181259122 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:34:20 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:34:20 np0005532585.localdomain podman[89269]: 2025-11-23 08:34:20.151409688 +0000 UTC m=+0.198167295 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, 
distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:34:20 np0005532585.localdomain podman[89269]: 2025-11-23 08:34:20.178625314 +0000 UTC m=+0.225382911 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Nov 23 08:34:20 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:34:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:34:21 np0005532585.localdomain podman[89367]: 2025-11-23 08:34:21.025998162 +0000 UTC m=+0.082078132 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, release=1761123044)
Nov 23 08:34:21 np0005532585.localdomain podman[89367]: 2025-11-23 08:34:21.395826579 +0000 UTC m=+0.451906569 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target)
Nov 23 08:34:21 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:34:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:34:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:34:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:34:22 np0005532585.localdomain systemd[1]: tmp-crun.tAariD.mount: Deactivated successfully.
Nov 23 08:34:23 np0005532585.localdomain podman[89390]: 2025-11-23 08:34:22.996629192 +0000 UTC m=+0.058681458 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:34:23 np0005532585.localdomain podman[89389]: 2025-11-23 08:34:23.05067551 +0000 UTC m=+0.113713468 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:34:23 np0005532585.localdomain podman[89391]: 2025-11-23 08:34:23.02173125 +0000 UTC m=+0.080524612 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git)
Nov 23 08:34:23 np0005532585.localdomain podman[89390]: 2025-11-23 08:34:23.085365917 +0000 UTC m=+0.147418213 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 23 08:34:23 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:34:23 np0005532585.localdomain podman[89391]: 2025-11-23 08:34:23.106366472 +0000 UTC m=+0.165159894 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:34:23 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:34:23 np0005532585.localdomain podman[89389]: 2025-11-23 08:34:23.246541951 +0000 UTC m=+0.309579879 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:34:23 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:34:23 np0005532585.localdomain systemd[1]: tmp-crun.dFbzus.mount: Deactivated successfully.
Nov 23 08:34:39 np0005532585.localdomain sshd[89466]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:34:39 np0005532585.localdomain sshd[89466]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 08:34:39 np0005532585.localdomain sshd[89466]: Connection closed by 193.32.162.146 port 57440
Nov 23 08:34:41 np0005532585.localdomain sudo[89467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:34:41 np0005532585.localdomain sudo[89467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:34:41 np0005532585.localdomain sudo[89467]: pam_unix(sudo:session): session closed for user root
Nov 23 08:34:41 np0005532585.localdomain sudo[89482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:34:41 np0005532585.localdomain sudo[89482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:34:41 np0005532585.localdomain sudo[89482]: pam_unix(sudo:session): session closed for user root
Nov 23 08:34:42 np0005532585.localdomain sudo[89528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:34:42 np0005532585.localdomain sudo[89528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:34:42 np0005532585.localdomain sudo[89528]: pam_unix(sudo:session): session closed for user root
Nov 23 08:34:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:34:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:34:48 np0005532585.localdomain systemd[1]: tmp-crun.fPjKmo.mount: Deactivated successfully.
Nov 23 08:34:48 np0005532585.localdomain podman[89543]: 2025-11-23 08:34:48.051139024 +0000 UTC m=+0.104389350 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:34:48 np0005532585.localdomain podman[89544]: 2025-11-23 08:34:48.010115254 +0000 UTC m=+0.064569878 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:34:48 np0005532585.localdomain podman[89544]: 2025-11-23 08:34:48.094162067 +0000 UTC m=+0.148616661 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step3, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:34:48 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:34:48 np0005532585.localdomain podman[89543]: 2025-11-23 08:34:48.115216334 +0000 UTC m=+0.168466740 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044)
Nov 23 08:34:48 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:34:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:34:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:34:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:34:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:34:51 np0005532585.localdomain systemd[1]: tmp-crun.3LYz0X.mount: Deactivated successfully.
Nov 23 08:34:51 np0005532585.localdomain podman[89581]: 2025-11-23 08:34:51.103730317 +0000 UTC m=+0.145727348 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi)
Nov 23 08:34:51 np0005532585.localdomain podman[89580]: 2025-11-23 08:34:51.149045475 +0000 UTC m=+0.192530834 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:34:51 np0005532585.localdomain podman[89581]: 2025-11-23 08:34:51.159384518 +0000 UTC m=+0.201381559 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:34:51 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:34:51 np0005532585.localdomain podman[89579]: 2025-11-23 08:34:51.068556206 +0000 UTC m=+0.116679614 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=)
Nov 23 08:34:51 np0005532585.localdomain podman[89580]: 2025-11-23 08:34:51.186256383 +0000 UTC m=+0.229741742 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:11:48Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Nov 23 08:34:51 np0005532585.localdomain podman[89585]: 2025-11-23 08:34:51.199706595 +0000 UTC m=+0.235400293 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:34:51 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:34:51 np0005532585.localdomain podman[89585]: 2025-11-23 08:34:51.224495292 +0000 UTC m=+0.260189030 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step5)
Nov 23 08:34:51 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:34:51 np0005532585.localdomain podman[89579]: 2025-11-23 08:34:51.250129167 +0000 UTC m=+0.298252495 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Nov 23 08:34:51 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:34:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:34:52 np0005532585.localdomain podman[89675]: 2025-11-23 08:34:52.023802174 +0000 UTC m=+0.080484760 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, io.buildah.version=1.41.4)
Nov 23 08:34:52 np0005532585.localdomain systemd[1]: tmp-crun.45Ly8Y.mount: Deactivated successfully.
Nov 23 08:34:52 np0005532585.localdomain podman[89675]: 2025-11-23 08:34:52.418709397 +0000 UTC m=+0.475391973 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 23 08:34:52 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:34:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:34:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:34:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:34:54 np0005532585.localdomain podman[89699]: 2025-11-23 08:34:54.037830011 +0000 UTC m=+0.094688187 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 23 08:34:54 np0005532585.localdomain podman[89701]: 2025-11-23 08:34:54.07636993 +0000 UTC m=+0.128745323 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:34:54 np0005532585.localdomain podman[89700]: 2025-11-23 08:34:54.130551053 +0000 UTC m=+0.184570518 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z)
Nov 23 08:34:54 np0005532585.localdomain podman[89701]: 2025-11-23 08:34:54.15594282 +0000 UTC m=+0.208318213 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Nov 23 08:34:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:34:54 np0005532585.localdomain podman[89700]: 2025-11-23 08:34:54.195264605 +0000 UTC m=+0.249284030 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:34:54 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:34:54 np0005532585.localdomain podman[89699]: 2025-11-23 08:34:54.264348477 +0000 UTC m=+0.321206623 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=)
Nov 23 08:34:54 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:35:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:35:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:35:18 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:35:18 np0005532585.localdomain recover_tripleo_nova_virtqemud[89802]: 61756
Nov 23 08:35:18 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:35:18 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:35:19 np0005532585.localdomain systemd[1]: tmp-crun.OjCDGM.mount: Deactivated successfully.
Nov 23 08:35:19 np0005532585.localdomain podman[89795]: 2025-11-23 08:35:19.043158492 +0000 UTC m=+0.098098170 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:35:19 np0005532585.localdomain podman[89794]: 2025-11-23 08:35:19.0861396 +0000 UTC m=+0.143645206 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:35:19 np0005532585.localdomain podman[89795]: 2025-11-23 08:35:19.110342613 +0000 UTC m=+0.165282241 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, config_id=tripleo_step3)
Nov 23 08:35:19 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:35:19 np0005532585.localdomain podman[89794]: 2025-11-23 08:35:19.120989739 +0000 UTC m=+0.178495315 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., distribution-scope=public, 
build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:35:19 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:35:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:35:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:35:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:35:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:35:22 np0005532585.localdomain podman[89836]: 2025-11-23 08:35:22.026842581 +0000 UTC m=+0.081354446 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 23 08:35:22 np0005532585.localdomain podman[89836]: 2025-11-23 08:35:22.040306214 +0000 UTC m=+0.094818119 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true)
Nov 23 08:35:22 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:35:22 np0005532585.localdomain podman[89843]: 2025-11-23 08:35:22.099227181 +0000 UTC m=+0.144213874 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_id=tripleo_step5, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:35:22 np0005532585.localdomain podman[89838]: 2025-11-23 08:35:22.136968468 +0000 UTC m=+0.186013425 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:35:22 np0005532585.localdomain podman[89837]: 2025-11-23 08:35:22.182972409 +0000 UTC m=+0.232270194 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, tcib_managed=true)
Nov 23 08:35:22 np0005532585.localdomain podman[89838]: 2025-11-23 08:35:22.19440682 +0000 UTC m=+0.243451797 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Nov 23 08:35:22 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:35:22 np0005532585.localdomain podman[89843]: 2025-11-23 08:35:22.213597608 +0000 UTC m=+0.258584301 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:35:22 np0005532585.localdomain podman[89837]: 2025-11-23 08:35:22.236068797 +0000 UTC m=+0.285366522 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, tcib_managed=true, architecture=x86_64)
Nov 23 08:35:22 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:35:22 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:35:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:35:23 np0005532585.localdomain podman[89935]: 2025-11-23 08:35:23.001734801 +0000 UTC m=+0.060564449 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z)
Nov 23 08:35:23 np0005532585.localdomain podman[89935]: 2025-11-23 08:35:23.35742784 +0000 UTC m=+0.416257518 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 23 08:35:23 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:35:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:35:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:35:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:35:25 np0005532585.localdomain systemd[1]: tmp-crun.tQGMWw.mount: Deactivated successfully.
Nov 23 08:35:25 np0005532585.localdomain podman[89958]: 2025-11-23 08:35:25.026950284 +0000 UTC m=+0.085770352 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 23 08:35:25 np0005532585.localdomain podman[89960]: 2025-11-23 08:35:25.064300349 +0000 UTC m=+0.115924886 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:35:25 np0005532585.localdomain podman[89960]: 2025-11-23 08:35:25.109725672 +0000 UTC m=+0.161350219 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, 
io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 23 08:35:25 np0005532585.localdomain podman[89959]: 2025-11-23 08:35:25.117018196 +0000 UTC m=+0.172341297 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:35:25 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:35:25 np0005532585.localdomain podman[89959]: 2025-11-23 08:35:25.149160512 +0000 UTC m=+0.204483623 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:35:25 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:35:25 np0005532585.localdomain podman[89958]: 2025-11-23 08:35:25.211333979 +0000 UTC m=+0.270154037 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 23 08:35:25 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:35:42 np0005532585.localdomain sudo[90030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:35:42 np0005532585.localdomain sudo[90030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:35:42 np0005532585.localdomain sudo[90030]: pam_unix(sudo:session): session closed for user root
Nov 23 08:35:42 np0005532585.localdomain sudo[90045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 08:35:42 np0005532585.localdomain sudo[90045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:35:43 np0005532585.localdomain systemd[1]: tmp-crun.MnPHTp.mount: Deactivated successfully.
Nov 23 08:35:43 np0005532585.localdomain podman[90134]: 2025-11-23 08:35:43.456082132 +0000 UTC m=+0.061833888 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, name=rhceph, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=)
Nov 23 08:35:43 np0005532585.localdomain podman[90134]: 2025-11-23 08:35:43.544392901 +0000 UTC m=+0.150144707 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12)
Nov 23 08:35:43 np0005532585.localdomain sudo[90045]: pam_unix(sudo:session): session closed for user root
Nov 23 08:35:43 np0005532585.localdomain sudo[90200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:35:43 np0005532585.localdomain sudo[90200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:35:43 np0005532585.localdomain sudo[90200]: pam_unix(sudo:session): session closed for user root
Nov 23 08:35:43 np0005532585.localdomain sudo[90215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:35:43 np0005532585.localdomain sudo[90215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:35:44 np0005532585.localdomain sudo[90215]: pam_unix(sudo:session): session closed for user root
Nov 23 08:35:45 np0005532585.localdomain sudo[90263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:35:45 np0005532585.localdomain sudo[90263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:35:45 np0005532585.localdomain sudo[90263]: pam_unix(sudo:session): session closed for user root
Nov 23 08:35:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:35:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:35:50 np0005532585.localdomain podman[90279]: 2025-11-23 08:35:50.017851284 +0000 UTC m=+0.073893868 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12)
Nov 23 08:35:50 np0005532585.localdomain podman[90279]: 2025-11-23 08:35:50.033363539 +0000 UTC m=+0.089406233 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:35:50 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:35:50 np0005532585.localdomain systemd[1]: tmp-crun.vIACbQ.mount: Deactivated successfully.
Nov 23 08:35:50 np0005532585.localdomain podman[90278]: 2025-11-23 08:35:50.126999292 +0000 UTC m=+0.184216961 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:35:50 np0005532585.localdomain podman[90278]: 2025-11-23 08:35:50.135692368 +0000 UTC m=+0.192910037 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-collectd)
Nov 23 08:35:50 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:35:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:35:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:35:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:35:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:35:53 np0005532585.localdomain podman[90316]: 2025-11-23 08:35:53.042777819 +0000 UTC m=+0.096035566 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Nov 23 08:35:53 np0005532585.localdomain podman[90316]: 2025-11-23 08:35:53.052276771 +0000 UTC m=+0.105534558 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack 
Platform 17.1 cron, release=1761123044, architecture=x86_64, container_name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z)
Nov 23 08:35:53 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:35:53 np0005532585.localdomain podman[90318]: 2025-11-23 08:35:53.09397641 +0000 UTC m=+0.140867612 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:35:53 np0005532585.localdomain podman[90318]: 2025-11-23 08:35:53.123474375 +0000 UTC m=+0.170365527 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi)
Nov 23 08:35:53 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:35:53 np0005532585.localdomain podman[90317]: 2025-11-23 08:35:53.12919101 +0000 UTC m=+0.179020882 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, config_id=tripleo_step4)
Nov 23 08:35:53 np0005532585.localdomain podman[90319]: 2025-11-23 08:35:53.189117468 +0000 UTC m=+0.232685247 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step5, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public)
Nov 23 08:35:53 np0005532585.localdomain podman[90317]: 2025-11-23 08:35:53.213357361 +0000 UTC m=+0.263187293 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:35:53 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:35:53 np0005532585.localdomain podman[90319]: 2025-11-23 08:35:53.245377444 +0000 UTC m=+0.288945253 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Nov 23 08:35:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:35:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:35:54 np0005532585.localdomain podman[90415]: 2025-11-23 08:35:54.021064504 +0000 UTC m=+0.078589941 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, 
release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, vcs-type=git)
Nov 23 08:35:54 np0005532585.localdomain podman[90415]: 2025-11-23 08:35:54.413532431 +0000 UTC m=+0.471057858 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, distribution-scope=public, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4)
Nov 23 08:35:54 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:35:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:35:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:35:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:35:56 np0005532585.localdomain systemd[1]: tmp-crun.YiNM0b.mount: Deactivated successfully.
Nov 23 08:35:56 np0005532585.localdomain podman[90438]: 2025-11-23 08:35:56.016940038 +0000 UTC m=+0.078585292 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, container_name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:35:56 np0005532585.localdomain podman[90445]: 2025-11-23 08:35:56.078226547 +0000 UTC m=+0.129707568 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, 
build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:35:56 np0005532585.localdomain podman[90439]: 2025-11-23 08:35:56.044147802 +0000 UTC m=+0.097457610 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 23 08:35:56 np0005532585.localdomain podman[90445]: 2025-11-23 08:35:56.096301191 +0000 UTC m=+0.147782252 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.4, distribution-scope=public)
Nov 23 08:35:56 np0005532585.localdomain podman[90439]: 2025-11-23 08:35:56.133249605 +0000 UTC m=+0.186559393 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 23 08:35:56 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:35:56 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:35:56 np0005532585.localdomain podman[90438]: 2025-11-23 08:35:56.1953507 +0000 UTC m=+0.256995964 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:35:56 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:35:57 np0005532585.localdomain systemd[1]: tmp-crun.Fyex4t.mount: Deactivated successfully.
Nov 23 08:36:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:36:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:36:21 np0005532585.localdomain systemd[1]: tmp-crun.iPLPhF.mount: Deactivated successfully.
Nov 23 08:36:21 np0005532585.localdomain podman[90533]: 2025-11-23 08:36:21.037063384 +0000 UTC m=+0.092583120 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3)
Nov 23 08:36:21 np0005532585.localdomain systemd[1]: tmp-crun.gR5Ff3.mount: Deactivated successfully.
Nov 23 08:36:21 np0005532585.localdomain podman[90534]: 2025-11-23 08:36:21.076217575 +0000 UTC m=+0.128767920 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4)
Nov 23 08:36:21 np0005532585.localdomain podman[90533]: 2025-11-23 08:36:21.099617333 +0000 UTC m=+0.155137079 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 08:36:21 np0005532585.localdomain podman[90534]: 2025-11-23 08:36:21.113405956 +0000 UTC m=+0.165956302 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3)
Nov 23 08:36:21 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:36:21 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:36:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:36:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:36:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:36:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:36:24 np0005532585.localdomain systemd[1]: tmp-crun.83HA05.mount: Deactivated successfully.
Nov 23 08:36:24 np0005532585.localdomain podman[90577]: 2025-11-23 08:36:24.046401793 +0000 UTC m=+0.083629586 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=nova_compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true)
Nov 23 08:36:24 np0005532585.localdomain podman[90577]: 2025-11-23 08:36:24.101148301 +0000 UTC m=+0.138376054 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:36:24 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:36:24 np0005532585.localdomain podman[90575]: 2025-11-23 08:36:24.15164557 +0000 UTC m=+0.193700402 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:36:24 np0005532585.localdomain podman[90575]: 2025-11-23 08:36:24.181352722 +0000 UTC m=+0.223407574 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:36:24 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:36:24 np0005532585.localdomain podman[90576]: 2025-11-23 08:36:24.197910349 +0000 UTC m=+0.239524567 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64)
Nov 23 08:36:24 np0005532585.localdomain podman[90574]: 2025-11-23 08:36:24.104529865 +0000 UTC m=+0.150687732 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, 
config_id=tripleo_step4, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:36:24 np0005532585.localdomain podman[90576]: 2025-11-23 08:36:24.231207631 +0000 UTC m=+0.272821839 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, 
url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:36:24 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:36:24 np0005532585.localdomain podman[90574]: 2025-11-23 08:36:24.241999901 +0000 UTC m=+0.288157858 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-type=git, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:36:24 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:36:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:36:25 np0005532585.localdomain podman[90670]: 2025-11-23 08:36:25.016677631 +0000 UTC m=+0.078452097 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, 
vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 08:36:25 np0005532585.localdomain systemd[1]: tmp-crun.Dbhz9h.mount: Deactivated successfully.
Nov 23 08:36:25 np0005532585.localdomain podman[90670]: 2025-11-23 08:36:25.415475102 +0000 UTC m=+0.477249608 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:36:25 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:36:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:36:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:36:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:36:26 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:36:26 np0005532585.localdomain recover_tripleo_nova_virtqemud[90708]: 61756
Nov 23 08:36:26 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:36:26 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:36:27 np0005532585.localdomain systemd[1]: tmp-crun.EkF1r9.mount: Deactivated successfully.
Nov 23 08:36:27 np0005532585.localdomain podman[90693]: 2025-11-23 08:36:27.037700026 +0000 UTC m=+0.096249863 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:36:27 np0005532585.localdomain podman[90695]: 2025-11-23 08:36:27.088398521 +0000 UTC m=+0.139339234 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 23 08:36:27 np0005532585.localdomain podman[90695]: 2025-11-23 08:36:27.131667808 +0000 UTC m=+0.182608521 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com)
Nov 23 08:36:27 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:36:27 np0005532585.localdomain podman[90694]: 2025-11-23 08:36:27.16723522 +0000 UTC m=+0.223547738 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 23 08:36:27 np0005532585.localdomain podman[90694]: 2025-11-23 08:36:27.200571751 +0000 UTC m=+0.256884299 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64)
Nov 23 08:36:27 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:36:27 np0005532585.localdomain podman[90693]: 2025-11-23 08:36:27.23313331 +0000 UTC m=+0.291683117 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12)
Nov 23 08:36:27 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:36:28 np0005532585.localdomain systemd[1]: tmp-crun.5PRTtI.mount: Deactivated successfully.
Nov 23 08:36:45 np0005532585.localdomain sudo[90771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:36:45 np0005532585.localdomain sudo[90771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:36:45 np0005532585.localdomain sudo[90771]: pam_unix(sudo:session): session closed for user root
Nov 23 08:36:45 np0005532585.localdomain sudo[90786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:36:45 np0005532585.localdomain sudo[90786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:36:46 np0005532585.localdomain sudo[90786]: pam_unix(sudo:session): session closed for user root
Nov 23 08:36:47 np0005532585.localdomain sudo[90832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:36:47 np0005532585.localdomain sudo[90832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:36:47 np0005532585.localdomain sudo[90832]: pam_unix(sudo:session): session closed for user root
Nov 23 08:36:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:36:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:36:52 np0005532585.localdomain systemd[1]: tmp-crun.1djAn2.mount: Deactivated successfully.
Nov 23 08:36:52 np0005532585.localdomain podman[90848]: 2025-11-23 08:36:52.058289645 +0000 UTC m=+0.116865665 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, 
Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.12)
Nov 23 08:36:52 np0005532585.localdomain podman[90849]: 2025-11-23 08:36:52.103286465 +0000 UTC m=+0.159044819 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, release=1761123044)
Nov 23 08:36:52 np0005532585.localdomain podman[90848]: 2025-11-23 08:36:52.118213693 +0000 UTC m=+0.176789703 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, architecture=x86_64, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Nov 23 08:36:52 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:36:52 np0005532585.localdomain podman[90849]: 2025-11-23 08:36:52.137863136 +0000 UTC m=+0.193621450 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true)
Nov 23 08:36:52 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:36:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:36:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:36:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:36:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: tmp-crun.8zCyb8.mount: Deactivated successfully.
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: tmp-crun.sKvl5s.mount: Deactivated successfully.
Nov 23 08:36:55 np0005532585.localdomain podman[90899]: 2025-11-23 08:36:55.051602601 +0000 UTC m=+0.098554274 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com)
Nov 23 08:36:55 np0005532585.localdomain podman[90888]: 2025-11-23 08:36:55.083710676 +0000 UTC m=+0.140524361 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Nov 23 08:36:55 np0005532585.localdomain podman[90899]: 2025-11-23 08:36:55.100157811 +0000 UTC m=+0.147109504 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:36:55 np0005532585.localdomain podman[90890]: 2025-11-23 08:36:55.148456862 +0000 UTC m=+0.199273963 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 23 08:36:55 np0005532585.localdomain podman[90889]: 2025-11-23 08:36:55.018252858 +0000 UTC m=+0.075717074 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:36:55 np0005532585.localdomain podman[90890]: 2025-11-23 08:36:55.174181371 +0000 UTC m=+0.224998482 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:36:55 np0005532585.localdomain podman[90889]: 2025-11-23 08:36:55.198563639 +0000 UTC m=+0.256027855 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:36:55 np0005532585.localdomain podman[90888]: 2025-11-23 08:36:55.221073529 +0000 UTC m=+0.277887204 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:36:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:36:56 np0005532585.localdomain systemd[1]: tmp-crun.P6RBAC.mount: Deactivated successfully.
Nov 23 08:36:56 np0005532585.localdomain podman[90988]: 2025-11-23 08:36:56.030609598 +0000 UTC m=+0.085230085 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true)
Nov 23 08:36:56 np0005532585.localdomain podman[90988]: 2025-11-23 08:36:56.407163647 +0000 UTC m=+0.461784074 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 23 08:36:56 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:36:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:36:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:36:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:36:58 np0005532585.localdomain systemd[1]: tmp-crun.wpbGj5.mount: Deactivated successfully.
Nov 23 08:36:58 np0005532585.localdomain podman[91015]: 2025-11-23 08:36:58.034409795 +0000 UTC m=+0.089611940 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:36:58 np0005532585.localdomain podman[91016]: 2025-11-23 08:36:58.00915152 +0000 UTC m=+0.064689025 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.openshift.expose-services=, architecture=x86_64)
Nov 23 08:36:58 np0005532585.localdomain podman[91014]: 2025-11-23 08:36:58.069800291 +0000 UTC m=+0.129113921 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Nov 23 08:36:58 np0005532585.localdomain podman[91016]: 2025-11-23 08:36:58.095287352 +0000 UTC m=+0.150824867 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z)
Nov 23 08:36:58 np0005532585.localdomain podman[91015]: 2025-11-23 08:36:58.103273617 +0000 UTC m=+0.158475642 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 23 08:36:58 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:36:58 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:36:58 np0005532585.localdomain podman[91014]: 2025-11-23 08:36:58.243265691 +0000 UTC m=+0.302579291 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:36:58 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:37:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:37:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:37:23 np0005532585.localdomain podman[91088]: 2025-11-23 08:37:23.033753231 +0000 UTC m=+0.085431182 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, 
build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:37:23 np0005532585.localdomain podman[91088]: 2025-11-23 08:37:23.046198892 +0000 UTC m=+0.097876863 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:37:23 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:37:23 np0005532585.localdomain systemd[1]: tmp-crun.eSETIk.mount: Deactivated successfully.
Nov 23 08:37:23 np0005532585.localdomain podman[91089]: 2025-11-23 08:37:23.143674172 +0000 UTC m=+0.192263327 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:37:23 np0005532585.localdomain podman[91089]: 2025-11-23 08:37:23.157262689 +0000 UTC m=+0.205851844 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:37:23 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:37:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:37:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:37:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:37:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:37:26 np0005532585.localdomain podman[91128]: 2025-11-23 08:37:26.035199466 +0000 UTC m=+0.090728143 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond)
Nov 23 08:37:26 np0005532585.localdomain podman[91136]: 2025-11-23 08:37:26.042393697 +0000 UTC m=+0.087179535 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=nova_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:37:26 np0005532585.localdomain podman[91136]: 2025-11-23 08:37:26.062539315 +0000 UTC m=+0.107325183 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:37:26 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:37:26 np0005532585.localdomain podman[91128]: 2025-11-23 08:37:26.07119347 +0000 UTC m=+0.126722107 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 23 08:37:26 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:37:26 np0005532585.localdomain podman[91130]: 2025-11-23 08:37:26.127433145 +0000 UTC m=+0.175222305 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:37:26 np0005532585.localdomain podman[91129]: 2025-11-23 08:37:26.14356923 +0000 UTC m=+0.194803585 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:37:26 np0005532585.localdomain podman[91130]: 2025-11-23 08:37:26.177274204 +0000 UTC m=+0.225063424 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Nov 23 08:37:26 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:37:26 np0005532585.localdomain podman[91129]: 2025-11-23 08:37:26.199264738 +0000 UTC m=+0.250499103 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 23 08:37:26 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:37:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:37:27 np0005532585.localdomain podman[91227]: 2025-11-23 08:37:27.026128408 +0000 UTC m=+0.082622944 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:37:27 np0005532585.localdomain podman[91227]: 2025-11-23 08:37:27.42037068 +0000 UTC m=+0.476865206 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:37:27 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:37:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:37:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:37:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:37:29 np0005532585.localdomain podman[91250]: 2025-11-23 08:37:29.022125297 +0000 UTC m=+0.077470697 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:37:29 np0005532585.localdomain systemd[1]: tmp-crun.n8Udr8.mount: Deactivated successfully.
Nov 23 08:37:29 np0005532585.localdomain podman[91251]: 2025-11-23 08:37:29.083040994 +0000 UTC m=+0.133397751 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:37:29 np0005532585.localdomain podman[91252]: 2025-11-23 08:37:29.05546394 +0000 UTC m=+0.103012751 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, version=17.1.12)
Nov 23 08:37:29 np0005532585.localdomain podman[91251]: 2025-11-23 08:37:29.114096667 +0000 UTC m=+0.164453424 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:37:29 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:37:29 np0005532585.localdomain podman[91252]: 2025-11-23 08:37:29.141182448 +0000 UTC m=+0.188731229 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git)
Nov 23 08:37:29 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:37:29 np0005532585.localdomain podman[91250]: 2025-11-23 08:37:29.228323971 +0000 UTC m=+0.283669411 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team)
Nov 23 08:37:29 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:37:47 np0005532585.localdomain sudo[91326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:37:47 np0005532585.localdomain sudo[91326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:37:47 np0005532585.localdomain sudo[91326]: pam_unix(sudo:session): session closed for user root
Nov 23 08:37:47 np0005532585.localdomain sudo[91341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:37:47 np0005532585.localdomain sudo[91341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:37:48 np0005532585.localdomain sudo[91341]: pam_unix(sudo:session): session closed for user root
Nov 23 08:37:48 np0005532585.localdomain sudo[91388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:37:48 np0005532585.localdomain sudo[91388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:37:48 np0005532585.localdomain sudo[91388]: pam_unix(sudo:session): session closed for user root
Nov 23 08:37:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:37:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:37:54 np0005532585.localdomain systemd[1]: tmp-crun.CEAxQD.mount: Deactivated successfully.
Nov 23 08:37:54 np0005532585.localdomain podman[91403]: 2025-11-23 08:37:54.074453858 +0000 UTC m=+0.126039267 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 08:37:54 np0005532585.localdomain podman[91404]: 2025-11-23 08:37:54.049206034 +0000 UTC m=+0.100998179 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:37:54 np0005532585.localdomain podman[91403]: 2025-11-23 08:37:54.112282999 +0000 UTC m=+0.163868448 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z)
Nov 23 08:37:54 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:37:54 np0005532585.localdomain podman[91404]: 2025-11-23 08:37:54.131447577 +0000 UTC m=+0.183239752 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:37:54 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:37:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:37:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:37:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:37:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:37:57 np0005532585.localdomain podman[91440]: 2025-11-23 08:37:57.04252658 +0000 UTC m=+0.093280211 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64)
Nov 23 08:37:57 np0005532585.localdomain podman[91440]: 2025-11-23 08:37:57.057506809 +0000 UTC m=+0.108260490 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-18T22:49:32Z, vcs-type=git, name=rhosp17/openstack-cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=)
Nov 23 08:37:57 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:37:57 np0005532585.localdomain systemd[1]: tmp-crun.qr5Qbr.mount: Deactivated successfully.
Nov 23 08:37:57 np0005532585.localdomain podman[91442]: 2025-11-23 08:37:57.161559201 +0000 UTC m=+0.204400470 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z, tcib_managed=true, managed_by=tripleo_ansible)
Nov 23 08:37:57 np0005532585.localdomain podman[91441]: 2025-11-23 08:37:57.213936887 +0000 UTC m=+0.257278492 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z)
Nov 23 08:37:57 np0005532585.localdomain podman[91441]: 2025-11-23 08:37:57.24921476 +0000 UTC m=+0.292556365 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:37:57 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:37:57 np0005532585.localdomain podman[91442]: 2025-11-23 08:37:57.272491583 +0000 UTC m=+0.315332842 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:37:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:37:57 np0005532585.localdomain podman[91447]: 2025-11-23 08:37:57.352200508 +0000 UTC m=+0.391567210 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:37:57 np0005532585.localdomain podman[91447]: 2025-11-23 08:37:57.387423748 +0000 UTC m=+0.426790510 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Nov 23 08:37:57 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:37:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:37:58 np0005532585.localdomain podman[91538]: 2025-11-23 08:37:58.024609731 +0000 UTC m=+0.077560119 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:37:58 np0005532585.localdomain systemd[1]: tmp-crun.ZYSkNp.mount: Deactivated successfully.
Nov 23 08:37:58 np0005532585.localdomain podman[91538]: 2025-11-23 08:37:58.398328924 +0000 UTC m=+0.451279312 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 23 08:37:58 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:37:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:37:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:37:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:38:00 np0005532585.localdomain systemd[1]: tmp-crun.2nJEwu.mount: Deactivated successfully.
Nov 23 08:38:00 np0005532585.localdomain podman[91561]: 2025-11-23 08:38:00.048578149 +0000 UTC m=+0.103668641 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=metrics_qdr, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=)
Nov 23 08:38:00 np0005532585.localdomain podman[91562]: 2025-11-23 08:38:00.098032956 +0000 UTC m=+0.150597300 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:38:00 np0005532585.localdomain podman[91563]: 2025-11-23 08:38:00.152066803 +0000 UTC m=+0.199181910 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:38:00 np0005532585.localdomain podman[91562]: 2025-11-23 08:38:00.174226953 +0000 UTC m=+0.226791307 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 08:38:00 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:38:00 np0005532585.localdomain podman[91563]: 2025-11-23 08:38:00.226211576 +0000 UTC m=+0.273326773 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true)
Nov 23 08:38:00 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:38:00 np0005532585.localdomain podman[91561]: 2025-11-23 08:38:00.303424295 +0000 UTC m=+0.358514777 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:38:00 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:38:01 np0005532585.localdomain systemd[1]: tmp-crun.GmnmPR.mount: Deactivated successfully.
Nov 23 08:38:16 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:38:16 np0005532585.localdomain recover_tripleo_nova_virtqemud[91638]: 61756
Nov 23 08:38:16 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:38:16 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:38:17 np0005532585.localdomain sshd[91639]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:38:17 np0005532585.localdomain sshd[91639]: Invalid user ubuntu from 193.32.162.146 port 33956
Nov 23 08:38:18 np0005532585.localdomain sshd[91639]: Connection closed by invalid user ubuntu 193.32.162.146 port 33956 [preauth]
Nov 23 08:38:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:38:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:38:25 np0005532585.localdomain systemd[1]: tmp-crun.fh5b1E.mount: Deactivated successfully.
Nov 23 08:38:25 np0005532585.localdomain podman[91642]: 2025-11-23 08:38:25.044118202 +0000 UTC m=+0.095337956 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid)
Nov 23 08:38:25 np0005532585.localdomain podman[91642]: 2025-11-23 08:38:25.083302693 +0000 UTC m=+0.134522387 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true)
Nov 23 08:38:25 np0005532585.localdomain podman[91641]: 2025-11-23 08:38:25.094512427 +0000 UTC m=+0.145666999 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, distribution-scope=public, architecture=x86_64, tcib_managed=true)
Nov 23 08:38:25 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:38:25 np0005532585.localdomain podman[91641]: 2025-11-23 08:38:25.107564887 +0000 UTC m=+0.158719459 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 23 08:38:25 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:38:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:38:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:38:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:38:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:38:28 np0005532585.localdomain systemd[1]: tmp-crun.OJODhX.mount: Deactivated successfully.
Nov 23 08:38:28 np0005532585.localdomain podman[91681]: 2025-11-23 08:38:28.03786026 +0000 UTC m=+0.084527323 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z)
Nov 23 08:38:28 np0005532585.localdomain podman[91682]: 2025-11-23 08:38:28.065276951 +0000 UTC m=+0.107395244 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:38:28 np0005532585.localdomain podman[91681]: 2025-11-23 08:38:28.135557167 +0000 UTC m=+0.182224230 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Nov 23 08:38:28 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:38:28 np0005532585.localdomain podman[91682]: 2025-11-23 08:38:28.148969638 +0000 UTC m=+0.191087921 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 23 08:38:28 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:38:28 np0005532585.localdomain podman[91680]: 2025-11-23 08:38:28.140677964 +0000 UTC m=+0.190820703 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1)
Nov 23 08:38:28 np0005532585.localdomain podman[91683]: 2025-11-23 08:38:28.100166441 +0000 UTC m=+0.140079517 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:38:28 np0005532585.localdomain podman[91680]: 2025-11-23 08:38:28.225282069 +0000 UTC m=+0.275424878 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 23 08:38:28 np0005532585.localdomain podman[91683]: 2025-11-23 08:38:28.232273194 +0000 UTC m=+0.272186320 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:38:28 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:38:28 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:38:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:38:29 np0005532585.localdomain podman[91778]: 2025-11-23 08:38:29.022350225 +0000 UTC m=+0.079225871 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:38:29 np0005532585.localdomain systemd[1]: tmp-crun.8XXO5q.mount: Deactivated successfully.
Nov 23 08:38:29 np0005532585.localdomain podman[91778]: 2025-11-23 08:38:29.423728256 +0000 UTC m=+0.480603872 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:38:29 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:38:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:38:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:38:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:38:31 np0005532585.localdomain systemd[1]: tmp-crun.iQBf3Z.mount: Deactivated successfully.
Nov 23 08:38:31 np0005532585.localdomain podman[91803]: 2025-11-23 08:38:31.03617966 +0000 UTC m=+0.092139626 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:38:31 np0005532585.localdomain podman[91803]: 2025-11-23 08:38:31.079627913 +0000 UTC m=+0.135587879 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:38:31 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:38:31 np0005532585.localdomain podman[91804]: 2025-11-23 08:38:31.130653927 +0000 UTC m=+0.183650093 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:38:31 np0005532585.localdomain podman[91802]: 2025-11-23 08:38:31.080719236 +0000 UTC m=+0.140117209 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Nov 23 08:38:31 np0005532585.localdomain podman[91804]: 2025-11-23 08:38:31.210694933 +0000 UTC m=+0.263691059 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true)
Nov 23 08:38:31 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:38:31 np0005532585.localdomain podman[91802]: 2025-11-23 08:38:31.25953237 +0000 UTC m=+0.318930273 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:38:31 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:38:48 np0005532585.localdomain sudo[91876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:38:48 np0005532585.localdomain sudo[91876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:38:48 np0005532585.localdomain sudo[91876]: pam_unix(sudo:session): session closed for user root
Nov 23 08:38:48 np0005532585.localdomain sudo[91891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:38:48 np0005532585.localdomain sudo[91891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:38:49 np0005532585.localdomain sudo[91891]: pam_unix(sudo:session): session closed for user root
Nov 23 08:38:50 np0005532585.localdomain sudo[91939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:38:50 np0005532585.localdomain sudo[91939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:38:50 np0005532585.localdomain sudo[91939]: pam_unix(sudo:session): session closed for user root
Nov 23 08:38:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:38:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:38:56 np0005532585.localdomain systemd[1]: tmp-crun.mZgwQR.mount: Deactivated successfully.
Nov 23 08:38:56 np0005532585.localdomain podman[91955]: 2025-11-23 08:38:56.052090615 +0000 UTC m=+0.099354807 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64)
Nov 23 08:38:56 np0005532585.localdomain podman[91954]: 2025-11-23 08:38:56.082500428 +0000 UTC m=+0.131795023 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:38:56 np0005532585.localdomain podman[91954]: 2025-11-23 08:38:56.097697235 +0000 UTC m=+0.146991820 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-collectd)
Nov 23 08:38:56 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:38:56 np0005532585.localdomain podman[91955]: 2025-11-23 08:38:56.181875127 +0000 UTC m=+0.229139339 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:38:56 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:38:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:38:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:38:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:38:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:38:59 np0005532585.localdomain systemd[1]: tmp-crun.h7k7EM.mount: Deactivated successfully.
Nov 23 08:38:59 np0005532585.localdomain podman[91995]: 2025-11-23 08:38:59.043535453 +0000 UTC m=+0.094881171 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com)
Nov 23 08:38:59 np0005532585.localdomain podman[91996]: 2025-11-23 08:38:59.07798921 +0000 UTC m=+0.128410919 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4)
Nov 23 08:38:59 np0005532585.localdomain podman[91995]: 2025-11-23 08:38:59.090752702 +0000 UTC m=+0.142098400 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, 
url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 23 08:38:59 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:38:59 np0005532585.localdomain podman[91996]: 2025-11-23 08:38:59.135524025 +0000 UTC m=+0.185945714 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:38:59 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:38:59 np0005532585.localdomain podman[91994]: 2025-11-23 08:38:59.136562877 +0000 UTC m=+0.188981217 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, release=1761123044, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1)
Nov 23 08:38:59 np0005532585.localdomain podman[91997]: 2025-11-23 08:38:59.192346437 +0000 UTC m=+0.240202028 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Nov 23 08:38:59 np0005532585.localdomain podman[91997]: 2025-11-23 08:38:59.213011692 +0000 UTC m=+0.260867253 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:38:59 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:38:59 np0005532585.localdomain podman[91994]: 2025-11-23 08:38:59.267116911 +0000 UTC m=+0.319535271 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, container_name=logrotate_crond, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron)
Nov 23 08:38:59 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:38:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:39:00 np0005532585.localdomain podman[92091]: 2025-11-23 08:39:00.017001121 +0000 UTC m=+0.076251611 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target)
Nov 23 08:39:00 np0005532585.localdomain podman[92091]: 2025-11-23 08:39:00.408983253 +0000 UTC m=+0.468233733 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:39:00 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:39:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:39:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:39:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:39:02 np0005532585.localdomain systemd[1]: tmp-crun.mnfebG.mount: Deactivated successfully.
Nov 23 08:39:02 np0005532585.localdomain systemd[1]: tmp-crun.HPOVpi.mount: Deactivated successfully.
Nov 23 08:39:02 np0005532585.localdomain podman[92117]: 2025-11-23 08:39:02.092492947 +0000 UTC m=+0.145056429 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:39:02 np0005532585.localdomain podman[92118]: 2025-11-23 08:39:02.144467092 +0000 UTC m=+0.196043764 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ovn_controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller)
Nov 23 08:39:02 np0005532585.localdomain podman[92116]: 2025-11-23 08:39:02.059840576 +0000 UTC m=+0.114899265 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd)
Nov 23 08:39:02 np0005532585.localdomain podman[92117]: 2025-11-23 08:39:02.173349117 +0000 UTC m=+0.225912559 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc.)
Nov 23 08:39:02 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:39:02 np0005532585.localdomain podman[92118]: 2025-11-23 08:39:02.194436814 +0000 UTC m=+0.246013496 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
container_name=ovn_controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Nov 23 08:39:02 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:39:02 np0005532585.localdomain podman[92116]: 2025-11-23 08:39:02.259999805 +0000 UTC m=+0.315058564 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:39:02 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:39:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:39:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:39:27 np0005532585.localdomain podman[92190]: 2025-11-23 08:39:27.029833527 +0000 UTC m=+0.085448311 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:39:27 np0005532585.localdomain podman[92190]: 2025-11-23 08:39:27.065383478 +0000 UTC m=+0.120998252 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:39:27 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:39:27 np0005532585.localdomain podman[92191]: 2025-11-23 08:39:27.073842807 +0000 UTC m=+0.125840800 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:39:27 np0005532585.localdomain podman[92191]: 2025-11-23 08:39:27.156372538 +0000 UTC m=+0.208370611 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, version=17.1.12, vcs-type=git, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:39:27 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:39:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:39:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:39:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:39:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: tmp-crun.DqxTrD.mount: Deactivated successfully.
Nov 23 08:39:30 np0005532585.localdomain podman[92233]: 2025-11-23 08:39:30.032202259 +0000 UTC m=+0.078372475 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 23 08:39:30 np0005532585.localdomain podman[92233]: 2025-11-23 08:39:30.097560883 +0000 UTC m=+0.143731139 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: tmp-crun.p3syHU.mount: Deactivated successfully.
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:39:30 np0005532585.localdomain podman[92234]: 2025-11-23 08:39:30.145780103 +0000 UTC m=+0.189551145 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Nov 23 08:39:30 np0005532585.localdomain podman[92237]: 2025-11-23 08:39:30.101852665 +0000 UTC m=+0.142595244 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:39:30 np0005532585.localdomain podman[92232]: 2025-11-23 08:39:30.184927962 +0000 UTC m=+0.236503894 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp17/openstack-cron, vendor=Red Hat, Inc.)
Nov 23 08:39:30 np0005532585.localdomain podman[92234]: 2025-11-23 08:39:30.202200103 +0000 UTC m=+0.245971105 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible)
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:39:30 np0005532585.localdomain podman[92232]: 2025-11-23 08:39:30.22105358 +0000 UTC m=+0.272629502 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:39:30 np0005532585.localdomain podman[92237]: 2025-11-23 08:39:30.236422962 +0000 UTC m=+0.277165531 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:39:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:39:31 np0005532585.localdomain podman[92328]: 2025-11-23 08:39:31.02865715 +0000 UTC m=+0.081711367 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Nov 23 08:39:31 np0005532585.localdomain podman[92328]: 2025-11-23 08:39:31.403257898 +0000 UTC m=+0.456312125 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:39:31 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:39:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:39:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:39:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:39:33 np0005532585.localdomain systemd[1]: tmp-crun.CmdiHl.mount: Deactivated successfully.
Nov 23 08:39:33 np0005532585.localdomain systemd[1]: tmp-crun.63lwyq.mount: Deactivated successfully.
Nov 23 08:39:33 np0005532585.localdomain podman[92353]: 2025-11-23 08:39:33.052374268 +0000 UTC m=+0.105325402 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 08:39:33 np0005532585.localdomain podman[92351]: 2025-11-23 08:39:33.02014599 +0000 UTC m=+0.082299816 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:39:33 np0005532585.localdomain podman[92353]: 2025-11-23 08:39:33.078342664 +0000 UTC m=+0.131293758 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, 
build-date=2025-11-18T23:34:05Z, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 23 08:39:33 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:39:33 np0005532585.localdomain podman[92352]: 2025-11-23 08:39:33.154511551 +0000 UTC m=+0.209822917 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:39:33 np0005532585.localdomain podman[92352]: 2025-11-23 08:39:33.197383765 +0000 UTC m=+0.252695161 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:39:33 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:39:33 np0005532585.localdomain podman[92351]: 2025-11-23 08:39:33.208090604 +0000 UTC m=+0.270244480 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:39:33 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:39:34 np0005532585.localdomain systemd[1]: tmp-crun.qeVNoq.mount: Deactivated successfully.
Nov 23 08:39:38 np0005532585.localdomain sshd[92425]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:39:41 np0005532585.localdomain sshd[92425]: Invalid user user1 from 80.94.95.116 port 16488
Nov 23 08:39:41 np0005532585.localdomain sshd[92425]: Connection closed by invalid user user1 80.94.95.116 port 16488 [preauth]
Nov 23 08:39:50 np0005532585.localdomain sudo[92427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:39:50 np0005532585.localdomain sudo[92427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:39:50 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:39:50 np0005532585.localdomain sudo[92427]: pam_unix(sudo:session): session closed for user root
Nov 23 08:39:50 np0005532585.localdomain recover_tripleo_nova_virtqemud[92443]: 61756
Nov 23 08:39:50 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:39:50 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:39:50 np0005532585.localdomain sudo[92444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:39:50 np0005532585.localdomain sudo[92444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:39:51 np0005532585.localdomain sudo[92444]: pam_unix(sudo:session): session closed for user root
Nov 23 08:39:51 np0005532585.localdomain sudo[92491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:39:51 np0005532585.localdomain sudo[92491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:39:51 np0005532585.localdomain sudo[92491]: pam_unix(sudo:session): session closed for user root
Nov 23 08:39:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:39:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:39:58 np0005532585.localdomain systemd[1]: tmp-crun.ILbzxL.mount: Deactivated successfully.
Nov 23 08:39:58 np0005532585.localdomain podman[92507]: 2025-11-23 08:39:58.029563236 +0000 UTC m=+0.081459337 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Nov 23 08:39:58 np0005532585.localdomain podman[92507]: 2025-11-23 08:39:58.038977945 +0000 UTC m=+0.090874036 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, 
config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 23 08:39:58 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:39:58 np0005532585.localdomain systemd[1]: tmp-crun.QQG5Sn.mount: Deactivated successfully.
Nov 23 08:39:58 np0005532585.localdomain podman[92506]: 2025-11-23 08:39:58.091153441 +0000 UTC m=+0.142502625 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.openshift.expose-services=, container_name=collectd)
Nov 23 08:39:58 np0005532585.localdomain podman[92506]: 2025-11-23 08:39:58.098327902 +0000 UTC m=+0.149677066 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Nov 23 08:39:58 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:40:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:40:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:40:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:40:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: tmp-crun.DjXWpP.mount: Deactivated successfully.
Nov 23 08:40:01 np0005532585.localdomain podman[92547]: 2025-11-23 08:40:01.023498498 +0000 UTC m=+0.077678141 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Nov 23 08:40:01 np0005532585.localdomain podman[92547]: 2025-11-23 08:40:01.076048065 +0000 UTC m=+0.130227688 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: tmp-crun.9UwZs9.mount: Deactivated successfully.
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:40:01 np0005532585.localdomain podman[92545]: 2025-11-23 08:40:01.07882616 +0000 UTC m=+0.137292714 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:40:01 np0005532585.localdomain podman[92548]: 2025-11-23 08:40:01.134464991 +0000 UTC m=+0.186834198 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, version=17.1.12, container_name=nova_compute, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:40:01 np0005532585.localdomain podman[92548]: 2025-11-23 08:40:01.164150155 +0000 UTC m=+0.216519322 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:40:01 np0005532585.localdomain podman[92546]: 2025-11-23 08:40:01.180840178 +0000 UTC m=+0.236435534 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, tcib_managed=true)
Nov 23 08:40:01 np0005532585.localdomain podman[92546]: 2025-11-23 08:40:01.203693391 +0000 UTC m=+0.259288747 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:40:01 np0005532585.localdomain podman[92545]: 2025-11-23 08:40:01.215398051 +0000 UTC m=+0.273864625 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:40:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:40:02 np0005532585.localdomain systemd[1]: tmp-crun.670c6r.mount: Deactivated successfully.
Nov 23 08:40:02 np0005532585.localdomain podman[92639]: 2025-11-23 08:40:02.037746519 +0000 UTC m=+0.094264591 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, tcib_managed=true)
Nov 23 08:40:02 np0005532585.localdomain podman[92639]: 2025-11-23 08:40:02.452115637 +0000 UTC m=+0.508633709 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:40:02 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:40:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:40:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:40:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:40:04 np0005532585.localdomain podman[92660]: 2025-11-23 08:40:04.035021551 +0000 UTC m=+0.092163686 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, 
url=https://www.redhat.com, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:40:04 np0005532585.localdomain podman[92661]: 2025-11-23 08:40:04.086139393 +0000 UTC m=+0.142310249 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Nov 23 08:40:04 np0005532585.localdomain podman[92662]: 2025-11-23 08:40:04.136501452 +0000 UTC m=+0.188343534 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, 
vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller)
Nov 23 08:40:04 np0005532585.localdomain podman[92661]: 2025-11-23 08:40:04.150225335 +0000 UTC m=+0.206396241 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64)
Nov 23 08:40:04 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:40:04 np0005532585.localdomain podman[92662]: 2025-11-23 08:40:04.206816655 +0000 UTC m=+0.258658767 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 23 08:40:04 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully.
Nov 23 08:40:04 np0005532585.localdomain podman[92660]: 2025-11-23 08:40:04.266679477 +0000 UTC m=+0.323821552 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Nov 23 08:40:04 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:40:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:40:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:40:29 np0005532585.localdomain systemd[1]: tmp-crun.rPL3Gl.mount: Deactivated successfully.
Nov 23 08:40:29 np0005532585.localdomain podman[92735]: 2025-11-23 08:40:29.022360732 +0000 UTC m=+0.076786693 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, distribution-scope=public, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=)
Nov 23 08:40:29 np0005532585.localdomain podman[92735]: 2025-11-23 08:40:29.034197756 +0000 UTC m=+0.088623737 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Nov 23 08:40:29 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:40:29 np0005532585.localdomain systemd[1]: tmp-crun.vJ8Fr1.mount: Deactivated successfully.
Nov 23 08:40:29 np0005532585.localdomain podman[92734]: 2025-11-23 08:40:29.088039052 +0000 UTC m=+0.144800385 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 23 08:40:29 np0005532585.localdomain podman[92734]: 2025-11-23 08:40:29.096480532 +0000 UTC m=+0.153241835 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-collectd)
Nov 23 08:40:29 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:40:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:40:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:40:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:40:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:40:32 np0005532585.localdomain systemd[1]: tmp-crun.KxR9N6.mount: Deactivated successfully.
Nov 23 08:40:32 np0005532585.localdomain podman[92773]: 2025-11-23 08:40:32.045531473 +0000 UTC m=+0.100943616 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 08:40:32 np0005532585.localdomain podman[92773]: 2025-11-23 08:40:32.052017012 +0000 UTC m=+0.107429145 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:40:32 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:40:32 np0005532585.localdomain podman[92777]: 2025-11-23 08:40:32.093440377 +0000 UTC m=+0.139986097 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z)
Nov 23 08:40:32 np0005532585.localdomain podman[92775]: 2025-11-23 08:40:32.136666936 +0000 UTC m=+0.184615560 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:40:32 np0005532585.localdomain podman[92777]: 2025-11-23 08:40:32.144280531 +0000 UTC m=+0.190826291 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:40:32 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:40:32 np0005532585.localdomain podman[92774]: 2025-11-23 08:40:32.193235346 +0000 UTC m=+0.245853043 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12)
Nov 23 08:40:32 np0005532585.localdomain podman[92775]: 2025-11-23 08:40:32.217325718 +0000 UTC m=+0.265274292 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:40:32 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:40:32 np0005532585.localdomain podman[92774]: 2025-11-23 08:40:32.269040788 +0000 UTC m=+0.321658495 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute)
Nov 23 08:40:32 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:40:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:40:33 np0005532585.localdomain podman[92872]: 2025-11-23 08:40:33.01834259 +0000 UTC m=+0.075547125 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, 
url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 23 08:40:33 np0005532585.localdomain podman[92872]: 2025-11-23 08:40:33.355003666 +0000 UTC m=+0.412208161 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:40:33 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:40:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:40:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:40:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:40:35 np0005532585.localdomain podman[92896]: 2025-11-23 08:40:35.029853039 +0000 UTC m=+0.081797267 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:40:35 np0005532585.localdomain podman[92895]: 2025-11-23 08:40:35.090158534 +0000 UTC m=+0.144191267 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-type=git)
Nov 23 08:40:35 np0005532585.localdomain podman[92896]: 2025-11-23 08:40:35.102405491 +0000 UTC m=+0.154349699 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Nov 23 08:40:35 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:40:35 np0005532585.localdomain podman[92897]: 2025-11-23 08:40:35.150951024 +0000 UTC m=+0.198782216 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com)
Nov 23 08:40:35 np0005532585.localdomain podman[92897]: 2025-11-23 08:40:35.187653383 +0000 UTC m=+0.235484575 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Nov 23 08:40:35 np0005532585.localdomain podman[92897]: unhealthy
Nov 23 08:40:35 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:40:35 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:40:35 np0005532585.localdomain podman[92895]: 2025-11-23 08:40:35.320585443 +0000 UTC m=+0.374618186 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:40:35 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:40:36 np0005532585.localdomain systemd[1]: tmp-crun.rKTh85.mount: Deactivated successfully.
Nov 23 08:40:52 np0005532585.localdomain sudo[92975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:40:52 np0005532585.localdomain sudo[92975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:40:52 np0005532585.localdomain sudo[92975]: pam_unix(sudo:session): session closed for user root
Nov 23 08:40:52 np0005532585.localdomain sudo[92990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:40:52 np0005532585.localdomain sudo[92990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:40:52 np0005532585.localdomain sudo[92990]: pam_unix(sudo:session): session closed for user root
Nov 23 08:40:53 np0005532585.localdomain sudo[93037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:40:53 np0005532585.localdomain sudo[93037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:40:53 np0005532585.localdomain sudo[93037]: pam_unix(sudo:session): session closed for user root
Nov 23 08:40:53 np0005532585.localdomain sudo[93052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 08:40:53 np0005532585.localdomain sudo[93052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:40:53 np0005532585.localdomain sudo[93052]: pam_unix(sudo:session): session closed for user root
Nov 23 08:40:57 np0005532585.localdomain sudo[93085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:40:57 np0005532585.localdomain sudo[93085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:40:57 np0005532585.localdomain sudo[93085]: pam_unix(sudo:session): session closed for user root
Nov 23 08:40:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:40:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:41:00 np0005532585.localdomain podman[93101]: 2025-11-23 08:41:00.033706838 +0000 UTC m=+0.083628433 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:41:00 np0005532585.localdomain podman[93101]: 2025-11-23 08:41:00.039275819 +0000 UTC m=+0.089197394 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:41:00 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:41:00 np0005532585.localdomain podman[93100]: 2025-11-23 08:41:00.078483785 +0000 UTC m=+0.127167152 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container)
Nov 23 08:41:00 np0005532585.localdomain podman[93100]: 2025-11-23 08:41:00.09128326 +0000 UTC m=+0.139966677 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:41:00 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:41:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:41:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:41:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:41:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: tmp-crun.5NDzvE.mount: Deactivated successfully.
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: tmp-crun.u0dC1O.mount: Deactivated successfully.
Nov 23 08:41:03 np0005532585.localdomain podman[93139]: 2025-11-23 08:41:03.09166191 +0000 UTC m=+0.146174348 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Nov 23 08:41:03 np0005532585.localdomain podman[93139]: 2025-11-23 08:41:03.097754397 +0000 UTC m=+0.152266815 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64)
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:41:03 np0005532585.localdomain podman[93140]: 2025-11-23 08:41:03.139733488 +0000 UTC m=+0.192384529 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:41:03 np0005532585.localdomain podman[93142]: 2025-11-23 08:41:03.051638219 +0000 UTC m=+0.100416631 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://www.redhat.com, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:41:03 np0005532585.localdomain podman[93142]: 2025-11-23 08:41:03.180775541 +0000 UTC m=+0.229553943 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:41:03 np0005532585.localdomain podman[93140]: 2025-11-23 08:41:03.191944425 +0000 UTC m=+0.244595466 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-type=git, 
com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:41:03 np0005532585.localdomain podman[93141]: 2025-11-23 08:41:03.186696313 +0000 UTC m=+0.237535568 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 23 08:41:03 np0005532585.localdomain podman[93141]: 2025-11-23 08:41:03.270334386 +0000 UTC m=+0.321173691 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:12:45Z, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044)
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:41:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:41:04 np0005532585.localdomain podman[93237]: 2025-11-23 08:41:04.020121411 +0000 UTC m=+0.075664248 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, version=17.1.12)
Nov 23 08:41:04 np0005532585.localdomain podman[93237]: 2025-11-23 08:41:04.361616727 +0000 UTC m=+0.417159524 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:41:04 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:41:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:41:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:41:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:41:06 np0005532585.localdomain systemd[1]: tmp-crun.2KcJ25.mount: Deactivated successfully.
Nov 23 08:41:06 np0005532585.localdomain podman[93260]: 2025-11-23 08:41:06.037158781 +0000 UTC m=+0.095494459 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:41:06 np0005532585.localdomain podman[93262]: 2025-11-23 08:41:06.076114389 +0000 UTC m=+0.127755460 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, 
io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:41:06 np0005532585.localdomain podman[93261]: 2025-11-23 08:41:06.134260578 +0000 UTC m=+0.189316025 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 23 08:41:06 np0005532585.localdomain podman[93262]: 2025-11-23 08:41:06.159748442 +0000 UTC m=+0.211389523 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=)
Nov 23 08:41:06 np0005532585.localdomain podman[93262]: unhealthy
Nov 23 08:41:06 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:41:06 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:41:06 np0005532585.localdomain podman[93261]: 2025-11-23 08:41:06.207154611 +0000 UTC m=+0.262210068 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 23 08:41:06 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully.
Nov 23 08:41:06 np0005532585.localdomain podman[93260]: 2025-11-23 08:41:06.265777414 +0000 UTC m=+0.324113022 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 23 08:41:06 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:41:30 np0005532585.localdomain sshd[93338]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:41:30 np0005532585.localdomain recover_tripleo_nova_virtqemud[93353]: 61756
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: tmp-crun.YGdY9t.mount: Deactivated successfully.
Nov 23 08:41:30 np0005532585.localdomain podman[93341]: 2025-11-23 08:41:30.447148049 +0000 UTC m=+0.092934240 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20251118.1)
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: tmp-crun.LzdEvV.mount: Deactivated successfully.
Nov 23 08:41:30 np0005532585.localdomain podman[93341]: 2025-11-23 08:41:30.487242792 +0000 UTC m=+0.133028933 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, distribution-scope=public, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:41:30 np0005532585.localdomain podman[93340]: 2025-11-23 08:41:30.491143713 +0000 UTC m=+0.138948746 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:41:30 np0005532585.localdomain podman[93340]: 2025-11-23 08:41:30.571219945 +0000 UTC m=+0.219024988 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:41:30 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:41:30 np0005532585.localdomain sshd[93338]: Invalid user validator from 193.32.162.146 port 43102
Nov 23 08:41:30 np0005532585.localdomain sshd[93338]: Connection closed by invalid user validator 193.32.162.146 port 43102 [preauth]
Nov 23 08:41:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:41:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:41:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:41:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:41:34 np0005532585.localdomain podman[93384]: 2025-11-23 08:41:34.033284208 +0000 UTC m=+0.081594621 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:41:34 np0005532585.localdomain podman[93384]: 2025-11-23 08:41:34.055146201 +0000 UTC m=+0.103456604 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 23 08:41:34 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:41:34 np0005532585.localdomain podman[93385]: 2025-11-23 08:41:34.127914839 +0000 UTC m=+0.169355111 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Nov 23 08:41:34 np0005532585.localdomain podman[93385]: 2025-11-23 08:41:34.148140611 +0000 UTC m=+0.189580863 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z)
Nov 23 08:41:34 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:41:34 np0005532585.localdomain podman[93389]: 2025-11-23 08:41:34.18610263 +0000 UTC m=+0.224784277 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Nov 23 08:41:34 np0005532585.localdomain podman[93389]: 2025-11-23 08:41:34.20271624 +0000 UTC m=+0.241397877 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 23 08:41:34 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:41:34 np0005532585.localdomain podman[93383]: 2025-11-23 08:41:34.244972851 +0000 UTC m=+0.291412376 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1761123044, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, name=rhosp17/openstack-cron, io.buildah.version=1.41.4)
Nov 23 08:41:34 np0005532585.localdomain podman[93383]: 2025-11-23 08:41:34.252281395 +0000 UTC m=+0.298720990 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, version=17.1.12)
Nov 23 08:41:34 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:41:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:41:35 np0005532585.localdomain podman[93482]: 2025-11-23 08:41:35.013498873 +0000 UTC m=+0.074919066 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:41:35 np0005532585.localdomain systemd[1]: tmp-crun.aQF0qe.mount: Deactivated successfully.
Nov 23 08:41:35 np0005532585.localdomain podman[93482]: 2025-11-23 08:41:35.393239744 +0000 UTC m=+0.454659977 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:41:35 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:41:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:41:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:41:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:41:37 np0005532585.localdomain systemd[1]: tmp-crun.weqJQz.mount: Deactivated successfully.
Nov 23 08:41:37 np0005532585.localdomain podman[93507]: 2025-11-23 08:41:37.091362864 +0000 UTC m=+0.129352471 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 08:41:37 np0005532585.localdomain podman[93505]: 2025-11-23 08:41:37.061763793 +0000 UTC m=+0.106388004 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:41:37 np0005532585.localdomain podman[93507]: 2025-11-23 08:41:37.136285186 +0000 UTC m=+0.174274813 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:41:37 np0005532585.localdomain podman[93507]: unhealthy
Nov 23 08:41:37 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:41:37 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:41:37 np0005532585.localdomain podman[93506]: 2025-11-23 08:41:37.144925111 +0000 UTC m=+0.185250039 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:41:37 np0005532585.localdomain podman[93506]: 2025-11-23 08:41:37.231262267 +0000 UTC m=+0.271587125 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, batch=17.1_20251118.1)
Nov 23 08:41:37 np0005532585.localdomain podman[93506]: unhealthy
Nov 23 08:41:37 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:41:37 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:41:37 np0005532585.localdomain podman[93505]: 2025-11-23 08:41:37.312457345 +0000 UTC m=+0.357081576 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:46Z, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64)
Nov 23 08:41:37 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:41:38 np0005532585.localdomain systemd[1]: tmp-crun.i0oLgG.mount: Deactivated successfully.
Nov 23 08:41:45 np0005532585.localdomain sshd[93574]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:41:47 np0005532585.localdomain sshd[93574]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 08:41:47 np0005532585.localdomain sshd[93574]: Connection closed by 58.211.39.86 port 59027
Nov 23 08:41:57 np0005532585.localdomain sudo[93575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:41:57 np0005532585.localdomain sudo[93575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:41:57 np0005532585.localdomain sudo[93575]: pam_unix(sudo:session): session closed for user root
Nov 23 08:41:57 np0005532585.localdomain sudo[93590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:41:57 np0005532585.localdomain sudo[93590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:41:57 np0005532585.localdomain sudo[93590]: pam_unix(sudo:session): session closed for user root
Nov 23 08:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:42:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:42:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:42:01 np0005532585.localdomain systemd[1]: tmp-crun.V3v1P3.mount: Deactivated successfully.
Nov 23 08:42:01 np0005532585.localdomain podman[93637]: 2025-11-23 08:42:01.035671518 +0000 UTC m=+0.088751682 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3)
Nov 23 08:42:01 np0005532585.localdomain podman[93637]: 2025-11-23 08:42:01.045157409 +0000 UTC m=+0.098237593 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:42:01 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:42:01 np0005532585.localdomain podman[93638]: 2025-11-23 08:42:01.109910401 +0000 UTC m=+0.163154490 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO 
Team, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:42:01 np0005532585.localdomain podman[93638]: 2025-11-23 08:42:01.121155777 +0000 UTC m=+0.174399886 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Nov 23 08:42:01 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:42:02 np0005532585.localdomain sudo[93675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:42:02 np0005532585.localdomain sudo[93675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:42:02 np0005532585.localdomain sudo[93675]: pam_unix(sudo:session): session closed for user root
Nov 23 08:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:42:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:42:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:42:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:42:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:42:05 np0005532585.localdomain podman[93695]: 2025-11-23 08:42:05.02203042 +0000 UTC m=+0.068632123 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git)
Nov 23 08:42:05 np0005532585.localdomain podman[93692]: 2025-11-23 08:42:05.082751388 +0000 UTC m=+0.129000949 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 23 08:42:05 np0005532585.localdomain podman[93695]: 2025-11-23 08:42:05.098013627 +0000 UTC m=+0.144615380 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:42:05 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:42:05 np0005532585.localdomain podman[93692]: 2025-11-23 08:42:05.111269755 +0000 UTC m=+0.157519346 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com)
Nov 23 08:42:05 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:42:05 np0005532585.localdomain podman[93690]: 2025-11-23 08:42:05.194442093 +0000 UTC m=+0.247133113 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
build-date=2025-11-18T22:49:32Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-cron-container)
Nov 23 08:42:05 np0005532585.localdomain podman[93690]: 2025-11-23 08:42:05.23137462 +0000 UTC m=+0.284065670 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc.)
Nov 23 08:42:05 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:42:05 np0005532585.localdomain podman[93691]: 2025-11-23 08:42:05.24566737 +0000 UTC m=+0.295162552 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, 
config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:42:05 np0005532585.localdomain podman[93691]: 2025-11-23 08:42:05.302384844 +0000 UTC m=+0.351880016 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:42:05 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:42:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:42:06 np0005532585.localdomain systemd[1]: tmp-crun.A1BgES.mount: Deactivated successfully.
Nov 23 08:42:06 np0005532585.localdomain podman[93787]: 2025-11-23 08:42:06.022644122 +0000 UTC m=+0.076109113 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:42:06 np0005532585.localdomain podman[93787]: 2025-11-23 08:42:06.386287318 +0000 UTC m=+0.439752269 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=)
Nov 23 08:42:06 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:42:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:42:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:42:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:42:08 np0005532585.localdomain systemd[1]: tmp-crun.V1MuL1.mount: Deactivated successfully.
Nov 23 08:42:08 np0005532585.localdomain podman[93810]: 2025-11-23 08:42:08.03511766 +0000 UTC m=+0.086637585 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 23 08:42:08 np0005532585.localdomain podman[93809]: 2025-11-23 08:42:08.105533757 +0000 UTC m=+0.156661920 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:42:08 np0005532585.localdomain podman[93810]: 2025-11-23 08:42:08.131542447 +0000 UTC m=+0.183062402 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:42:08 np0005532585.localdomain podman[93810]: unhealthy
Nov 23 08:42:08 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:42:08 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:42:08 np0005532585.localdomain podman[93811]: 2025-11-23 08:42:08.198465656 +0000 UTC m=+0.243942005 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 08:42:08 np0005532585.localdomain podman[93811]: 2025-11-23 08:42:08.241394386 +0000 UTC m=+0.286870745 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:42:08 np0005532585.localdomain podman[93811]: unhealthy
Nov 23 08:42:08 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:42:08 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:42:08 np0005532585.localdomain podman[93809]: 2025-11-23 08:42:08.322646756 +0000 UTC m=+0.373774849 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:42:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:42:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:42:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:42:31 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:42:31 np0005532585.localdomain recover_tripleo_nova_virtqemud[93889]: 61756
Nov 23 08:42:31 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:42:31 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:42:32 np0005532585.localdomain systemd[1]: tmp-crun.hUTLgH.mount: Deactivated successfully.
Nov 23 08:42:32 np0005532585.localdomain podman[93877]: 2025-11-23 08:42:32.013926075 +0000 UTC m=+0.072286905 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, 
managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, version=17.1.12, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:42:32 np0005532585.localdomain podman[93877]: 2025-11-23 08:42:32.023114337 +0000 UTC m=+0.081475217 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, 
com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Nov 23 08:42:32 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:42:32 np0005532585.localdomain podman[93876]: 2025-11-23 08:42:32.090683516 +0000 UTC m=+0.147740915 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 23 08:42:32 np0005532585.localdomain podman[93876]: 2025-11-23 08:42:32.102226602 +0000 UTC m=+0.159284041 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:42:32 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:42:34 np0005532585.localdomain sshd[93915]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:42:34 np0005532585.localdomain sshd[93916]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:42:34 np0005532585.localdomain sshd[93916]: error: kex_exchange_identification: read: Connection reset by peer
Nov 23 08:42:34 np0005532585.localdomain sshd[93916]: Connection reset by 45.140.17.97 port 12473
Nov 23 08:42:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:42:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:42:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:42:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:42:36 np0005532585.localdomain systemd[1]: tmp-crun.6nE2KU.mount: Deactivated successfully.
Nov 23 08:42:36 np0005532585.localdomain podman[93918]: 2025-11-23 08:42:36.047864949 +0000 UTC m=+0.100718469 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:42:36 np0005532585.localdomain podman[93917]: 2025-11-23 08:42:36.103295915 +0000 UTC m=+0.156114074 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 23 08:42:36 np0005532585.localdomain podman[93917]: 2025-11-23 08:42:36.110157915 +0000 UTC m=+0.162976034 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, release=1761123044, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:42:36 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:42:36 np0005532585.localdomain podman[93919]: 2025-11-23 08:42:36.077999827 +0000 UTC m=+0.132176467 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:42:36 np0005532585.localdomain podman[93919]: 2025-11-23 08:42:36.158237424 +0000 UTC m=+0.212414004 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4)
Nov 23 08:42:36 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:42:36 np0005532585.localdomain podman[93918]: 2025-11-23 08:42:36.179602912 +0000 UTC m=+0.232456432 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z)
Nov 23 08:42:36 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:42:36 np0005532585.localdomain podman[93920]: 2025-11-23 08:42:36.233739507 +0000 UTC m=+0.281643725 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:42:36 np0005532585.localdomain podman[93920]: 2025-11-23 08:42:36.276339078 +0000 UTC m=+0.324243356 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute)
Nov 23 08:42:36 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:42:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:42:37 np0005532585.localdomain podman[94013]: 2025-11-23 08:42:37.023170572 +0000 UTC m=+0.079869257 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:42:37 np0005532585.localdomain podman[94013]: 2025-11-23 08:42:37.431272027 +0000 UTC m=+0.487970652 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 08:42:37 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:42:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:42:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:42:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:42:39 np0005532585.localdomain podman[94037]: 2025-11-23 08:42:39.026438358 +0000 UTC m=+0.076808473 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 08:42:39 np0005532585.localdomain podman[94037]: 2025-11-23 08:42:39.044495285 +0000 UTC m=+0.094865350 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Nov 23 08:42:39 np0005532585.localdomain podman[94037]: unhealthy
Nov 23 08:42:39 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:42:39 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:42:39 np0005532585.localdomain podman[94036]: 2025-11-23 08:42:39.131408638 +0000 UTC m=+0.185795707 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64)
Nov 23 08:42:39 np0005532585.localdomain podman[94036]: 2025-11-23 08:42:39.145380448 +0000 UTC m=+0.199767507 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044)
Nov 23 08:42:39 np0005532585.localdomain podman[94036]: unhealthy
Nov 23 08:42:39 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:42:39 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:42:39 np0005532585.localdomain podman[94035]: 2025-11-23 08:42:39.234584252 +0000 UTC m=+0.292311234 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git)
Nov 23 08:42:39 np0005532585.localdomain podman[94035]: 2025-11-23 08:42:39.454366193 +0000 UTC m=+0.512093155 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 23 08:42:39 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:43:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:43:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:43:03 np0005532585.localdomain sudo[94122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:43:03 np0005532585.localdomain systemd[1]: tmp-crun.0Ki2Jy.mount: Deactivated successfully.
Nov 23 08:43:03 np0005532585.localdomain sudo[94122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:43:03 np0005532585.localdomain sudo[94122]: pam_unix(sudo:session): session closed for user root
Nov 23 08:43:03 np0005532585.localdomain podman[94104]: 2025-11-23 08:43:03.034011809 +0000 UTC m=+0.087132841 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, container_name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:43:03 np0005532585.localdomain podman[94104]: 2025-11-23 08:43:03.070245314 +0000 UTC m=+0.123366326 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4)
Nov 23 08:43:03 np0005532585.localdomain podman[94103]: 2025-11-23 08:43:03.089982431 +0000 UTC m=+0.144666062 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Nov 23 08:43:03 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:43:03 np0005532585.localdomain sudo[94147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:43:03 np0005532585.localdomain sudo[94147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:43:03 np0005532585.localdomain podman[94103]: 2025-11-23 08:43:03.106249241 +0000 UTC m=+0.160932862 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, release=1761123044, architecture=x86_64, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:43:03 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:43:03 np0005532585.localdomain sudo[94147]: pam_unix(sudo:session): session closed for user root
Nov 23 08:43:04 np0005532585.localdomain sudo[94201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:43:04 np0005532585.localdomain sudo[94201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:43:04 np0005532585.localdomain sudo[94201]: pam_unix(sudo:session): session closed for user root
Nov 23 08:43:04 np0005532585.localdomain sudo[94216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 08:43:04 np0005532585.localdomain sudo[94216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 2025-11-23 08:43:04.623166716 +0000 UTC m=+0.073000047 container create 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 2025-11-23 08:43:04.595585098 +0000 UTC m=+0.045418449 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 08:43:04 np0005532585.localdomain systemd[1]: Started libpod-conmon-4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2.scope.
Nov 23 08:43:04 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 2025-11-23 08:43:04.746255803 +0000 UTC m=+0.196089134 container init 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, release=553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, name=rhceph, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 2025-11-23 08:43:04.756967922 +0000 UTC m=+0.206801263 container start 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, architecture=x86_64, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 2025-11-23 08:43:04.757388505 +0000 UTC m=+0.207221876 container attach 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 08:43:04 np0005532585.localdomain stupefied_keldysh[94288]: 167 167
Nov 23 08:43:04 np0005532585.localdomain systemd[1]: libpod-4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2.scope: Deactivated successfully.
Nov 23 08:43:04 np0005532585.localdomain podman[94272]: 2025-11-23 08:43:04.761082409 +0000 UTC m=+0.210915760 container died 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64)
Nov 23 08:43:04 np0005532585.localdomain podman[94293]: 2025-11-23 08:43:04.856524745 +0000 UTC m=+0.081982284 container remove 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Nov 23 08:43:04 np0005532585.localdomain systemd[1]: libpod-conmon-4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2.scope: Deactivated successfully.
Nov 23 08:43:05 np0005532585.localdomain podman[94314]: 
Nov 23 08:43:05 np0005532585.localdomain podman[94314]: 2025-11-23 08:43:05.069052503 +0000 UTC m=+0.067143777 container create 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 08:43:05 np0005532585.localdomain systemd[1]: Started libpod-conmon-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope.
Nov 23 08:43:05 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 08:43:05 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 08:43:05 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 08:43:05 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 08:43:05 np0005532585.localdomain podman[94314]: 2025-11-23 08:43:05.134758014 +0000 UTC m=+0.132849298 container init 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, architecture=x86_64, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=)
Nov 23 08:43:05 np0005532585.localdomain podman[94314]: 2025-11-23 08:43:05.037594525 +0000 UTC m=+0.035685839 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 08:43:05 np0005532585.localdomain podman[94314]: 2025-11-23 08:43:05.144517835 +0000 UTC m=+0.142609109 container start 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Nov 23 08:43:05 np0005532585.localdomain podman[94314]: 2025-11-23 08:43:05.144787273 +0000 UTC m=+0.142878617 container attach 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, name=rhceph, architecture=x86_64, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, release=553)
Nov 23 08:43:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5458e3939bc568714f5f971b053d115fd385613127ac957a70b6a71eb4798fc7-merged.mount: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]: [
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:     {
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "available": false,
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "ceph_device": false,
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "lsm_data": {},
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "lvs": [],
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "path": "/dev/sr0",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "rejected_reasons": [
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "Insufficient space (<5GB)",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "Has a FileSystem"
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         ],
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         "sys_api": {
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "actuators": null,
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "device_nodes": "sr0",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "human_readable_size": "482.00 KB",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "id_bus": "ata",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "model": "QEMU DVD-ROM",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "nr_requests": "2",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "partitions": {},
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "path": "/dev/sr0",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "removable": "1",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "rev": "2.5+",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "ro": "0",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "rotational": "1",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "sas_address": "",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "sas_device_handle": "",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "scheduler_mode": "mq-deadline",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "sectors": 0,
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "sectorsize": "2048",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "size": 493568.0,
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "support_discard": "0",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "type": "disk",
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:             "vendor": "QEMU"
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:         }
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]:     }
Nov 23 08:43:06 np0005532585.localdomain priceless_wing[94330]: ]
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: libpod-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: libpod-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope: Consumed 1.041s CPU time.
Nov 23 08:43:06 np0005532585.localdomain podman[94314]: 2025-11-23 08:43:06.161247911 +0000 UTC m=+1.159339175 container died 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, vendor=Red Hat, Inc., release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c-merged.mount: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain podman[96444]: 2025-11-23 08:43:06.250072814 +0000 UTC m=+0.081083946 container remove 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: libpod-conmon-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain sudo[94216]: pam_unix(sudo:session): session closed for user root
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:43:06 np0005532585.localdomain podman[96454]: 2025-11-23 08:43:06.325831784 +0000 UTC m=+0.129261637 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:43:06 np0005532585.localdomain podman[96452]: 2025-11-23 08:43:06.329829278 +0000 UTC m=+0.133015753 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-11-18T22:49:32Z, release=1761123044, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12)
Nov 23 08:43:06 np0005532585.localdomain podman[96452]: 2025-11-23 08:43:06.335730169 +0000 UTC m=+0.138916614 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044)
Nov 23 08:43:06 np0005532585.localdomain podman[96454]: 2025-11-23 08:43:06.344951373 +0000 UTC m=+0.148381256 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public)
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain podman[96499]: 2025-11-23 08:43:06.405508526 +0000 UTC m=+0.070576093 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Nov 23 08:43:06 np0005532585.localdomain podman[96499]: 2025-11-23 08:43:06.425775039 +0000 UTC m=+0.090842606 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain podman[96476]: 2025-11-23 08:43:06.473313882 +0000 UTC m=+0.224838718 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, 
com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:43:06 np0005532585.localdomain podman[96476]: 2025-11-23 08:43:06.491630275 +0000 UTC m=+0.243155021 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:43:06 np0005532585.localdomain systemd[1]: tmp-crun.w0MmA0.mount: Deactivated successfully.
Nov 23 08:43:07 np0005532585.localdomain sudo[96556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:43:07 np0005532585.localdomain sudo[96556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:43:07 np0005532585.localdomain sudo[96556]: pam_unix(sudo:session): session closed for user root
Nov 23 08:43:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:43:08 np0005532585.localdomain systemd[1]: tmp-crun.zoNRhs.mount: Deactivated successfully.
Nov 23 08:43:08 np0005532585.localdomain podman[96571]: 2025-11-23 08:43:08.020976412 +0000 UTC m=+0.079205548 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12)
Nov 23 08:43:08 np0005532585.localdomain podman[96571]: 2025-11-23 08:43:08.390305553 +0000 UTC m=+0.448534719 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:43:08 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:43:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:43:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:43:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:43:10 np0005532585.localdomain podman[96593]: 2025-11-23 08:43:10.030071288 +0000 UTC m=+0.088622188 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:43:10 np0005532585.localdomain systemd[1]: tmp-crun.TuYjPP.mount: Deactivated successfully.
Nov 23 08:43:10 np0005532585.localdomain podman[96600]: 2025-11-23 08:43:10.140818244 +0000 UTC m=+0.186870750 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public)
Nov 23 08:43:10 np0005532585.localdomain podman[96600]: 2025-11-23 08:43:10.154155025 +0000 UTC m=+0.200207551 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:43:10 np0005532585.localdomain podman[96600]: unhealthy
Nov 23 08:43:10 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:43:10 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:43:10 np0005532585.localdomain podman[96594]: 2025-11-23 08:43:10.11826808 +0000 UTC m=+0.168210285 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:43:10 np0005532585.localdomain podman[96594]: 2025-11-23 08:43:10.197990912 +0000 UTC m=+0.247933117 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Nov 23 08:43:10 np0005532585.localdomain podman[96594]: unhealthy
Nov 23 08:43:10 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:43:10 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:43:10 np0005532585.localdomain podman[96593]: 2025-11-23 08:43:10.255741789 +0000 UTC m=+0.314292669 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64)
Nov 23 08:43:10 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:43:11 np0005532585.localdomain systemd[1]: tmp-crun.r6xsUY.mount: Deactivated successfully.
Nov 23 08:43:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:43:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:43:34 np0005532585.localdomain podman[96662]: 2025-11-23 08:43:34.044668274 +0000 UTC m=+0.097960395 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20251118.1)
Nov 23 08:43:34 np0005532585.localdomain podman[96662]: 2025-11-23 08:43:34.05625189 +0000 UTC m=+0.109544001 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:43:34 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:43:34 np0005532585.localdomain podman[96663]: 2025-11-23 08:43:34.143060781 +0000 UTC m=+0.191616636 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack 
Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044)
Nov 23 08:43:34 np0005532585.localdomain podman[96663]: 2025-11-23 08:43:34.156269187 +0000 UTC m=+0.204825052 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:43:34 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:43:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:43:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:43:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:43:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:43:37 np0005532585.localdomain podman[96701]: 2025-11-23 08:43:37.054262317 +0000 UTC m=+0.097927204 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=logrotate_crond, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:43:37 np0005532585.localdomain podman[96701]: 2025-11-23 08:43:37.092620197 +0000 UTC m=+0.136285084 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 23 08:43:37 np0005532585.localdomain podman[96702]: 2025-11-23 08:43:37.106621088 +0000 UTC m=+0.146334352 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:43:37 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:43:37 np0005532585.localdomain podman[96704]: 2025-11-23 08:43:37.16192448 +0000 UTC m=+0.194423313 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public)
Nov 23 08:43:37 np0005532585.localdomain podman[96702]: 2025-11-23 08:43:37.189329632 +0000 UTC m=+0.229042836 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 23 08:43:37 np0005532585.localdomain podman[96703]: 2025-11-23 08:43:37.216106726 +0000 UTC m=+0.253288992 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:43:37 np0005532585.localdomain podman[96704]: 2025-11-23 08:43:37.219310634 +0000 UTC m=+0.251809527 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:43:37 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:43:37 np0005532585.localdomain podman[96703]: 2025-11-23 08:43:37.248340027 +0000 UTC m=+0.285522333 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:43:37 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:43:37 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:43:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:43:39 np0005532585.localdomain systemd[1]: tmp-crun.ebH6Rh.mount: Deactivated successfully.
Nov 23 08:43:39 np0005532585.localdomain podman[96802]: 2025-11-23 08:43:39.030348738 +0000 UTC m=+0.087543725 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, 
vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:43:39 np0005532585.localdomain podman[96802]: 2025-11-23 08:43:39.469248329 +0000 UTC m=+0.526443346 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.12 
17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z)
Nov 23 08:43:39 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:43:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:43:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:43:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:43:40 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:43:40 np0005532585.localdomain recover_tripleo_nova_virtqemud[96844]: 61756
Nov 23 08:43:40 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:43:40 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: tmp-crun.BV2T1r.mount: Deactivated successfully.
Nov 23 08:43:41 np0005532585.localdomain podman[96827]: 2025-11-23 08:43:41.040188066 +0000 UTC m=+0.083385067 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: tmp-crun.Rx4fON.mount: Deactivated successfully.
Nov 23 08:43:41 np0005532585.localdomain podman[96827]: 2025-11-23 08:43:41.086417648 +0000 UTC m=+0.129614639 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z)
Nov 23 08:43:41 np0005532585.localdomain podman[96825]: 2025-11-23 08:43:41.089342988 +0000 UTC m=+0.139812122 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 23 08:43:41 np0005532585.localdomain podman[96826]: 2025-11-23 08:43:41.124920022 +0000 UTC m=+0.171363113 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, distribution-scope=public)
Nov 23 08:43:41 np0005532585.localdomain podman[96827]: unhealthy
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:43:41 np0005532585.localdomain podman[96826]: 2025-11-23 08:43:41.191739138 +0000 UTC m=+0.238182249 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1)
Nov 23 08:43:41 np0005532585.localdomain podman[96826]: unhealthy
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:43:41 np0005532585.localdomain podman[96825]: 2025-11-23 08:43:41.260527094 +0000 UTC m=+0.310996208 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:43:41 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:43:54 np0005532585.localdomain sshd[96895]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:43:58 np0005532585.localdomain sshd[96895]: Invalid user debian from 211.142.244.197 port 48653
Nov 23 08:43:59 np0005532585.localdomain sshd[96895]: Connection closed by invalid user debian 211.142.244.197 port 48653 [preauth]
Nov 23 08:44:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:44:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:44:05 np0005532585.localdomain podman[96897]: 2025-11-23 08:44:05.048118945 +0000 UTC m=+0.099027847 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 08:44:05 np0005532585.localdomain podman[96897]: 2025-11-23 08:44:05.055376509 +0000 UTC m=+0.106285411 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3)
Nov 23 08:44:05 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:44:05 np0005532585.localdomain podman[96898]: 2025-11-23 08:44:05.138453224 +0000 UTC m=+0.189535002 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z)
Nov 23 08:44:05 np0005532585.localdomain podman[96898]: 2025-11-23 08:44:05.148326088 +0000 UTC m=+0.199407876 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:44:05 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:44:07 np0005532585.localdomain sudo[96933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:44:07 np0005532585.localdomain sudo[96933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:44:07 np0005532585.localdomain sudo[96933]: pam_unix(sudo:session): session closed for user root
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:44:07 np0005532585.localdomain sudo[96969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 08:44:07 np0005532585.localdomain sudo[96969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:44:07 np0005532585.localdomain podman[96949]: 2025-11-23 08:44:07.671717865 +0000 UTC m=+0.096007865 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 23 08:44:07 np0005532585.localdomain podman[96948]: 2025-11-23 08:44:07.726626654 +0000 UTC m=+0.154447452 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:44:07 np0005532585.localdomain podman[96949]: 2025-11-23 08:44:07.730303667 +0000 UTC m=+0.154593627 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team)
Nov 23 08:44:07 np0005532585.localdomain podman[96948]: 2025-11-23 08:44:07.739225612 +0000 UTC m=+0.167046390 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:44:07 np0005532585.localdomain podman[96950]: 2025-11-23 08:44:07.836615747 +0000 UTC m=+0.258232164 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:44:07 np0005532585.localdomain podman[96952]: 2025-11-23 08:44:07.884469799 +0000 UTC m=+0.302377022 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:44:07 np0005532585.localdomain podman[96950]: 2025-11-23 08:44:07.901515015 +0000 UTC m=+0.323131402 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:44:07 np0005532585.localdomain podman[96952]: 2025-11-23 08:44:07.915193825 +0000 UTC m=+0.333100998 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:44:07 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:44:08 np0005532585.localdomain sudo[96969]: pam_unix(sudo:session): session closed for user root
Nov 23 08:44:08 np0005532585.localdomain sudo[97076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:44:08 np0005532585.localdomain sudo[97076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:44:08 np0005532585.localdomain sudo[97076]: pam_unix(sudo:session): session closed for user root
Nov 23 08:44:08 np0005532585.localdomain sudo[97091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:44:08 np0005532585.localdomain sudo[97091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:44:08 np0005532585.localdomain systemd[1]: tmp-crun.Lm25ff.mount: Deactivated successfully.
Nov 23 08:44:08 np0005532585.localdomain sudo[97091]: pam_unix(sudo:session): session closed for user root
Nov 23 08:44:09 np0005532585.localdomain sudo[97137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:44:09 np0005532585.localdomain sudo[97137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:44:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:44:09 np0005532585.localdomain sudo[97137]: pam_unix(sudo:session): session closed for user root
Nov 23 08:44:09 np0005532585.localdomain podman[97152]: 2025-11-23 08:44:09.687876127 +0000 UTC m=+0.074923835 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 23 08:44:10 np0005532585.localdomain podman[97152]: 2025-11-23 08:44:10.096290832 +0000 UTC m=+0.483338510 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 23 08:44:10 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:44:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:44:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:44:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:44:12 np0005532585.localdomain systemd[1]: tmp-crun.hTgjrD.mount: Deactivated successfully.
Nov 23 08:44:12 np0005532585.localdomain podman[97176]: 2025-11-23 08:44:12.034319661 +0000 UTC m=+0.092328321 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, config_id=tripleo_step1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:44:12 np0005532585.localdomain podman[97178]: 2025-11-23 08:44:12.082370599 +0000 UTC m=+0.135590232 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z)
Nov 23 08:44:12 np0005532585.localdomain podman[97178]: 2025-11-23 08:44:12.096641948 +0000 UTC m=+0.149861601 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:34:05Z, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller)
Nov 23 08:44:12 np0005532585.localdomain podman[97178]: unhealthy
Nov 23 08:44:12 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:44:12 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:44:12 np0005532585.localdomain podman[97177]: 2025-11-23 08:44:12.180715194 +0000 UTC m=+0.236278329 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:44:12 np0005532585.localdomain podman[97176]: 2025-11-23 08:44:12.215245626 +0000 UTC m=+0.273254336 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:44:12 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:44:12 np0005532585.localdomain podman[97177]: 2025-11-23 08:44:12.272977723 +0000 UTC m=+0.328540868 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:44:12 np0005532585.localdomain podman[97177]: unhealthy
Nov 23 08:44:12 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:44:12 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:44:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:44:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:44:35 np0005532585.localdomain podman[97243]: 2025-11-23 08:44:35.992453089 +0000 UTC m=+0.054204188 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, container_name=collectd, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=)
Nov 23 08:44:36 np0005532585.localdomain podman[97243]: 2025-11-23 08:44:36.002149288 +0000 UTC m=+0.063900397 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 23 08:44:36 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:44:36 np0005532585.localdomain podman[97244]: 2025-11-23 08:44:36.058337656 +0000 UTC m=+0.116076232 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, name=rhosp17/openstack-iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:44:36 np0005532585.localdomain podman[97244]: 2025-11-23 08:44:36.097263183 +0000 UTC m=+0.155001789 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:44:36 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:44:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:44:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:44:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:44:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:44:38 np0005532585.localdomain systemd[1]: tmp-crun.MGs51J.mount: Deactivated successfully.
Nov 23 08:44:38 np0005532585.localdomain podman[97291]: 2025-11-23 08:44:38.026349007 +0000 UTC m=+0.072802201 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true)
Nov 23 08:44:38 np0005532585.localdomain podman[97291]: 2025-11-23 08:44:38.076377056 +0000 UTC m=+0.122830320 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:44:38 np0005532585.localdomain systemd[1]: tmp-crun.1A097M.mount: Deactivated successfully.
Nov 23 08:44:38 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:44:38 np0005532585.localdomain podman[97284]: 2025-11-23 08:44:38.088518269 +0000 UTC m=+0.138495561 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, 
com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 23 08:44:38 np0005532585.localdomain podman[97283]: 2025-11-23 08:44:38.150994462 +0000 UTC m=+0.204484702 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:44:38 np0005532585.localdomain podman[97283]: 2025-11-23 08:44:38.161127823 +0000 UTC m=+0.214618093 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64, 
tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com)
Nov 23 08:44:38 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:44:38 np0005532585.localdomain podman[97284]: 2025-11-23 08:44:38.172128582 +0000 UTC m=+0.222105874 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 23 08:44:38 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:44:38 np0005532585.localdomain podman[97285]: 2025-11-23 08:44:38.141606632 +0000 UTC m=+0.187396845 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12)
Nov 23 08:44:38 np0005532585.localdomain podman[97285]: 2025-11-23 08:44:38.225508084 +0000 UTC m=+0.271298357 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, vcs-type=git)
Nov 23 08:44:38 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:44:39 np0005532585.localdomain systemd[1]: tmp-crun.nCzh6z.mount: Deactivated successfully.
Nov 23 08:44:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:44:41 np0005532585.localdomain systemd[1]: tmp-crun.Zw6zDn.mount: Deactivated successfully.
Nov 23 08:44:41 np0005532585.localdomain podman[97377]: 2025-11-23 08:44:41.023379024 +0000 UTC m=+0.073659416 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:44:41 np0005532585.localdomain podman[97377]: 2025-11-23 08:44:41.389319082 +0000 UTC m=+0.439599504 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible)
Nov 23 08:44:41 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:44:42 np0005532585.localdomain sshd[97402]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:44:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:44:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:44:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:44:43 np0005532585.localdomain podman[97404]: 2025-11-23 08:44:43.032994525 +0000 UTC m=+0.091965161 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:44:43 np0005532585.localdomain systemd[1]: tmp-crun.pvjTiP.mount: Deactivated successfully.
Nov 23 08:44:43 np0005532585.localdomain podman[97405]: 2025-11-23 08:44:43.081844368 +0000 UTC m=+0.137579633 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044)
Nov 23 08:44:43 np0005532585.localdomain podman[97405]: 2025-11-23 08:44:43.126518272 +0000 UTC m=+0.182253557 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 23 08:44:43 np0005532585.localdomain podman[97405]: unhealthy
Nov 23 08:44:43 np0005532585.localdomain podman[97406]: 2025-11-23 08:44:43.138646775 +0000 UTC m=+0.191040507 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller)
Nov 23 08:44:43 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:44:43 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:44:43 np0005532585.localdomain sshd[97402]: Invalid user node from 193.32.162.146 port 52290
Nov 23 08:44:43 np0005532585.localdomain podman[97406]: 2025-11-23 08:44:43.160372614 +0000 UTC m=+0.212766396 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:44:43 np0005532585.localdomain podman[97406]: unhealthy
Nov 23 08:44:43 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:44:43 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:44:43 np0005532585.localdomain podman[97404]: 2025-11-23 08:44:43.219376109 +0000 UTC m=+0.278346705 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:44:43 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:44:43 np0005532585.localdomain sshd[97402]: Connection closed by invalid user node 193.32.162.146 port 52290 [preauth]
Nov 23 08:45:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:45:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:45:06 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:45:06 np0005532585.localdomain recover_tripleo_nova_virtqemud[97488]: 61756
Nov 23 08:45:06 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:45:06 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:45:07 np0005532585.localdomain podman[97475]: 2025-11-23 08:45:07.034643863 +0000 UTC m=+0.092943470 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:45:07 np0005532585.localdomain podman[97475]: 2025-11-23 08:45:07.073294012 +0000 UTC m=+0.131593629 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:45:07 np0005532585.localdomain systemd[1]: tmp-crun.DmpLfx.mount: Deactivated successfully.
Nov 23 08:45:07 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:45:07 np0005532585.localdomain podman[97476]: 2025-11-23 08:45:07.086947693 +0000 UTC m=+0.141370011 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:45:07 np0005532585.localdomain podman[97476]: 2025-11-23 08:45:07.094293238 +0000 UTC m=+0.148715536 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, version=17.1.12)
Nov 23 08:45:07 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:45:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:45:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:45:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:45:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:45:09 np0005532585.localdomain podman[97516]: 2025-11-23 08:45:09.018519733 +0000 UTC m=+0.075890786 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:45:09 np0005532585.localdomain podman[97516]: 2025-11-23 08:45:09.044198312 +0000 UTC m=+0.101569365 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z)
Nov 23 08:45:09 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:45:09 np0005532585.localdomain podman[97517]: 2025-11-23 08:45:09.079598072 +0000 UTC m=+0.133756616 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:45:09 np0005532585.localdomain podman[97515]: 2025-11-23 08:45:09.128114534 +0000 UTC m=+0.187167258 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:45:09 np0005532585.localdomain podman[97517]: 2025-11-23 08:45:09.133279893 +0000 UTC m=+0.187438457 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true)
Nov 23 08:45:09 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:45:09 np0005532585.localdomain podman[97514]: 2025-11-23 08:45:09.180810805 +0000 UTC m=+0.236742693 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., 
com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:45:09 np0005532585.localdomain podman[97515]: 2025-11-23 08:45:09.204609988 +0000 UTC m=+0.263662652 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Nov 23 08:45:09 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:45:09 np0005532585.localdomain podman[97514]: 2025-11-23 08:45:09.217336039 +0000 UTC m=+0.273268007 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, 
url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc.)
Nov 23 08:45:09 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:45:09 np0005532585.localdomain sudo[97614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:45:09 np0005532585.localdomain sudo[97614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:45:09 np0005532585.localdomain sudo[97614]: pam_unix(sudo:session): session closed for user root
Nov 23 08:45:09 np0005532585.localdomain sudo[97629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:45:09 np0005532585.localdomain sudo[97629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:45:10 np0005532585.localdomain sudo[97629]: pam_unix(sudo:session): session closed for user root
Nov 23 08:45:11 np0005532585.localdomain sudo[97677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:45:11 np0005532585.localdomain sudo[97677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:45:11 np0005532585.localdomain sudo[97677]: pam_unix(sudo:session): session closed for user root
Nov 23 08:45:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:45:12 np0005532585.localdomain podman[97692]: 2025-11-23 08:45:12.011266308 +0000 UTC m=+0.070160950 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:45:12 np0005532585.localdomain podman[97692]: 2025-11-23 08:45:12.397313183 +0000 UTC m=+0.456207785 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:45:12 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:45:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:45:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:45:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:45:14 np0005532585.localdomain systemd[1]: tmp-crun.ysB0Cv.mount: Deactivated successfully.
Nov 23 08:45:14 np0005532585.localdomain podman[97718]: 2025-11-23 08:45:14.083317849 +0000 UTC m=+0.140098721 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:45:14 np0005532585.localdomain podman[97718]: 2025-11-23 08:45:14.095541826 +0000 UTC m=+0.152322678 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:45:14 np0005532585.localdomain podman[97718]: unhealthy
Nov 23 08:45:14 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:45:14 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:45:14 np0005532585.localdomain podman[97719]: 2025-11-23 08:45:14.013084679 +0000 UTC m=+0.064493005 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, 
batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=)
Nov 23 08:45:14 np0005532585.localdomain podman[97719]: 2025-11-23 08:45:14.146444152 +0000 UTC m=+0.197852598 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible)
Nov 23 08:45:14 np0005532585.localdomain podman[97719]: unhealthy
Nov 23 08:45:14 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:45:14 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:45:14 np0005532585.localdomain podman[97717]: 2025-11-23 08:45:14.048971943 +0000 UTC m=+0.104894119 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=metrics_qdr)
Nov 23 08:45:14 np0005532585.localdomain podman[97717]: 2025-11-23 08:45:14.260544181 +0000 UTC m=+0.316466367 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:45:14 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:45:15 np0005532585.localdomain systemd[1]: tmp-crun.MkHjYp.mount: Deactivated successfully.
Nov 23 08:45:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:45:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:45:38 np0005532585.localdomain podman[97787]: 2025-11-23 08:45:38.020646018 +0000 UTC m=+0.071139059 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, architecture=x86_64, container_name=iscsid, com.redhat.component=openstack-iscsid-container)
Nov 23 08:45:38 np0005532585.localdomain podman[97787]: 2025-11-23 08:45:38.032324417 +0000 UTC m=+0.082817488 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1)
Nov 23 08:45:38 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:45:38 np0005532585.localdomain podman[97786]: 2025-11-23 08:45:38.075665221 +0000 UTC m=+0.126810553 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, container_name=collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Nov 23 08:45:38 np0005532585.localdomain podman[97786]: 2025-11-23 08:45:38.086201875 +0000 UTC m=+0.137347127 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:45:38 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:45:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:45:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:45:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:45:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:45:40 np0005532585.localdomain systemd[1]: tmp-crun.fneATk.mount: Deactivated successfully.
Nov 23 08:45:40 np0005532585.localdomain podman[97825]: 2025-11-23 08:45:40.03708275 +0000 UTC m=+0.090992511 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=logrotate_crond, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Nov 23 08:45:40 np0005532585.localdomain podman[97825]: 2025-11-23 08:45:40.078251866 +0000 UTC m=+0.132161627 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true)
Nov 23 08:45:40 np0005532585.localdomain systemd[1]: tmp-crun.JetZ4J.mount: Deactivated successfully.
Nov 23 08:45:40 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:45:40 np0005532585.localdomain podman[97826]: 2025-11-23 08:45:40.133451304 +0000 UTC m=+0.184441125 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:45:40 np0005532585.localdomain podman[97826]: 2025-11-23 08:45:40.158228647 +0000 UTC m=+0.209218428 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:45:40 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:45:40 np0005532585.localdomain podman[97827]: 2025-11-23 08:45:40.084076236 +0000 UTC m=+0.133209999 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 08:45:40 np0005532585.localdomain podman[97827]: 2025-11-23 08:45:40.221334988 +0000 UTC m=+0.270468811 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 23 08:45:40 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:45:40 np0005532585.localdomain podman[97828]: 2025-11-23 08:45:40.236539116 +0000 UTC m=+0.282216653 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:45:40 np0005532585.localdomain podman[97828]: 2025-11-23 08:45:40.26462626 +0000 UTC m=+0.310303787 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:45:40 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:45:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:45:43 np0005532585.localdomain podman[97916]: 2025-11-23 08:45:43.010300705 +0000 UTC m=+0.070254472 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:45:43 np0005532585.localdomain podman[97916]: 2025-11-23 08:45:43.393297927 +0000 UTC m=+0.453251694 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 23 08:45:43 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:45:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:45:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:45:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: tmp-crun.n3mz8a.mount: Deactivated successfully.
Nov 23 08:45:45 np0005532585.localdomain podman[97941]: 2025-11-23 08:45:45.024192328 +0000 UTC m=+0.080610311 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:45:45 np0005532585.localdomain podman[97942]: 2025-11-23 08:45:45.079239191 +0000 UTC m=+0.134189419 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 23 08:45:45 np0005532585.localdomain podman[97942]: 2025-11-23 08:45:45.090555239 +0000 UTC m=+0.145505497 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc.)
Nov 23 08:45:45 np0005532585.localdomain podman[97942]: unhealthy
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: tmp-crun.dScank.mount: Deactivated successfully.
Nov 23 08:45:45 np0005532585.localdomain podman[97943]: 2025-11-23 08:45:45.141123615 +0000 UTC m=+0.192945207 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red 
Hat, Inc., release=1761123044, vcs-type=git, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:45:45 np0005532585.localdomain podman[97943]: 2025-11-23 08:45:45.148304675 +0000 UTC m=+0.200126267 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vcs-type=git)
Nov 23 08:45:45 np0005532585.localdomain podman[97943]: unhealthy
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:45:45 np0005532585.localdomain podman[97941]: 2025-11-23 08:45:45.239245463 +0000 UTC m=+0.295663446 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:45:45 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:46:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:46:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:46:09 np0005532585.localdomain systemd[1]: tmp-crun.uibBqM.mount: Deactivated successfully.
Nov 23 08:46:09 np0005532585.localdomain podman[98011]: 2025-11-23 08:46:09.019129779 +0000 UTC m=+0.081550980 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=)
Nov 23 08:46:09 np0005532585.localdomain systemd[1]: tmp-crun.gsRQgT.mount: Deactivated successfully.
Nov 23 08:46:09 np0005532585.localdomain podman[98011]: 2025-11-23 08:46:09.035174493 +0000 UTC m=+0.097595684 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public)
Nov 23 08:46:09 np0005532585.localdomain podman[98012]: 2025-11-23 08:46:09.0422195 +0000 UTC m=+0.097889562 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, tcib_managed=true)
Nov 23 08:46:09 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:46:09 np0005532585.localdomain podman[98012]: 2025-11-23 08:46:09.053169846 +0000 UTC m=+0.108839908 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:46:09 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:46:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:46:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:46:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:46:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:46:11 np0005532585.localdomain systemd[1]: tmp-crun.zvAFiR.mount: Deactivated successfully.
Nov 23 08:46:11 np0005532585.localdomain podman[98052]: 2025-11-23 08:46:11.004855936 +0000 UTC m=+0.063301378 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:46:11 np0005532585.localdomain systemd[1]: tmp-crun.PuCOO3.mount: Deactivated successfully.
Nov 23 08:46:11 np0005532585.localdomain podman[98053]: 2025-11-23 08:46:11.047129426 +0000 UTC m=+0.098634495 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1)
Nov 23 08:46:11 np0005532585.localdomain podman[98052]: 2025-11-23 08:46:11.06121222 +0000 UTC m=+0.119657702 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, version=17.1.12, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-type=git)
Nov 23 08:46:11 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:46:11 np0005532585.localdomain podman[98064]: 2025-11-23 08:46:11.10151186 +0000 UTC m=+0.147926702 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:46:11 np0005532585.localdomain podman[98051]: 2025-11-23 08:46:11.065798261 +0000 UTC m=+0.123991765 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack 
TripleO Team, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:46:11 np0005532585.localdomain podman[98064]: 2025-11-23 08:46:11.124198778 +0000 UTC m=+0.170613620 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:46:11 np0005532585.localdomain podman[98053]: 2025-11-23 08:46:11.130176621 +0000 UTC m=+0.181681700 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12)
Nov 23 08:46:11 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:46:11 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:46:11 np0005532585.localdomain podman[98051]: 2025-11-23 08:46:11.145357648 +0000 UTC m=+0.203551202 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public)
Nov 23 08:46:11 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:46:11 np0005532585.localdomain sudo[98146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:46:11 np0005532585.localdomain sudo[98146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:46:11 np0005532585.localdomain sudo[98146]: pam_unix(sudo:session): session closed for user root
Nov 23 08:46:11 np0005532585.localdomain sudo[98161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 08:46:11 np0005532585.localdomain sudo[98161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:46:12 np0005532585.localdomain podman[98247]: 2025-11-23 08:46:12.215526459 +0000 UTC m=+0.085232992 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, distribution-scope=public, architecture=x86_64)
Nov 23 08:46:12 np0005532585.localdomain podman[98247]: 2025-11-23 08:46:12.354358441 +0000 UTC m=+0.224064944 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 08:46:12 np0005532585.localdomain sudo[98161]: pam_unix(sudo:session): session closed for user root
Nov 23 08:46:12 np0005532585.localdomain sudo[98315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:46:12 np0005532585.localdomain sudo[98315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:46:12 np0005532585.localdomain sudo[98315]: pam_unix(sudo:session): session closed for user root
Nov 23 08:46:12 np0005532585.localdomain sudo[98330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:46:12 np0005532585.localdomain sudo[98330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:46:13 np0005532585.localdomain sudo[98330]: pam_unix(sudo:session): session closed for user root
Nov 23 08:46:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:46:14 np0005532585.localdomain podman[98377]: 2025-11-23 08:46:14.037132838 +0000 UTC m=+0.090592839 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1)
Nov 23 08:46:14 np0005532585.localdomain sudo[98400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:46:14 np0005532585.localdomain sudo[98400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:46:14 np0005532585.localdomain sudo[98400]: pam_unix(sudo:session): session closed for user root
Nov 23 08:46:14 np0005532585.localdomain podman[98377]: 2025-11-23 08:46:14.434622975 +0000 UTC m=+0.488082936 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Nov 23 08:46:14 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:46:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:46:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:46:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:46:16 np0005532585.localdomain systemd[1]: tmp-crun.LdRzsv.mount: Deactivated successfully.
Nov 23 08:46:16 np0005532585.localdomain podman[98418]: 2025-11-23 08:46:16.081308642 +0000 UTC m=+0.130725052 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:46:16 np0005532585.localdomain podman[98416]: 2025-11-23 08:46:16.049187145 +0000 UTC m=+0.104101464 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, tcib_managed=true, url=https://www.redhat.com, 
managed_by=tripleo_ansible, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:46:16 np0005532585.localdomain podman[98418]: 2025-11-23 08:46:16.118549519 +0000 UTC m=+0.167965899 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:46:16 np0005532585.localdomain podman[98418]: unhealthy
Nov 23 08:46:16 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:46:16 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:46:16 np0005532585.localdomain podman[98417]: 2025-11-23 08:46:16.185301721 +0000 UTC m=+0.238292311 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 23 08:46:16 np0005532585.localdomain podman[98417]: 2025-11-23 08:46:16.201331675 +0000 UTC m=+0.254322265 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 23 08:46:16 np0005532585.localdomain podman[98417]: unhealthy
Nov 23 08:46:16 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:46:16 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:46:16 np0005532585.localdomain podman[98416]: 2025-11-23 08:46:16.266215941 +0000 UTC m=+0.321130310 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:46:16 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:46:36 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:46:36 np0005532585.localdomain recover_tripleo_nova_virtqemud[98480]: 61756
Nov 23 08:46:36 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:46:36 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:46:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:46:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:46:40 np0005532585.localdomain podman[98482]: 2025-11-23 08:46:40.03999585 +0000 UTC m=+0.091509766 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack 
TripleO Team, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:46:40 np0005532585.localdomain podman[98481]: 2025-11-23 08:46:40.092816665 +0000 UTC m=+0.144407114 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:46:40 np0005532585.localdomain podman[98481]: 2025-11-23 08:46:40.104383731 +0000 UTC m=+0.155974220 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z)
Nov 23 08:46:40 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:46:40 np0005532585.localdomain podman[98482]: 2025-11-23 08:46:40.120275829 +0000 UTC m=+0.171789795 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 23 08:46:40 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:46:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:46:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:46:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:46:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:46:42 np0005532585.localdomain podman[98519]: 2025-11-23 08:46:42.030643607 +0000 UTC m=+0.087230564 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:46:42 np0005532585.localdomain podman[98519]: 2025-11-23 08:46:42.042400929 +0000 UTC m=+0.098987946 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., tcib_managed=true)
Nov 23 08:46:42 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:46:42 np0005532585.localdomain podman[98527]: 2025-11-23 08:46:42.089796827 +0000 UTC m=+0.134176069 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:46:42 np0005532585.localdomain podman[98527]: 2025-11-23 08:46:42.147517153 +0000 UTC m=+0.191896405 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=nova_compute, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 23 08:46:42 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:46:42 np0005532585.localdomain podman[98520]: 2025-11-23 08:46:42.127734384 +0000 UTC m=+0.180264276 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:46:42 np0005532585.localdomain podman[98521]: 2025-11-23 08:46:42.153465926 +0000 UTC m=+0.201687316 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, 
url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:46:42 np0005532585.localdomain podman[98521]: 2025-11-23 08:46:42.236523261 +0000 UTC m=+0.284744631 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:46:42 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:46:42 np0005532585.localdomain podman[98520]: 2025-11-23 08:46:42.257265339 +0000 UTC m=+0.309795261 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:46:42 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:46:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:46:45 np0005532585.localdomain podman[98617]: 2025-11-23 08:46:45.018963716 +0000 UTC m=+0.079013061 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4)
Nov 23 08:46:45 np0005532585.localdomain podman[98617]: 2025-11-23 08:46:45.417189227 +0000 UTC m=+0.477238492 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:46:45 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:46:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:46:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:46:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:46:47 np0005532585.localdomain podman[98641]: 2025-11-23 08:46:47.023407649 +0000 UTC m=+0.078648711 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, vcs-type=git)
Nov 23 08:46:47 np0005532585.localdomain podman[98641]: 2025-11-23 08:46:47.03516306 +0000 UTC m=+0.090404202 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:46:47 np0005532585.localdomain podman[98641]: unhealthy
Nov 23 08:46:47 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:46:47 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:46:47 np0005532585.localdomain systemd[1]: tmp-crun.tiC81l.mount: Deactivated successfully.
Nov 23 08:46:47 np0005532585.localdomain podman[98640]: 2025-11-23 08:46:47.096997683 +0000 UTC m=+0.153848734 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:46:47 np0005532585.localdomain podman[98642]: 2025-11-23 08:46:47.099986594 +0000 UTC m=+0.150590393 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:34:05Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12)
Nov 23 08:46:47 np0005532585.localdomain podman[98642]: 2025-11-23 08:46:47.183504634 +0000 UTC m=+0.234108513 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com)
Nov 23 08:46:47 np0005532585.localdomain podman[98642]: unhealthy
Nov 23 08:46:47 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:46:47 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:46:47 np0005532585.localdomain podman[98640]: 2025-11-23 08:46:47.29001444 +0000 UTC m=+0.346865541 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:46:47 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:47:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:47:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:47:11 np0005532585.localdomain podman[98711]: 2025-11-23 08:47:11.031881847 +0000 UTC m=+0.087111401 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd)
Nov 23 08:47:11 np0005532585.localdomain podman[98711]: 2025-11-23 08:47:11.041540984 +0000 UTC m=+0.096770528 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 23 08:47:11 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:47:11 np0005532585.localdomain podman[98712]: 2025-11-23 08:47:11.12919215 +0000 UTC m=+0.183082573 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:47:11 np0005532585.localdomain podman[98712]: 2025-11-23 08:47:11.141368235 +0000 UTC m=+0.195258748 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:47:11 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:47:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:47:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:47:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:47:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:47:13 np0005532585.localdomain systemd[1]: tmp-crun.TdSs2B.mount: Deactivated successfully.
Nov 23 08:47:13 np0005532585.localdomain podman[98751]: 2025-11-23 08:47:13.049408471 +0000 UTC m=+0.091488796 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64)
Nov 23 08:47:13 np0005532585.localdomain podman[98755]: 2025-11-23 08:47:13.104372692 +0000 UTC m=+0.138835802 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Nov 23 08:47:13 np0005532585.localdomain podman[98751]: 2025-11-23 08:47:13.132307311 +0000 UTC m=+0.174387656 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:47:13 np0005532585.localdomain podman[98755]: 2025-11-23 08:47:13.142858656 +0000 UTC m=+0.177321786 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-type=git)
Nov 23 08:47:13 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:47:13 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:47:13 np0005532585.localdomain podman[98750]: 2025-11-23 08:47:13.212023954 +0000 UTC m=+0.257256865 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 23 08:47:13 np0005532585.localdomain podman[98750]: 2025-11-23 08:47:13.2492915 +0000 UTC m=+0.294524421 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, 
Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=)
Nov 23 08:47:13 np0005532585.localdomain podman[98752]: 2025-11-23 08:47:13.261212866 +0000 UTC m=+0.299280207 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:47:13 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:47:13 np0005532585.localdomain podman[98752]: 2025-11-23 08:47:13.316064314 +0000 UTC m=+0.354131655 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:47:13 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:47:14 np0005532585.localdomain sudo[98850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:47:14 np0005532585.localdomain sudo[98850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:47:14 np0005532585.localdomain sudo[98850]: pam_unix(sudo:session): session closed for user root
Nov 23 08:47:14 np0005532585.localdomain sudo[98865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:47:14 np0005532585.localdomain sudo[98865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:47:15 np0005532585.localdomain sudo[98865]: pam_unix(sudo:session): session closed for user root
Nov 23 08:47:15 np0005532585.localdomain sudo[98911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:47:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:47:15 np0005532585.localdomain sudo[98911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:47:15 np0005532585.localdomain sudo[98911]: pam_unix(sudo:session): session closed for user root
Nov 23 08:47:15 np0005532585.localdomain podman[98925]: 2025-11-23 08:47:15.886680224 +0000 UTC m=+0.082728147 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044)
Nov 23 08:47:16 np0005532585.localdomain podman[98925]: 2025-11-23 08:47:16.238740193 +0000 UTC m=+0.434788136 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 23 08:47:16 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:47:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:47:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:47:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:47:18 np0005532585.localdomain systemd[1]: tmp-crun.xohBtN.mount: Deactivated successfully.
Nov 23 08:47:18 np0005532585.localdomain podman[98947]: 2025-11-23 08:47:18.055060428 +0000 UTC m=+0.101365779 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, 
maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:47:18 np0005532585.localdomain podman[98949]: 2025-11-23 08:47:18.091947673 +0000 UTC m=+0.131275140 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:47:18 np0005532585.localdomain podman[98949]: 2025-11-23 08:47:18.112203336 +0000 UTC m=+0.151530763 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64)
Nov 23 08:47:18 np0005532585.localdomain podman[98949]: unhealthy
Nov 23 08:47:18 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:47:18 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:47:18 np0005532585.localdomain podman[98948]: 2025-11-23 08:47:18.203912047 +0000 UTC m=+0.247473243 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., release=1761123044, version=17.1.12)
Nov 23 08:47:18 np0005532585.localdomain podman[98948]: 2025-11-23 08:47:18.217801925 +0000 UTC m=+0.261363081 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Nov 23 08:47:18 np0005532585.localdomain podman[98948]: unhealthy
Nov 23 08:47:18 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:47:18 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:47:18 np0005532585.localdomain podman[98947]: 2025-11-23 08:47:18.283523046 +0000 UTC m=+0.329828387 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:47:18 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:47:19 np0005532585.localdomain systemd[1]: tmp-crun.lGBhFK.mount: Deactivated successfully.
Nov 23 08:47:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:47:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:47:42 np0005532585.localdomain podman[99012]: 2025-11-23 08:47:42.029159059 +0000 UTC m=+0.077955009 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 08:47:42 np0005532585.localdomain podman[99012]: 2025-11-23 08:47:42.04316979 +0000 UTC m=+0.091965670 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1, 
com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12)
Nov 23 08:47:42 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:47:42 np0005532585.localdomain systemd[1]: tmp-crun.zS9qzU.mount: Deactivated successfully.
Nov 23 08:47:42 np0005532585.localdomain podman[99011]: 2025-11-23 08:47:42.137078708 +0000 UTC m=+0.186397784 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:47:42 np0005532585.localdomain podman[99011]: 2025-11-23 08:47:42.169826356 +0000 UTC m=+0.219145432 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd)
Nov 23 08:47:42 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:47:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:47:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:47:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:47:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:47:44 np0005532585.localdomain podman[99054]: 2025-11-23 08:47:44.04332517 +0000 UTC m=+0.092583089 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:47:44 np0005532585.localdomain podman[99054]: 2025-11-23 08:47:44.075566971 +0000 UTC m=+0.124824860 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:47:44 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:47:44 np0005532585.localdomain podman[99055]: 2025-11-23 08:47:44.093181143 +0000 UTC m=+0.139014587 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Nov 23 08:47:44 np0005532585.localdomain podman[99055]: 2025-11-23 08:47:44.131178192 +0000 UTC m=+0.177011646 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:47:44 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:47:44 np0005532585.localdomain podman[99052]: 2025-11-23 08:47:44.134886056 +0000 UTC m=+0.190274235 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:47:44 np0005532585.localdomain podman[99052]: 2025-11-23 08:47:44.219474489 +0000 UTC m=+0.274862678 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:47:44 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:47:44 np0005532585.localdomain podman[99053]: 2025-11-23 08:47:44.187326319 +0000 UTC m=+0.239357163 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, 
container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:47:44 np0005532585.localdomain podman[99053]: 2025-11-23 08:47:44.267163436 +0000 UTC m=+0.319194340 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4)
Nov 23 08:47:44 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:47:45 np0005532585.localdomain systemd[1]: tmp-crun.rUNSi3.mount: Deactivated successfully.
Nov 23 08:47:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:47:47 np0005532585.localdomain podman[99155]: 2025-11-23 08:47:47.002539133 +0000 UTC m=+0.055393305 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:47:47 np0005532585.localdomain podman[99155]: 2025-11-23 08:47:47.437245436 +0000 UTC m=+0.490099598 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:47:47 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:47:48 np0005532585.localdomain sshd[99176]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:47:48 np0005532585.localdomain sshd[99176]: Invalid user solana from 193.32.162.146 port 33226
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: tmp-crun.sRurOu.mount: Deactivated successfully.
Nov 23 08:47:48 np0005532585.localdomain sshd[99176]: Connection closed by invalid user solana 193.32.162.146 port 33226 [preauth]
Nov 23 08:47:48 np0005532585.localdomain podman[99178]: 2025-11-23 08:47:48.831070314 +0000 UTC m=+0.100714830 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: tmp-crun.KjCCNX.mount: Deactivated successfully.
Nov 23 08:47:48 np0005532585.localdomain podman[99179]: 2025-11-23 08:47:48.92977948 +0000 UTC m=+0.192279226 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team)
Nov 23 08:47:48 np0005532585.localdomain podman[99180]: 2025-11-23 08:47:48.886084595 +0000 UTC m=+0.148905251 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 23 08:47:48 np0005532585.localdomain podman[99179]: 2025-11-23 08:47:48.94309046 +0000 UTC m=+0.205590226 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, tcib_managed=true)
Nov 23 08:47:48 np0005532585.localdomain podman[99179]: unhealthy
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:47:48 np0005532585.localdomain podman[99180]: 2025-11-23 08:47:48.969539653 +0000 UTC m=+0.232360309 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, 
vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:47:48 np0005532585.localdomain podman[99180]: unhealthy
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:47:48 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:47:49 np0005532585.localdomain podman[99178]: 2025-11-23 08:47:49.01202503 +0000 UTC m=+0.281669546 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, distribution-scope=public)
Nov 23 08:47:49 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:48:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:48:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:48:13 np0005532585.localdomain podman[99245]: 2025-11-23 08:48:13.014832642 +0000 UTC m=+0.064857246 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 23 08:48:13 np0005532585.localdomain podman[99245]: 2025-11-23 08:48:13.021802207 +0000 UTC m=+0.071826761 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container)
Nov 23 08:48:13 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:48:13 np0005532585.localdomain systemd[1]: tmp-crun.UESNvA.mount: Deactivated successfully.
Nov 23 08:48:13 np0005532585.localdomain podman[99244]: 2025-11-23 08:48:13.085359672 +0000 UTC m=+0.136064917 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, version=17.1.12, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z)
Nov 23 08:48:13 np0005532585.localdomain podman[99244]: 2025-11-23 08:48:13.093646217 +0000 UTC m=+0.144351502 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:51:28Z, version=17.1.12, container_name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:48:13 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:48:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:48:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:48:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:48:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:48:15 np0005532585.localdomain systemd[1]: tmp-crun.JqHYyx.mount: Deactivated successfully.
Nov 23 08:48:15 np0005532585.localdomain podman[99284]: 2025-11-23 08:48:15.04156183 +0000 UTC m=+0.098097018 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat 
OpenStack Platform 17.1 cron, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true)
Nov 23 08:48:15 np0005532585.localdomain podman[99284]: 2025-11-23 08:48:15.047482642 +0000 UTC m=+0.104017860 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 23 08:48:15 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:48:15 np0005532585.localdomain podman[99286]: 2025-11-23 08:48:15.14167637 +0000 UTC m=+0.191426720 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 08:48:15 np0005532585.localdomain podman[99286]: 2025-11-23 08:48:15.172274602 +0000 UTC m=+0.222024942 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 23 08:48:15 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:48:15 np0005532585.localdomain podman[99292]: 2025-11-23 08:48:15.188964895 +0000 UTC m=+0.235260418 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public)
Nov 23 08:48:15 np0005532585.localdomain podman[99292]: 2025-11-23 08:48:15.219236836 +0000 UTC m=+0.265532359 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 23 08:48:15 np0005532585.localdomain podman[99285]: 2025-11-23 08:48:15.241917773 +0000 UTC m=+0.296584644 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute)
Nov 23 08:48:15 np0005532585.localdomain podman[99285]: 2025-11-23 08:48:15.276301802 +0000 UTC m=+0.330968653 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:48:15 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:48:15 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:48:15 np0005532585.localdomain sudo[99386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:48:16 np0005532585.localdomain sudo[99386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:48:16 np0005532585.localdomain sudo[99386]: pam_unix(sudo:session): session closed for user root
Nov 23 08:48:16 np0005532585.localdomain sudo[99401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:48:16 np0005532585.localdomain sudo[99401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:48:16 np0005532585.localdomain sudo[99401]: pam_unix(sudo:session): session closed for user root
Nov 23 08:48:16 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:48:16 np0005532585.localdomain recover_tripleo_nova_virtqemud[99448]: 61756
Nov 23 08:48:16 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:48:16 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:48:17 np0005532585.localdomain sudo[99449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:48:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:48:18 np0005532585.localdomain sudo[99449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:48:18 np0005532585.localdomain sudo[99449]: pam_unix(sudo:session): session closed for user root
Nov 23 08:48:18 np0005532585.localdomain podman[99463]: 2025-11-23 08:48:18.131706092 +0000 UTC m=+0.086854543 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:48:18 np0005532585.localdomain podman[99463]: 2025-11-23 08:48:18.502380124 +0000 UTC m=+0.457528625 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true)
Nov 23 08:48:18 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:48:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:48:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:48:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:48:20 np0005532585.localdomain podman[99491]: 2025-11-23 08:48:20.01444279 +0000 UTC m=+0.068809537 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:48:20 np0005532585.localdomain podman[99491]: 2025-11-23 08:48:20.031307276 +0000 UTC m=+0.085673923 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:48:20 np0005532585.localdomain podman[99491]: unhealthy
Nov 23 08:48:20 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:48:20 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:48:20 np0005532585.localdomain podman[99490]: 2025-11-23 08:48:20.079234033 +0000 UTC m=+0.134988342 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:48:20 np0005532585.localdomain podman[99489]: 2025-11-23 08:48:20.123160477 +0000 UTC m=+0.181126594 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true, 
maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr)
Nov 23 08:48:20 np0005532585.localdomain podman[99490]: 2025-11-23 08:48:20.143383926 +0000 UTC m=+0.199138265 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:48:20 np0005532585.localdomain podman[99490]: unhealthy
Nov 23 08:48:20 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:48:20 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:48:20 np0005532585.localdomain podman[99489]: 2025-11-23 08:48:20.296961336 +0000 UTC m=+0.354927473 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:48:20 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:48:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:48:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:48:44 np0005532585.localdomain systemd[1]: tmp-crun.073sXJ.mount: Deactivated successfully.
Nov 23 08:48:44 np0005532585.localdomain podman[99561]: 2025-11-23 08:48:44.035531573 +0000 UTC m=+0.088074357 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 23 08:48:44 np0005532585.localdomain podman[99561]: 2025-11-23 08:48:44.043139855 +0000 UTC m=+0.095682669 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=)
Nov 23 08:48:44 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:48:44 np0005532585.localdomain systemd[1]: tmp-crun.C4fJ7z.mount: Deactivated successfully.
Nov 23 08:48:44 np0005532585.localdomain podman[99560]: 2025-11-23 08:48:44.13413265 +0000 UTC m=+0.187540211 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, 
config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:48:44 np0005532585.localdomain podman[99560]: 2025-11-23 08:48:44.171298617 +0000 UTC m=+0.224706138 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.12)
Nov 23 08:48:44 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:48:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:48:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:48:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:48:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:48:46 np0005532585.localdomain podman[99603]: 2025-11-23 08:48:46.047029693 +0000 UTC m=+0.087513439 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z)
Nov 23 08:48:46 np0005532585.localdomain podman[99603]: 2025-11-23 08:48:46.09432574 +0000 UTC m=+0.134809476 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:48:46 np0005532585.localdomain podman[99602]: 2025-11-23 08:48:46.094264158 +0000 UTC m=+0.140473850 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:48:46 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:48:46 np0005532585.localdomain podman[99609]: 2025-11-23 08:48:46.152452068 +0000 UTC m=+0.191489761 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Nov 23 08:48:46 np0005532585.localdomain podman[99602]: 2025-11-23 08:48:46.179376043 +0000 UTC m=+0.225585755 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:48:46 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:48:46 np0005532585.localdomain podman[99609]: 2025-11-23 08:48:46.204051328 +0000 UTC m=+0.243089061 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step5, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:48:46 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:48:46 np0005532585.localdomain podman[99601]: 2025-11-23 08:48:46.190124732 +0000 UTC m=+0.239120569 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:48:46 np0005532585.localdomain podman[99601]: 2025-11-23 08:48:46.269595754 +0000 UTC m=+0.318591551 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-cron-container, tcib_managed=true, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 23 08:48:46 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:48:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:48:49 np0005532585.localdomain podman[99700]: 2025-11-23 08:48:49.026175257 +0000 UTC m=+0.080366720 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:48:49 np0005532585.localdomain podman[99700]: 2025-11-23 08:48:49.396340516 +0000 UTC m=+0.450531969 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:48:49 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:48:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:48:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:48:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:48:51 np0005532585.localdomain systemd[1]: tmp-crun.9ntNS9.mount: Deactivated successfully.
Nov 23 08:48:51 np0005532585.localdomain podman[99723]: 2025-11-23 08:48:51.070610256 +0000 UTC m=+0.129769633 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=)
Nov 23 08:48:51 np0005532585.localdomain podman[99724]: 2025-11-23 08:48:51.019222833 +0000 UTC m=+0.078409030 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:48:51 np0005532585.localdomain podman[99725]: 2025-11-23 08:48:51.040103102 +0000 UTC m=+0.093442581 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, architecture=x86_64)
Nov 23 08:48:51 np0005532585.localdomain podman[99724]: 2025-11-23 08:48:51.104193233 +0000 UTC m=+0.163379410 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:48:51 np0005532585.localdomain podman[99724]: unhealthy
Nov 23 08:48:51 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:48:51 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:48:51 np0005532585.localdomain podman[99725]: 2025-11-23 08:48:51.173963319 +0000 UTC m=+0.227302778 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:48:51 np0005532585.localdomain podman[99725]: unhealthy
Nov 23 08:48:51 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:48:51 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:48:51 np0005532585.localdomain podman[99723]: 2025-11-23 08:48:51.262174088 +0000 UTC m=+0.321333395 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4)
Nov 23 08:48:51 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:48:52 np0005532585.localdomain systemd[1]: tmp-crun.36qpjl.mount: Deactivated successfully.
Nov 23 08:49:04 np0005532585.localdomain sshd[36165]: Received disconnect from 192.168.122.100 port 55476:11: disconnected by user
Nov 23 08:49:04 np0005532585.localdomain sshd[36165]: Disconnected from user tripleo-admin 192.168.122.100 port 55476
Nov 23 08:49:04 np0005532585.localdomain sshd[36144]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 23 08:49:04 np0005532585.localdomain systemd-logind[761]: Session 28 logged out. Waiting for processes to exit.
Nov 23 08:49:04 np0005532585.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Nov 23 08:49:04 np0005532585.localdomain systemd[1]: session-28.scope: Consumed 6min 57.879s CPU time.
Nov 23 08:49:04 np0005532585.localdomain systemd-logind[761]: Removed session 28.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Activating special unit Exit the Session...
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Removed slice User Background Tasks Slice.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped target Main User Target.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped target Basic System.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped target Paths.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped target Sockets.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped target Timers.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Closed D-Bus User Message Bus Socket.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Stopped Create User's Volatile Files and Directories.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Removed slice User Application Slice.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Reached target Shutdown.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Finished Exit the Session.
Nov 23 08:49:14 np0005532585.localdomain systemd[36148]: Reached target Exit the Session.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: user@1003.service: Consumed 4.964s CPU time, read 0B from disk, written 7.0K to disk.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: user-1003.slice: Consumed 7min 2.873s CPU time.
Nov 23 08:49:14 np0005532585.localdomain podman[99792]: 2025-11-23 08:49:14.790427342 +0000 UTC m=+0.090937244 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, container_name=iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:49:14 np0005532585.localdomain podman[99792]: 2025-11-23 08:49:14.804655667 +0000 UTC m=+0.105165589 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: tmp-crun.HS480i.mount: Deactivated successfully.
Nov 23 08:49:14 np0005532585.localdomain podman[99791]: 2025-11-23 08:49:14.863025894 +0000 UTC m=+0.166647081 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Nov 23 08:49:14 np0005532585.localdomain podman[99791]: 2025-11-23 08:49:14.877448785 +0000 UTC m=+0.181069982 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:49:14 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:49:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:49:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:49:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:49:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:49:17 np0005532585.localdomain systemd[1]: tmp-crun.JErjMx.mount: Deactivated successfully.
Nov 23 08:49:17 np0005532585.localdomain podman[99832]: 2025-11-23 08:49:17.446138637 +0000 UTC m=+0.501846439 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-cron-container)
Nov 23 08:49:17 np0005532585.localdomain podman[99833]: 2025-11-23 08:49:17.455170754 +0000 UTC m=+0.505108760 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 23 08:49:17 np0005532585.localdomain podman[99832]: 2025-11-23 08:49:17.459351732 +0000 UTC m=+0.515059534 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=)
Nov 23 08:49:17 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:49:17 np0005532585.localdomain podman[99833]: 2025-11-23 08:49:17.489930208 +0000 UTC m=+0.539868184 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:49:17 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:49:17 np0005532585.localdomain podman[99840]: 2025-11-23 08:49:17.542050642 +0000 UTC m=+0.584709714 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:49:17 np0005532585.localdomain podman[99834]: 2025-11-23 08:49:17.511307482 +0000 UTC m=+0.557054159 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044)
Nov 23 08:49:17 np0005532585.localdomain podman[99840]: 2025-11-23 08:49:17.56974225 +0000 UTC m=+0.612401322 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:49:17 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:49:17 np0005532585.localdomain podman[99834]: 2025-11-23 08:49:17.59128904 +0000 UTC m=+0.637035687 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4)
Nov 23 08:49:17 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:49:18 np0005532585.localdomain sudo[99927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:49:18 np0005532585.localdomain sudo[99927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:49:18 np0005532585.localdomain sudo[99927]: pam_unix(sudo:session): session closed for user root
Nov 23 08:49:18 np0005532585.localdomain sudo[99942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:49:18 np0005532585.localdomain sudo[99942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:49:18 np0005532585.localdomain systemd[1]: tmp-crun.jyQgbs.mount: Deactivated successfully.
Nov 23 08:49:19 np0005532585.localdomain sudo[99942]: pam_unix(sudo:session): session closed for user root
Nov 23 08:49:19 np0005532585.localdomain sudo[99988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:49:19 np0005532585.localdomain sudo[99988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:49:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:49:19 np0005532585.localdomain sudo[99988]: pam_unix(sudo:session): session closed for user root
Nov 23 08:49:19 np0005532585.localdomain podman[100003]: 2025-11-23 08:49:19.787703669 +0000 UTC m=+0.094432631 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Nov 23 08:49:20 np0005532585.localdomain podman[100003]: 2025-11-23 08:49:20.164647816 +0000 UTC m=+0.471376738 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 23 08:49:20 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:49:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:49:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:49:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: tmp-crun.xjAfqZ.mount: Deactivated successfully.
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: tmp-crun.0jPBj8.mount: Deactivated successfully.
Nov 23 08:49:22 np0005532585.localdomain podman[100027]: 2025-11-23 08:49:22.043827127 +0000 UTC m=+0.093562504 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044)
Nov 23 08:49:22 np0005532585.localdomain podman[100026]: 2025-11-23 08:49:22.100051267 +0000 UTC m=+0.152743655 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:49:22 np0005532585.localdomain podman[100027]: 2025-11-23 08:49:22.12463694 +0000 UTC m=+0.174372267 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Nov 23 08:49:22 np0005532585.localdomain podman[100028]: 2025-11-23 08:49:22.072753273 +0000 UTC m=+0.116388014 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, release=1761123044, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 08:49:22 np0005532585.localdomain podman[100027]: unhealthy
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:49:22 np0005532585.localdomain podman[100028]: 2025-11-23 08:49:22.210259511 +0000 UTC m=+0.253894262 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 23 08:49:22 np0005532585.localdomain podman[100028]: unhealthy
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:49:22 np0005532585.localdomain podman[100026]: 2025-11-23 08:49:22.268314188 +0000 UTC m=+0.321006586 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=)
Nov 23 08:49:22 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:49:36 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:49:36 np0005532585.localdomain recover_tripleo_nova_virtqemud[100095]: 61756
Nov 23 08:49:36 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:49:36 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:49:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:49:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:49:45 np0005532585.localdomain systemd[1]: tmp-crun.tAHTa1.mount: Deactivated successfully.
Nov 23 08:49:45 np0005532585.localdomain podman[100097]: 2025-11-23 08:49:45.047327746 +0000 UTC m=+0.100059673 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, container_name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:49:45 np0005532585.localdomain podman[100096]: 2025-11-23 08:49:45.089098954 +0000 UTC m=+0.142180402 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, 
vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, container_name=collectd, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 23 08:49:45 np0005532585.localdomain podman[100096]: 2025-11-23 08:49:45.102304158 +0000 UTC m=+0.155385596 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, release=1761123044, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat 
OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=collectd, vcs-type=git)
Nov 23 08:49:45 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:49:45 np0005532585.localdomain podman[100097]: 2025-11-23 08:49:45.130192512 +0000 UTC m=+0.182924419 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 23 08:49:45 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:49:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:49:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:49:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:49:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:49:48 np0005532585.localdomain systemd[1]: tmp-crun.W8V17T.mount: Deactivated successfully.
Nov 23 08:49:48 np0005532585.localdomain podman[100133]: 2025-11-23 08:49:48.044142332 +0000 UTC m=+0.097122374 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:49:48 np0005532585.localdomain systemd[1]: tmp-crun.WykkDF.mount: Deactivated successfully.
Nov 23 08:49:48 np0005532585.localdomain podman[100132]: 2025-11-23 08:49:48.090550811 +0000 UTC m=+0.146697280 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Nov 23 08:49:48 np0005532585.localdomain podman[100132]: 2025-11-23 08:49:48.104188539 +0000 UTC m=+0.160334998 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 08:49:48 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:49:48 np0005532585.localdomain podman[100134]: 2025-11-23 08:49:48.120733205 +0000 UTC m=+0.175125401 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red 
Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 23 08:49:48 np0005532585.localdomain podman[100134]: 2025-11-23 08:49:48.149227557 +0000 UTC m=+0.203619753 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:49:48 np0005532585.localdomain podman[100135]: 2025-11-23 08:49:48.105368675 +0000 UTC m=+0.149882178 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1761123044, url=https://www.redhat.com, tcib_managed=true)
Nov 23 08:49:48 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:49:48 np0005532585.localdomain podman[100133]: 2025-11-23 08:49:48.172861141 +0000 UTC m=+0.225841143 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Nov 23 08:49:48 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:49:48 np0005532585.localdomain podman[100135]: 2025-11-23 08:49:48.189403337 +0000 UTC m=+0.233916870 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1)
Nov 23 08:49:48 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:49:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:49:51 np0005532585.localdomain systemd[1]: tmp-crun.T14XBP.mount: Deactivated successfully.
Nov 23 08:49:51 np0005532585.localdomain podman[100229]: 2025-11-23 08:49:51.029940599 +0000 UTC m=+0.086619391 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:49:51 np0005532585.localdomain podman[100229]: 2025-11-23 08:49:51.399435978 +0000 UTC m=+0.456114740 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, vcs-type=git)
Nov 23 08:49:51 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:49:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:49:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:49:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:49:53 np0005532585.localdomain podman[100252]: 2025-11-23 08:49:53.040743179 +0000 UTC m=+0.100060093 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com)
Nov 23 08:49:53 np0005532585.localdomain podman[100254]: 2025-11-23 08:49:53.09434074 +0000 UTC m=+0.145554436 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, batch=17.1_20251118.1)
Nov 23 08:49:53 np0005532585.localdomain podman[100254]: 2025-11-23 08:49:53.109282678 +0000 UTC m=+0.160496394 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:49:53 np0005532585.localdomain podman[100254]: unhealthy
Nov 23 08:49:53 np0005532585.localdomain podman[100253]: 2025-11-23 08:49:53.01820161 +0000 UTC m=+0.076068109 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible)
Nov 23 08:49:53 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:49:53 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:49:53 np0005532585.localdomain podman[100253]: 2025-11-23 08:49:53.151477309 +0000 UTC m=+0.209343838 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Nov 23 08:49:53 np0005532585.localdomain podman[100253]: unhealthy
Nov 23 08:49:53 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:49:53 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:49:53 np0005532585.localdomain podman[100252]: 2025-11-23 08:49:53.271369658 +0000 UTC m=+0.330686582 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 23 08:49:53 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:50:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:50:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:50:16 np0005532585.localdomain systemd[1]: tmp-crun.uoNyNc.mount: Deactivated successfully.
Nov 23 08:50:16 np0005532585.localdomain podman[100320]: 2025-11-23 08:50:16.011011979 +0000 UTC m=+0.071242442 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:50:16 np0005532585.localdomain podman[100319]: 2025-11-23 08:50:16.072410938 +0000 UTC m=+0.131768693 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z)
Nov 23 08:50:16 np0005532585.localdomain podman[100319]: 2025-11-23 08:50:16.080854787 +0000 UTC m=+0.140212522 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:50:16 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:50:16 np0005532585.localdomain podman[100320]: 2025-11-23 08:50:16.101692964 +0000 UTC m=+0.161923447 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, container_name=iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:50:16 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:50:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:50:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:50:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:50:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:50:19 np0005532585.localdomain podman[100364]: 2025-11-23 08:50:19.035273105 +0000 UTC m=+0.079248956 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_id=tripleo_step5, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 23 08:50:19 np0005532585.localdomain systemd[1]: tmp-crun.4W3ekX.mount: Deactivated successfully.
Nov 23 08:50:19 np0005532585.localdomain podman[100358]: 2025-11-23 08:50:19.094168088 +0000 UTC m=+0.149240528 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Nov 23 08:50:19 np0005532585.localdomain podman[100358]: 2025-11-23 08:50:19.105252067 +0000 UTC m=+0.160324517 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Nov 23 08:50:19 np0005532585.localdomain podman[100364]: 2025-11-23 08:50:19.112224071 +0000 UTC m=+0.156199932 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:50:19 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:50:19 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:50:19 np0005532585.localdomain podman[100359]: 2025-11-23 08:50:19.192966072 +0000 UTC m=+0.242938376 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:50:19 np0005532585.localdomain podman[100360]: 2025-11-23 08:50:19.250115401 +0000 UTC m=+0.296307850 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1)
Nov 23 08:50:19 np0005532585.localdomain podman[100359]: 2025-11-23 08:50:19.253276147 +0000 UTC m=+0.303248431 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:50:19 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:50:19 np0005532585.localdomain podman[100360]: 2025-11-23 08:50:19.280244233 +0000 UTC m=+0.326436632 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 23 08:50:19 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:50:19 np0005532585.localdomain sudo[100455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:50:19 np0005532585.localdomain sudo[100455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:50:19 np0005532585.localdomain sudo[100455]: pam_unix(sudo:session): session closed for user root
Nov 23 08:50:19 np0005532585.localdomain sudo[100470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:50:19 np0005532585.localdomain sudo[100470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:50:20 np0005532585.localdomain sudo[100470]: pam_unix(sudo:session): session closed for user root
Nov 23 08:50:21 np0005532585.localdomain sudo[100517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:50:21 np0005532585.localdomain sudo[100517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:50:21 np0005532585.localdomain sudo[100517]: pam_unix(sudo:session): session closed for user root
Nov 23 08:50:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:50:22 np0005532585.localdomain systemd[1]: tmp-crun.u9rl1F.mount: Deactivated successfully.
Nov 23 08:50:22 np0005532585.localdomain podman[100532]: 2025-11-23 08:50:22.037690213 +0000 UTC m=+0.090475590 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:50:22 np0005532585.localdomain podman[100532]: 2025-11-23 08:50:22.416232988 +0000 UTC m=+0.469018305 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:50:22 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:50:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:50:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:50:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:50:24 np0005532585.localdomain podman[100555]: 2025-11-23 08:50:24.029421849 +0000 UTC m=+0.085107236 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 23 08:50:24 np0005532585.localdomain podman[100557]: 2025-11-23 08:50:24.093994025 +0000 UTC m=+0.141389058 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12)
Nov 23 08:50:24 np0005532585.localdomain podman[100557]: 2025-11-23 08:50:24.108606542 +0000 UTC m=+0.156001595 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:50:24 np0005532585.localdomain podman[100557]: unhealthy
Nov 23 08:50:24 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:50:24 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:50:24 np0005532585.localdomain podman[100556]: 2025-11-23 08:50:24.187201077 +0000 UTC m=+0.236791527 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 23 08:50:24 np0005532585.localdomain podman[100556]: 2025-11-23 08:50:24.204276229 +0000 UTC m=+0.253866679 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:50:24 np0005532585.localdomain podman[100556]: unhealthy
Nov 23 08:50:24 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:50:24 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:50:24 np0005532585.localdomain podman[100555]: 2025-11-23 08:50:24.265337748 +0000 UTC m=+0.321023175 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:50:24 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:50:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:50:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:50:47 np0005532585.localdomain podman[100626]: 2025-11-23 08:50:47.006300143 +0000 UTC m=+0.059877043 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 23 08:50:47 np0005532585.localdomain podman[100626]: 2025-11-23 08:50:47.015350339 +0000 UTC m=+0.068927219 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:50:47 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:50:47 np0005532585.localdomain systemd[1]: tmp-crun.xWFXUX.mount: Deactivated successfully.
Nov 23 08:50:47 np0005532585.localdomain podman[100625]: 2025-11-23 08:50:47.07484811 +0000 UTC m=+0.129973178 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 23 08:50:47 np0005532585.localdomain podman[100625]: 2025-11-23 08:50:47.088281221 +0000 UTC m=+0.143406289 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:50:47 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:50:48 np0005532585.localdomain sshd[100665]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:50:49 np0005532585.localdomain sshd[100667]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:50:49 np0005532585.localdomain sshd[100667]: Invalid user sol from 193.32.162.146 port 42370
Nov 23 08:50:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:50:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:50:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:50:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:50:49 np0005532585.localdomain podman[100670]: 2025-11-23 08:50:49.825043709 +0000 UTC m=+0.065923418 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:50:49 np0005532585.localdomain sshd[100667]: Connection closed by invalid user sol 193.32.162.146 port 42370 [preauth]
Nov 23 08:50:49 np0005532585.localdomain podman[100671]: 2025-11-23 08:50:49.897036673 +0000 UTC m=+0.135859300 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4)
Nov 23 08:50:49 np0005532585.localdomain podman[100672]: 2025-11-23 08:50:49.855490961 +0000 UTC m=+0.088115657 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:50:49 np0005532585.localdomain podman[100670]: 2025-11-23 08:50:49.91199369 +0000 UTC m=+0.152873409 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, version=17.1.12, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:50:49 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:50:49 np0005532585.localdomain podman[100671]: 2025-11-23 08:50:49.949688814 +0000 UTC m=+0.188511401 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, version=17.1.12, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git)
Nov 23 08:50:49 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:50:49 np0005532585.localdomain podman[100669]: 2025-11-23 08:50:49.989024498 +0000 UTC m=+0.229268088 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, tcib_managed=true, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:50:50 np0005532585.localdomain podman[100669]: 2025-11-23 08:50:50.001160579 +0000 UTC m=+0.241404159 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, version=17.1.12)
Nov 23 08:50:50 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:50:50 np0005532585.localdomain sshd[100665]: Invalid user guest from 50.223.176.171 port 35557
Nov 23 08:50:50 np0005532585.localdomain podman[100672]: 2025-11-23 08:50:50.041041739 +0000 UTC m=+0.273666455 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute)
Nov 23 08:50:50 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:50:50 np0005532585.localdomain sshd[100665]: Connection closed by invalid user guest 50.223.176.171 port 35557 [preauth]
Nov 23 08:50:50 np0005532585.localdomain systemd[1]: tmp-crun.wDQCdP.mount: Deactivated successfully.
Nov 23 08:50:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:50:53 np0005532585.localdomain systemd[1]: tmp-crun.QezOms.mount: Deactivated successfully.
Nov 23 08:50:53 np0005532585.localdomain podman[100768]: 2025-11-23 08:50:53.039240728 +0000 UTC m=+0.095805534 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true)
Nov 23 08:50:53 np0005532585.localdomain podman[100768]: 2025-11-23 08:50:53.421399623 +0000 UTC m=+0.477964379 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:50:53 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:50:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:50:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:50:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:50:55 np0005532585.localdomain systemd[1]: tmp-crun.d9QnHA.mount: Deactivated successfully.
Nov 23 08:50:55 np0005532585.localdomain podman[100793]: 2025-11-23 08:50:55.031425277 +0000 UTC m=+0.088122498 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:14:25Z)
Nov 23 08:50:55 np0005532585.localdomain podman[100793]: 2025-11-23 08:50:55.077825087 +0000 UTC m=+0.134522358 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, version=17.1.12)
Nov 23 08:50:55 np0005532585.localdomain podman[100793]: unhealthy
Nov 23 08:50:55 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:50:55 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:50:55 np0005532585.localdomain podman[100794]: 2025-11-23 08:50:55.119056569 +0000 UTC m=+0.171706186 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 23 08:50:55 np0005532585.localdomain podman[100792]: 2025-11-23 08:50:55.079682144 +0000 UTC m=+0.136518739 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Nov 23 08:50:55 np0005532585.localdomain podman[100794]: 2025-11-23 08:50:55.136792231 +0000 UTC m=+0.189441888 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 08:50:55 np0005532585.localdomain podman[100794]: unhealthy
Nov 23 08:50:55 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:50:55 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:50:55 np0005532585.localdomain podman[100792]: 2025-11-23 08:50:55.247342615 +0000 UTC m=+0.304179220 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:50:55 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:51:11 np0005532585.localdomain sshd[100861]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:51:16 np0005532585.localdomain sshd[100861]: Connection closed by authenticating user root 80.94.95.116 port 15528 [preauth]
Nov 23 08:51:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:51:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:51:18 np0005532585.localdomain podman[100863]: 2025-11-23 08:51:18.024494016 +0000 UTC m=+0.081316739 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Nov 23 08:51:18 np0005532585.localdomain systemd[1]: tmp-crun.Vmfrwa.mount: Deactivated successfully.
Nov 23 08:51:18 np0005532585.localdomain podman[100864]: 2025-11-23 08:51:18.069108082 +0000 UTC m=+0.123758469 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:51:18 np0005532585.localdomain podman[100863]: 2025-11-23 08:51:18.091171877 +0000 UTC m=+0.147994610 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:51:18 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:51:18 np0005532585.localdomain podman[100864]: 2025-11-23 08:51:18.103282898 +0000 UTC m=+0.157933225 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:44:13Z, tcib_managed=true)
Nov 23 08:51:18 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:51:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:51:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:51:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:51:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:51:21 np0005532585.localdomain systemd[1]: tmp-crun.ElOvjL.mount: Deactivated successfully.
Nov 23 08:51:21 np0005532585.localdomain podman[100903]: 2025-11-23 08:51:21.046815582 +0000 UTC m=+0.098873297 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:51:21 np0005532585.localdomain podman[100909]: 2025-11-23 08:51:21.081281417 +0000 UTC m=+0.129420982 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:51:21 np0005532585.localdomain podman[100903]: 2025-11-23 08:51:21.095778401 +0000 UTC m=+0.147836116 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git)
Nov 23 08:51:21 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:51:21 np0005532585.localdomain podman[100902]: 2025-11-23 08:51:21.137688163 +0000 UTC m=+0.192141831 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, version=17.1.12)
Nov 23 08:51:21 np0005532585.localdomain podman[100909]: 2025-11-23 08:51:21.159029816 +0000 UTC m=+0.207169381 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:51:21 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:51:21 np0005532585.localdomain podman[100902]: 2025-11-23 08:51:21.211008247 +0000 UTC m=+0.265461955 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:51:21 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:51:21 np0005532585.localdomain podman[100901]: 2025-11-23 08:51:21.010615024 +0000 UTC m=+0.071094007 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1)
Nov 23 08:51:21 np0005532585.localdomain podman[100901]: 2025-11-23 08:51:21.29541462 +0000 UTC m=+0.355893583 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, 
com.redhat.component=openstack-cron-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron)
Nov 23 08:51:21 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:51:21 np0005532585.localdomain sudo[100998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:51:21 np0005532585.localdomain sudo[100998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:51:21 np0005532585.localdomain sudo[100998]: pam_unix(sudo:session): session closed for user root
Nov 23 08:51:21 np0005532585.localdomain sudo[101013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:51:21 np0005532585.localdomain sudo[101013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:51:22 np0005532585.localdomain sudo[101013]: pam_unix(sudo:session): session closed for user root
Nov 23 08:51:22 np0005532585.localdomain sudo[101059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:51:22 np0005532585.localdomain sudo[101059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:51:22 np0005532585.localdomain sudo[101059]: pam_unix(sudo:session): session closed for user root
Nov 23 08:51:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:51:24 np0005532585.localdomain systemd[1]: tmp-crun.PXajo2.mount: Deactivated successfully.
Nov 23 08:51:24 np0005532585.localdomain podman[101074]: 2025-11-23 08:51:24.040419699 +0000 UTC m=+0.094226495 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, release=1761123044, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target)
Nov 23 08:51:24 np0005532585.localdomain podman[101074]: 2025-11-23 08:51:24.432519079 +0000 UTC m=+0.486325905 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:51:24 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:51:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:51:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:51:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:51:25 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:51:25 np0005532585.localdomain recover_tripleo_nova_virtqemud[101117]: 61756
Nov 23 08:51:25 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:51:25 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:51:25 np0005532585.localdomain podman[101099]: 2025-11-23 08:51:25.984871948 +0000 UTC m=+0.046825564 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, container_name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:51:25 np0005532585.localdomain podman[101099]: 2025-11-23 08:51:25.997162684 +0000 UTC m=+0.059116310 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:51:26 np0005532585.localdomain podman[101099]: unhealthy
Nov 23 08:51:26 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:51:26 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:51:26 np0005532585.localdomain podman[101098]: 2025-11-23 08:51:26.067971102 +0000 UTC m=+0.127928447 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=)
Nov 23 08:51:26 np0005532585.localdomain podman[101100]: 2025-11-23 08:51:26.040516711 +0000 UTC m=+0.097864846 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-type=git, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:51:26 np0005532585.localdomain podman[101100]: 2025-11-23 08:51:26.125393479 +0000 UTC m=+0.182741644 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:51:26 np0005532585.localdomain podman[101100]: unhealthy
Nov 23 08:51:26 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:51:26 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:51:26 np0005532585.localdomain podman[101098]: 2025-11-23 08:51:26.226226665 +0000 UTC m=+0.286183960 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 08:51:26 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:51:26 np0005532585.localdomain systemd[1]: tmp-crun.0sb7eT.mount: Deactivated successfully.
Nov 23 08:51:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:51:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:51:49 np0005532585.localdomain podman[101169]: 2025-11-23 08:51:49.03519269 +0000 UTC m=+0.092536303 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 23 08:51:49 np0005532585.localdomain podman[101169]: 2025-11-23 08:51:49.046775895 +0000 UTC m=+0.104119518 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 23 08:51:49 np0005532585.localdomain podman[101170]: 2025-11-23 08:51:49.086521661 +0000 UTC m=+0.141126200 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Nov 23 08:51:49 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:51:49 np0005532585.localdomain podman[101170]: 2025-11-23 08:51:49.124274227 +0000 UTC m=+0.178878756 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:51:49 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:51:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:51:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:51:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:51:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:51:52 np0005532585.localdomain systemd[1]: tmp-crun.oIZQSo.mount: Deactivated successfully.
Nov 23 08:51:52 np0005532585.localdomain podman[101210]: 2025-11-23 08:51:52.023101574 +0000 UTC m=+0.073187950 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi)
Nov 23 08:51:52 np0005532585.localdomain podman[101212]: 2025-11-23 08:51:52.084378 +0000 UTC m=+0.129411442 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 23 08:51:52 np0005532585.localdomain podman[101212]: 2025-11-23 08:51:52.136263057 +0000 UTC m=+0.181296489 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:51:52 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:51:52 np0005532585.localdomain podman[101209]: 2025-11-23 08:51:52.052009419 +0000 UTC m=+0.103939942 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:51:52 np0005532585.localdomain podman[101210]: 2025-11-23 08:51:52.161632334 +0000 UTC m=+0.211718700 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044)
Nov 23 08:51:52 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:51:52 np0005532585.localdomain podman[101209]: 2025-11-23 08:51:52.186370311 +0000 UTC m=+0.238300814 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Nov 23 08:51:52 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:51:52 np0005532585.localdomain podman[101208]: 2025-11-23 08:51:52.13762655 +0000 UTC m=+0.192127432 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 23 08:51:52 np0005532585.localdomain podman[101208]: 2025-11-23 08:51:52.267931787 +0000 UTC m=+0.322432679 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, 
architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Nov 23 08:51:52 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:51:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:51:55 np0005532585.localdomain podman[101302]: 2025-11-23 08:51:55.019234768 +0000 UTC m=+0.074246233 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target)
Nov 23 08:51:55 np0005532585.localdomain podman[101302]: 2025-11-23 08:51:55.383982401 +0000 UTC m=+0.438993856 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, 
maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, container_name=nova_migration_target, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container)
Nov 23 08:51:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:51:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:51:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:51:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:51:57 np0005532585.localdomain systemd[1]: tmp-crun.sEjoxg.mount: Deactivated successfully.
Nov 23 08:51:57 np0005532585.localdomain podman[101325]: 2025-11-23 08:51:57.022392274 +0000 UTC m=+0.082209707 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.)
Nov 23 08:51:57 np0005532585.localdomain podman[101327]: 2025-11-23 08:51:57.03339368 +0000 UTC m=+0.086567630 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4)
Nov 23 08:51:57 np0005532585.localdomain podman[101326]: 2025-11-23 08:51:57.072790136 +0000 UTC m=+0.129267607 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:51:57 np0005532585.localdomain podman[101326]: 2025-11-23 08:51:57.089227019 +0000 UTC m=+0.145704510 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, tcib_managed=true)
Nov 23 08:51:57 np0005532585.localdomain podman[101326]: unhealthy
Nov 23 08:51:57 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:51:57 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:51:57 np0005532585.localdomain podman[101327]: 2025-11-23 08:51:57.126415817 +0000 UTC m=+0.179589787 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc.)
Nov 23 08:51:57 np0005532585.localdomain podman[101327]: unhealthy
Nov 23 08:51:57 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:51:57 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:51:57 np0005532585.localdomain podman[101325]: 2025-11-23 08:51:57.219028582 +0000 UTC m=+0.278846015 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:51:57 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:52:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:52:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:52:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 08:52:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 08:52:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:52:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:52:20 np0005532585.localdomain systemd[1]: tmp-crun.FpXuKl.mount: Deactivated successfully.
Nov 23 08:52:20 np0005532585.localdomain podman[101391]: 2025-11-23 08:52:20.027606165 +0000 UTC m=+0.081181016 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 08:52:20 np0005532585.localdomain podman[101391]: 2025-11-23 08:52:20.060836932 +0000 UTC m=+0.114411773 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 23 08:52:20 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:52:20 np0005532585.localdomain systemd[1]: tmp-crun.po3ACC.mount: Deactivated successfully.
Nov 23 08:52:20 np0005532585.localdomain podman[101392]: 2025-11-23 08:52:20.141208612 +0000 UTC m=+0.191232133 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 08:52:20 np0005532585.localdomain podman[101392]: 2025-11-23 08:52:20.152271011 +0000 UTC m=+0.202294502 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid)
Nov 23 08:52:20 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:52:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:52:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:52:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:52:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:52:22 np0005532585.localdomain sudo[101430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:52:22 np0005532585.localdomain sudo[101430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:52:22 np0005532585.localdomain sudo[101430]: pam_unix(sudo:session): session closed for user root
Nov 23 08:52:23 np0005532585.localdomain systemd[1]: tmp-crun.WoQQbs.mount: Deactivated successfully.
Nov 23 08:52:23 np0005532585.localdomain podman[101433]: 2025-11-23 08:52:23.031573701 +0000 UTC m=+0.083786985 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:52:23 np0005532585.localdomain sudo[101488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:52:23 np0005532585.localdomain sudo[101488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:52:23 np0005532585.localdomain podman[101431]: 2025-11-23 08:52:23.014941712 +0000 UTC m=+0.073926074 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git)
Nov 23 08:52:23 np0005532585.localdomain podman[101429]: 2025-11-23 08:52:23.069047768 +0000 UTC m=+0.128645929 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Nov 23 08:52:23 np0005532585.localdomain podman[101433]: 2025-11-23 08:52:23.081253551 +0000 UTC m=+0.133466845 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com)
Nov 23 08:52:23 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:52:23 np0005532585.localdomain podman[101431]: 2025-11-23 08:52:23.098100796 +0000 UTC m=+0.157085118 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Nov 23 08:52:23 np0005532585.localdomain podman[101429]: 2025-11-23 08:52:23.104263276 +0000 UTC m=+0.163861397 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container)
Nov 23 08:52:23 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:52:23 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:52:23 np0005532585.localdomain podman[101435]: 2025-11-23 08:52:23.163013694 +0000 UTC m=+0.216567560 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, release=1761123044, architecture=x86_64)
Nov 23 08:52:23 np0005532585.localdomain podman[101435]: 2025-11-23 08:52:23.186682798 +0000 UTC m=+0.240236664 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team)
Nov 23 08:52:23 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:52:23 np0005532585.localdomain sudo[101488]: pam_unix(sudo:session): session closed for user root
Nov 23 08:52:25 np0005532585.localdomain sudo[101589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:52:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:52:25 np0005532585.localdomain sudo[101589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:52:25 np0005532585.localdomain sudo[101589]: pam_unix(sudo:session): session closed for user root
Nov 23 08:52:26 np0005532585.localdomain podman[101603]: 2025-11-23 08:52:26.028906463 +0000 UTC m=+0.083929600 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:52:26 np0005532585.localdomain podman[101603]: 2025-11-23 08:52:26.404722834 +0000 UTC m=+0.459745931 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Nov 23 08:52:26 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:52:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:52:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:52:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:52:28 np0005532585.localdomain podman[101627]: 2025-11-23 08:52:28.02788078 +0000 UTC m=+0.084563169 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, build-date=2025-11-18T22:49:46Z, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: tmp-crun.AcLKWd.mount: Deactivated successfully.
Nov 23 08:52:28 np0005532585.localdomain podman[101628]: 2025-11-23 08:52:28.088080663 +0000 UTC m=+0.141675497 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 08:52:28 np0005532585.localdomain podman[101628]: 2025-11-23 08:52:28.100781791 +0000 UTC m=+0.154376605 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Nov 23 08:52:28 np0005532585.localdomain podman[101628]: unhealthy
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: tmp-crun.12mAer.mount: Deactivated successfully.
Nov 23 08:52:28 np0005532585.localdomain podman[101629]: 2025-11-23 08:52:28.192924561 +0000 UTC m=+0.243326058 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, container_name=ovn_controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:52:28 np0005532585.localdomain podman[101627]: 2025-11-23 08:52:28.234272477 +0000 UTC m=+0.290954876 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, 
distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:52:28 np0005532585.localdomain podman[101629]: 2025-11-23 08:52:28.285303919 +0000 UTC m=+0.335705416 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller)
Nov 23 08:52:28 np0005532585.localdomain podman[101629]: unhealthy
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:52:28 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:52:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:52:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:52:51 np0005532585.localdomain podman[101695]: 2025-11-23 08:52:51.02709473 +0000 UTC m=+0.079425961 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack 
Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12)
Nov 23 08:52:51 np0005532585.localdomain podman[101695]: 2025-11-23 08:52:51.061319027 +0000 UTC m=+0.113650268 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, container_name=iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 23 08:52:51 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:52:51 np0005532585.localdomain podman[101694]: 2025-11-23 08:52:51.068855488 +0000 UTC m=+0.123540231 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:52:51 np0005532585.localdomain podman[101694]: 2025-11-23 08:52:51.148751623 +0000 UTC m=+0.203436396 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd)
Nov 23 08:52:51 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:52:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:52:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:52:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:52:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:52:54 np0005532585.localdomain systemd[1]: tmp-crun.7ipEdw.mount: Deactivated successfully.
Nov 23 08:52:54 np0005532585.localdomain podman[101734]: 2025-11-23 08:52:54.04404153 +0000 UTC m=+0.095810983 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute)
Nov 23 08:52:54 np0005532585.localdomain systemd[1]: tmp-crun.Cd8cve.mount: Deactivated successfully.
Nov 23 08:52:54 np0005532585.localdomain podman[101735]: 2025-11-23 08:52:54.090251995 +0000 UTC m=+0.138539422 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, 
version=17.1.12, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:52:54 np0005532585.localdomain podman[101734]: 2025-11-23 08:52:54.143513685 +0000 UTC m=+0.195283128 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z)
Nov 23 08:52:54 np0005532585.localdomain podman[101733]: 2025-11-23 08:52:54.14401251 +0000 UTC m=+0.198271999 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64)
Nov 23 08:52:54 np0005532585.localdomain podman[101735]: 2025-11-23 08:52:54.148258269 +0000 UTC m=+0.196545346 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible)
Nov 23 08:52:54 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:52:54 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:52:54 np0005532585.localdomain podman[101733]: 2025-11-23 08:52:54.23026793 +0000 UTC m=+0.284527379 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:52:54 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:52:54 np0005532585.localdomain podman[101740]: 2025-11-23 08:52:54.201308763 +0000 UTC m=+0.243973618 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:52:54 np0005532585.localdomain podman[101740]: 2025-11-23 08:52:54.285242882 +0000 UTC m=+0.327907707 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com)
Nov 23 08:52:54 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:52:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:52:56 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:52:56 np0005532585.localdomain recover_tripleo_nova_virtqemud[101833]: 61756
Nov 23 08:52:56 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:52:56 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:52:57 np0005532585.localdomain podman[101826]: 2025-11-23 08:52:57.031261292 +0000 UTC m=+0.075128980 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044)
Nov 23 08:52:57 np0005532585.localdomain podman[101826]: 2025-11-23 08:52:57.370565046 +0000 UTC m=+0.414432744 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:52:57 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:52:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:52:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:52:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:52:59 np0005532585.localdomain podman[101851]: 2025-11-23 08:52:59.01131367 +0000 UTC m=+0.064772343 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12)
Nov 23 08:52:59 np0005532585.localdomain podman[101853]: 2025-11-23 08:52:59.082111487 +0000 UTC m=+0.131003700 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, 
release=1761123044, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 23 08:52:59 np0005532585.localdomain podman[101852]: 2025-11-23 08:52:59.042097683 +0000 UTC m=+0.090161761 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:52:59 np0005532585.localdomain podman[101853]: 2025-11-23 08:52:59.09916289 +0000 UTC m=+0.148055113 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, 
architecture=x86_64, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 23 08:52:59 np0005532585.localdomain podman[101853]: unhealthy
Nov 23 08:52:59 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:52:59 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:52:59 np0005532585.localdomain podman[101852]: 2025-11-23 08:52:59.12531351 +0000 UTC m=+0.173377628 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:52:59 np0005532585.localdomain podman[101852]: unhealthy
Nov 23 08:52:59 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:52:59 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:52:59 np0005532585.localdomain podman[101851]: 2025-11-23 08:52:59.193041112 +0000 UTC m=+0.246499835 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:49:46Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:52:59 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:53:00 np0005532585.localdomain systemd[1]: tmp-crun.jSEyR9.mount: Deactivated successfully.
Nov 23 08:53:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:53:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:53:22 np0005532585.localdomain podman[101915]: 2025-11-23 08:53:22.023438533 +0000 UTC m=+0.076451992 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 23 08:53:22 np0005532585.localdomain podman[101915]: 2025-11-23 08:53:22.058189246 +0000 UTC m=+0.111202685 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:53:22 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:53:22 np0005532585.localdomain podman[101914]: 2025-11-23 08:53:22.076716283 +0000 UTC m=+0.129458192 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd)
Nov 23 08:53:22 np0005532585.localdomain podman[101914]: 2025-11-23 08:53:22.089264397 +0000 UTC m=+0.142006296 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:53:22 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:53:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:53:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:53:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:53:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:53:25 np0005532585.localdomain podman[101954]: 2025-11-23 08:53:25.031309097 +0000 UTC m=+0.086102385 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container)
Nov 23 08:53:25 np0005532585.localdomain podman[101954]: 2025-11-23 08:53:25.062369779 +0000 UTC m=+0.117163037 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:53:25 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:53:25 np0005532585.localdomain podman[101955]: 2025-11-23 08:53:25.076926134 +0000 UTC m=+0.128626218 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 08:53:25 np0005532585.localdomain podman[101953]: 2025-11-23 08:53:25.136159467 +0000 UTC m=+0.191849383 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 08:53:25 np0005532585.localdomain podman[101953]: 2025-11-23 08:53:25.169282811 +0000 UTC m=+0.224972717 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z)
Nov 23 08:53:25 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:53:25 np0005532585.localdomain podman[101956]: 2025-11-23 08:53:25.181743642 +0000 UTC m=+0.229542307 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z)
Nov 23 08:53:25 np0005532585.localdomain podman[101956]: 2025-11-23 08:53:25.20521516 +0000 UTC m=+0.253013875 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:53:25 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:53:25 np0005532585.localdomain podman[101955]: 2025-11-23 08:53:25.260132511 +0000 UTC m=+0.311832645 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:53:25 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:53:26 np0005532585.localdomain sudo[102053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:53:26 np0005532585.localdomain sudo[102053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:53:26 np0005532585.localdomain sudo[102053]: pam_unix(sudo:session): session closed for user root
Nov 23 08:53:26 np0005532585.localdomain sudo[102068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:53:26 np0005532585.localdomain sudo[102068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:53:26 np0005532585.localdomain sudo[102068]: pam_unix(sudo:session): session closed for user root
Nov 23 08:53:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:53:28 np0005532585.localdomain systemd[1]: tmp-crun.2SBTFq.mount: Deactivated successfully.
Nov 23 08:53:28 np0005532585.localdomain podman[102114]: 2025-11-23 08:53:28.031817786 +0000 UTC m=+0.086839269 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:53:28 np0005532585.localdomain sudo[102136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:53:28 np0005532585.localdomain sudo[102136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:53:28 np0005532585.localdomain sudo[102136]: pam_unix(sudo:session): session closed for user root
Nov 23 08:53:28 np0005532585.localdomain podman[102114]: 2025-11-23 08:53:28.412525837 +0000 UTC m=+0.467547330 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:53:28 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:53:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:53:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:53:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: tmp-crun.VOayAo.mount: Deactivated successfully.
Nov 23 08:53:30 np0005532585.localdomain podman[102153]: 2025-11-23 08:53:30.0276918 +0000 UTC m=+0.084837387 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64)
Nov 23 08:53:30 np0005532585.localdomain podman[102153]: 2025-11-23 08:53:30.045255068 +0000 UTC m=+0.102400645 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:53:30 np0005532585.localdomain podman[102153]: unhealthy
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: tmp-crun.zanDLi.mount: Deactivated successfully.
Nov 23 08:53:30 np0005532585.localdomain podman[102152]: 2025-11-23 08:53:30.137219282 +0000 UTC m=+0.195720461 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4)
Nov 23 08:53:30 np0005532585.localdomain podman[102154]: 2025-11-23 08:53:30.183349405 +0000 UTC m=+0.236895812 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public)
Nov 23 08:53:30 np0005532585.localdomain podman[102154]: 2025-11-23 08:53:30.200339355 +0000 UTC m=+0.253885762 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 08:53:30 np0005532585.localdomain podman[102154]: unhealthy
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:53:30 np0005532585.localdomain podman[102152]: 2025-11-23 08:53:30.332422017 +0000 UTC m=+0.390923196 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 08:53:30 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:53:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:53:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:53:53 np0005532585.localdomain podman[102221]: 2025-11-23 08:53:53.037247397 +0000 UTC m=+0.087784049 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 23 08:53:53 np0005532585.localdomain systemd[1]: tmp-crun.z2TnJa.mount: Deactivated successfully.
Nov 23 08:53:53 np0005532585.localdomain podman[102222]: 2025-11-23 08:53:53.084965177 +0000 UTC m=+0.133288391 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:53:53 np0005532585.localdomain podman[102222]: 2025-11-23 08:53:53.095306903 +0000 UTC m=+0.143630107 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible)
Nov 23 08:53:53 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:53:53 np0005532585.localdomain podman[102221]: 2025-11-23 08:53:53.147267403 +0000 UTC m=+0.197804005 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:53:53 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:53:54 np0005532585.localdomain sshd[102261]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:53:54 np0005532585.localdomain sshd[102261]: Invalid user sol from 193.32.162.146 port 51512
Nov 23 08:53:54 np0005532585.localdomain sshd[102261]: Connection closed by invalid user sol 193.32.162.146 port 51512 [preauth]
Nov 23 08:53:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:53:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:53:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:53:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:53:56 np0005532585.localdomain systemd[1]: tmp-crun.DgKjyx.mount: Deactivated successfully.
Nov 23 08:53:56 np0005532585.localdomain podman[102264]: 2025-11-23 08:53:56.033762363 +0000 UTC m=+0.085234919 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:53:56 np0005532585.localdomain podman[102264]: 2025-11-23 08:53:56.088607802 +0000 UTC m=+0.140080318 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4)
Nov 23 08:53:56 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:53:56 np0005532585.localdomain podman[102265]: 2025-11-23 08:53:56.110286896 +0000 UTC m=+0.153475068 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:53:56 np0005532585.localdomain podman[102273]: 2025-11-23 08:53:56.075080688 +0000 UTC m=+0.116368863 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true)
Nov 23 08:53:56 np0005532585.localdomain podman[102265]: 2025-11-23 08:53:56.140328815 +0000 UTC m=+0.183517007 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:53:56 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:53:56 np0005532585.localdomain podman[102263]: 2025-11-23 08:53:56.18427063 +0000 UTC m=+0.234558380 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 08:53:56 np0005532585.localdomain podman[102273]: 2025-11-23 08:53:56.211170283 +0000 UTC m=+0.252458458 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 23 08:53:56 np0005532585.localdomain podman[102263]: 2025-11-23 08:53:56.222372436 +0000 UTC m=+0.272660206 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:53:56 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:53:56 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:53:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:53:58 np0005532585.localdomain podman[102363]: 2025-11-23 08:53:58.996447755 +0000 UTC m=+0.056072087 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, release=1761123044, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:53:59 np0005532585.localdomain podman[102363]: 2025-11-23 08:53:59.36421343 +0000 UTC m=+0.423837762 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:53:59 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:54:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:54:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:54:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:54:00 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:54:00 np0005532585.localdomain recover_tripleo_nova_virtqemud[102400]: 61756
Nov 23 08:54:00 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:54:00 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:54:01 np0005532585.localdomain podman[102387]: 2025-11-23 08:54:01.041521494 +0000 UTC m=+0.085866959 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:54:01 np0005532585.localdomain podman[102386]: 2025-11-23 08:54:01.09042015 +0000 UTC m=+0.135076835 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.4, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:54:01 np0005532585.localdomain podman[102387]: 2025-11-23 08:54:01.108592117 +0000 UTC m=+0.152937642 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:54:01 np0005532585.localdomain podman[102388]: 2025-11-23 08:54:01.147818757 +0000 UTC m=+0.185521518 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 08:54:01 np0005532585.localdomain podman[102387]: unhealthy
Nov 23 08:54:01 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:54:01 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:54:01 np0005532585.localdomain podman[102388]: 2025-11-23 08:54:01.183684775 +0000 UTC m=+0.221387526 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:54:01 np0005532585.localdomain podman[102388]: unhealthy
Nov 23 08:54:01 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:54:01 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:54:01 np0005532585.localdomain podman[102386]: 2025-11-23 08:54:01.301121898 +0000 UTC m=+0.345778623 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 23 08:54:01 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:54:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:54:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:54:24 np0005532585.localdomain podman[102453]: 2025-11-23 08:54:24.027209087 +0000 UTC m=+0.082944600 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:54:24 np0005532585.localdomain podman[102453]: 2025-11-23 08:54:24.066406557 +0000 UTC m=+0.122142070 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, 
com.redhat.component=openstack-collectd-container, architecture=x86_64, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, container_name=collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git)
Nov 23 08:54:24 np0005532585.localdomain podman[102454]: 2025-11-23 08:54:24.077932649 +0000 UTC m=+0.131862346 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid)
Nov 23 08:54:24 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:54:24 np0005532585.localdomain podman[102454]: 2025-11-23 08:54:24.117284814 +0000 UTC m=+0.171214491 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 23 08:54:24 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:54:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:54:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:54:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:54:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:54:27 np0005532585.localdomain systemd[1]: tmp-crun.8KQFLh.mount: Deactivated successfully.
Nov 23 08:54:27 np0005532585.localdomain podman[102494]: 2025-11-23 08:54:27.025961743 +0000 UTC m=+0.078619927 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12)
Nov 23 08:54:27 np0005532585.localdomain podman[102494]: 2025-11-23 08:54:27.051132853 +0000 UTC m=+0.103791057 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, 
version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public)
Nov 23 08:54:27 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:54:27 np0005532585.localdomain podman[102495]: 2025-11-23 08:54:27.099995209 +0000 UTC m=+0.146362030 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 23 08:54:27 np0005532585.localdomain podman[102492]: 2025-11-23 08:54:27.182944467 +0000 UTC m=+0.237317263 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z)
Nov 23 08:54:27 np0005532585.localdomain podman[102492]: 2025-11-23 08:54:27.189510869 +0000 UTC m=+0.243883685 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:54:27 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:54:27 np0005532585.localdomain podman[102493]: 2025-11-23 08:54:27.236145225 +0000 UTC m=+0.288156149 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Nov 23 08:54:27 np0005532585.localdomain podman[102495]: 2025-11-23 08:54:27.254277441 +0000 UTC m=+0.300644252 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true)
Nov 23 08:54:27 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:54:27 np0005532585.localdomain podman[102493]: 2025-11-23 08:54:27.284257679 +0000 UTC m=+0.336268593 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64)
Nov 23 08:54:27 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:54:28 np0005532585.localdomain sudo[102591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:54:28 np0005532585.localdomain sudo[102591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:54:28 np0005532585.localdomain sudo[102591]: pam_unix(sudo:session): session closed for user root
Nov 23 08:54:28 np0005532585.localdomain sudo[102606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 08:54:28 np0005532585.localdomain sudo[102606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:54:28 np0005532585.localdomain sudo[102606]: pam_unix(sudo:session): session closed for user root
Nov 23 08:54:29 np0005532585.localdomain sudo[102643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:54:29 np0005532585.localdomain sudo[102643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:54:29 np0005532585.localdomain sudo[102643]: pam_unix(sudo:session): session closed for user root
Nov 23 08:54:29 np0005532585.localdomain sudo[102658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:54:29 np0005532585.localdomain sudo[102658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:54:29 np0005532585.localdomain sudo[102658]: pam_unix(sudo:session): session closed for user root
Nov 23 08:54:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:54:30 np0005532585.localdomain podman[102706]: 2025-11-23 08:54:30.036118698 +0000 UTC m=+0.090300295 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat 
OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z)
Nov 23 08:54:30 np0005532585.localdomain sudo[102729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:54:30 np0005532585.localdomain sudo[102729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:54:30 np0005532585.localdomain sudo[102729]: pam_unix(sudo:session): session closed for user root
Nov 23 08:54:30 np0005532585.localdomain podman[102706]: 2025-11-23 08:54:30.452308044 +0000 UTC m=+0.506489651 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:54:30 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:54:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:54:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:54:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:54:32 np0005532585.localdomain podman[102745]: 2025-11-23 08:54:32.045965636 +0000 UTC m=+0.098457124 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z)
Nov 23 08:54:32 np0005532585.localdomain podman[102746]: 2025-11-23 08:54:32.092927934 +0000 UTC m=+0.142813212 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z)
Nov 23 08:54:32 np0005532585.localdomain podman[102744]: 2025-11-23 08:54:32.006078846 +0000 UTC m=+0.066229048 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container)
Nov 23 08:54:32 np0005532585.localdomain podman[102745]: 2025-11-23 08:54:32.111456721 +0000 UTC m=+0.163948129 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Nov 23 08:54:32 np0005532585.localdomain podman[102745]: unhealthy
Nov 23 08:54:32 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:54:32 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:54:32 np0005532585.localdomain podman[102746]: 2025-11-23 08:54:32.130207424 +0000 UTC m=+0.180092722 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1)
Nov 23 08:54:32 np0005532585.localdomain podman[102746]: unhealthy
Nov 23 08:54:32 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:54:32 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:54:32 np0005532585.localdomain podman[102744]: 2025-11-23 08:54:32.238919432 +0000 UTC m=+0.299069704 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 23 08:54:32 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:54:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:54:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:54:55 np0005532585.localdomain podman[102813]: 2025-11-23 08:54:55.028040828 +0000 UTC m=+0.080255958 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 23 08:54:55 np0005532585.localdomain podman[102813]: 2025-11-23 08:54:55.041337345 +0000 UTC m=+0.093552545 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:54:55 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:54:55 np0005532585.localdomain systemd[1]: tmp-crun.Y6wc9B.mount: Deactivated successfully.
Nov 23 08:54:55 np0005532585.localdomain podman[102814]: 2025-11-23 08:54:55.136987582 +0000 UTC m=+0.185515299 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, container_name=iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:54:55 np0005532585.localdomain podman[102814]: 2025-11-23 08:54:55.150166435 +0000 UTC m=+0.198694092 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z)
Nov 23 08:54:55 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:54:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:54:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:54:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:54:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:54:58 np0005532585.localdomain systemd[1]: tmp-crun.yY7JFg.mount: Deactivated successfully.
Nov 23 08:54:58 np0005532585.localdomain podman[102857]: 2025-11-23 08:54:58.017298442 +0000 UTC m=+0.068679442 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, vcs-type=git)
Nov 23 08:54:58 np0005532585.localdomain podman[102856]: 2025-11-23 08:54:58.040940466 +0000 UTC m=+0.092435130 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute)
Nov 23 08:54:58 np0005532585.localdomain podman[102856]: 2025-11-23 08:54:58.062741043 +0000 UTC m=+0.114235727 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12)
Nov 23 08:54:58 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:54:58 np0005532585.localdomain podman[102857]: 2025-11-23 08:54:58.118912082 +0000 UTC m=+0.170293092 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z)
Nov 23 08:54:58 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:54:58 np0005532585.localdomain podman[102859]: 2025-11-23 08:54:58.120047867 +0000 UTC m=+0.166022762 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Nov 23 08:54:58 np0005532585.localdomain podman[102855]: 2025-11-23 08:54:58.177503695 +0000 UTC m=+0.232475106 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 23 08:54:58 np0005532585.localdomain podman[102855]: 2025-11-23 08:54:58.192308079 +0000 UTC m=+0.247279490 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12)
Nov 23 08:54:58 np0005532585.localdomain podman[102859]: 2025-11-23 08:54:58.200050825 +0000 UTC m=+0.246025750 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, container_name=nova_compute, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:54:58 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:54:58 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:54:59 np0005532585.localdomain systemd[1]: tmp-crun.9pQOLV.mount: Deactivated successfully.
Nov 23 08:55:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:55:01 np0005532585.localdomain podman[102947]: 2025-11-23 08:55:01.01843852 +0000 UTC m=+0.080632398 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:55:01 np0005532585.localdomain podman[102947]: 2025-11-23 08:55:01.374345853 +0000 UTC m=+0.436539661 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:55:01 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:55:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:55:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:55:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:55:03 np0005532585.localdomain podman[102969]: 2025-11-23 08:55:03.040051331 +0000 UTC m=+0.094193884 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4)
Nov 23 08:55:03 np0005532585.localdomain podman[102970]: 2025-11-23 08:55:03.010379963 +0000 UTC m=+0.065263219 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:55:03 np0005532585.localdomain podman[102970]: 2025-11-23 08:55:03.094017022 +0000 UTC m=+0.148900238 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Nov 23 08:55:03 np0005532585.localdomain podman[102970]: unhealthy
Nov 23 08:55:03 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:55:03 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:55:03 np0005532585.localdomain podman[102971]: 2025-11-23 08:55:03.190121904 +0000 UTC m=+0.237625933 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, url=https://www.redhat.com)
Nov 23 08:55:03 np0005532585.localdomain podman[102971]: 2025-11-23 08:55:03.230324424 +0000 UTC m=+0.277828293 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:55:03 np0005532585.localdomain podman[102971]: unhealthy
Nov 23 08:55:03 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:55:03 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:55:03 np0005532585.localdomain podman[102969]: 2025-11-23 08:55:03.270261706 +0000 UTC m=+0.324404239 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:55:03 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:55:04 np0005532585.localdomain systemd[1]: tmp-crun.U3OxDM.mount: Deactivated successfully.
Nov 23 08:55:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:55:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:55:26 np0005532585.localdomain podman[103038]: 2025-11-23 08:55:26.023631197 +0000 UTC m=+0.080214905 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z)
Nov 23 08:55:26 np0005532585.localdomain podman[103038]: 2025-11-23 08:55:26.06127649 +0000 UTC m=+0.117860208 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Nov 23 08:55:26 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:55:26 np0005532585.localdomain systemd[1]: tmp-crun.LjvwhE.mount: Deactivated successfully.
Nov 23 08:55:26 np0005532585.localdomain podman[103039]: 2025-11-23 08:55:26.102819652 +0000 UTC m=+0.155374587 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z)
Nov 23 08:55:26 np0005532585.localdomain podman[103039]: 2025-11-23 08:55:26.139411971 +0000 UTC m=+0.191966866 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:55:26 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:55:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:55:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:55:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:55:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:55:29 np0005532585.localdomain systemd[1]: tmp-crun.dPOTkn.mount: Deactivated successfully.
Nov 23 08:55:29 np0005532585.localdomain podman[103078]: 2025-11-23 08:55:29.03797518 +0000 UTC m=+0.092545613 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, version=17.1.12)
Nov 23 08:55:29 np0005532585.localdomain podman[103082]: 2025-11-23 08:55:29.14478446 +0000 UTC m=+0.186600622 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:55:29 np0005532585.localdomain podman[103079]: 2025-11-23 08:55:29.193250553 +0000 UTC m=+0.240861252 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:55:29 np0005532585.localdomain podman[103077]: 2025-11-23 08:55:29.112415739 +0000 UTC m=+0.166647221 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1)
Nov 23 08:55:29 np0005532585.localdomain podman[103078]: 2025-11-23 08:55:29.218412553 +0000 UTC m=+0.272982986 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4)
Nov 23 08:55:29 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:55:29 np0005532585.localdomain podman[103077]: 2025-11-23 08:55:29.242207871 +0000 UTC m=+0.296439373 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:55:29 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:55:29 np0005532585.localdomain podman[103079]: 2025-11-23 08:55:29.27224684 +0000 UTC m=+0.319857499 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 08:55:29 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:55:29 np0005532585.localdomain podman[103082]: 2025-11-23 08:55:29.324011085 +0000 UTC m=+0.365827237 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, container_name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:55:29 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:55:30 np0005532585.localdomain systemd[1]: tmp-crun.axlcBa.mount: Deactivated successfully.
Nov 23 08:55:30 np0005532585.localdomain sudo[103174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:55:30 np0005532585.localdomain sudo[103174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:55:30 np0005532585.localdomain sudo[103174]: pam_unix(sudo:session): session closed for user root
Nov 23 08:55:30 np0005532585.localdomain sudo[103189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:55:30 np0005532585.localdomain sudo[103189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:55:31 np0005532585.localdomain sudo[103189]: pam_unix(sudo:session): session closed for user root
Nov 23 08:55:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:55:32 np0005532585.localdomain podman[103236]: 2025-11-23 08:55:32.023238573 +0000 UTC m=+0.081163175 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64)
Nov 23 08:55:32 np0005532585.localdomain sudo[103248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:55:32 np0005532585.localdomain sudo[103248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:55:32 np0005532585.localdomain sudo[103248]: pam_unix(sudo:session): session closed for user root
Nov 23 08:55:32 np0005532585.localdomain podman[103236]: 2025-11-23 08:55:32.383801498 +0000 UTC m=+0.441726100 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:55:32 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:55:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:55:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:55:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:55:34 np0005532585.localdomain podman[103273]: 2025-11-23 08:55:34.02806588 +0000 UTC m=+0.082867468 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=)
Nov 23 08:55:34 np0005532585.localdomain systemd[1]: tmp-crun.UQJa3z.mount: Deactivated successfully.
Nov 23 08:55:34 np0005532585.localdomain podman[103274]: 2025-11-23 08:55:34.135779436 +0000 UTC m=+0.188031726 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4)
Nov 23 08:55:34 np0005532585.localdomain podman[103274]: 2025-11-23 08:55:34.147719581 +0000 UTC m=+0.199971871 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, tcib_managed=true, vcs-type=git, distribution-scope=public, version=17.1.12, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com)
Nov 23 08:55:34 np0005532585.localdomain podman[103274]: unhealthy
Nov 23 08:55:34 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:55:34 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:55:34 np0005532585.localdomain podman[103273]: 2025-11-23 08:55:34.198385492 +0000 UTC m=+0.253187130 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 23 08:55:34 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:55:34 np0005532585.localdomain podman[103275]: 2025-11-23 08:55:34.148029361 +0000 UTC m=+0.197365611 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller)
Nov 23 08:55:34 np0005532585.localdomain podman[103275]: 2025-11-23 08:55:34.278502284 +0000 UTC m=+0.327838524 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Nov 23 08:55:34 np0005532585.localdomain podman[103275]: unhealthy
Nov 23 08:55:34 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:55:34 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:55:36 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:55:36 np0005532585.localdomain recover_tripleo_nova_virtqemud[103340]: 61756
Nov 23 08:55:36 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:55:36 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:55:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:55:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:55:57 np0005532585.localdomain systemd[1]: tmp-crun.7YzCQR.mount: Deactivated successfully.
Nov 23 08:55:57 np0005532585.localdomain podman[103341]: 2025-11-23 08:55:57.021072288 +0000 UTC m=+0.074820250 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4)
Nov 23 08:55:57 np0005532585.localdomain podman[103342]: 2025-11-23 08:55:57.058679879 +0000 UTC m=+0.109805461 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, container_name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container)
Nov 23 08:55:57 np0005532585.localdomain podman[103341]: 2025-11-23 08:55:57.087575003 +0000 UTC m=+0.141323005 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Nov 23 08:55:57 np0005532585.localdomain podman[103342]: 2025-11-23 08:55:57.094395663 +0000 UTC m=+0.145521325 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, vendor=Red Hat, Inc.)
Nov 23 08:55:57 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:55:57 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:55:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:55:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:55:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:55:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:56:00 np0005532585.localdomain podman[103379]: 2025-11-23 08:56:00.002951968 +0000 UTC m=+0.064331739 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, distribution-scope=public, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute)
Nov 23 08:56:00 np0005532585.localdomain podman[103378]: 2025-11-23 08:56:00.028304534 +0000 UTC m=+0.089200491 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Nov 23 08:56:00 np0005532585.localdomain systemd[1]: tmp-crun.IA4DEL.mount: Deactivated successfully.
Nov 23 08:56:00 np0005532585.localdomain podman[103380]: 2025-11-23 08:56:00.074717364 +0000 UTC m=+0.131731622 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 23 08:56:00 np0005532585.localdomain podman[103379]: 2025-11-23 08:56:00.080965656 +0000 UTC m=+0.142345327 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 23 08:56:00 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:56:00 np0005532585.localdomain podman[103381]: 2025-11-23 08:56:00.133100901 +0000 UTC m=+0.187905122 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:56:00 np0005532585.localdomain podman[103381]: 2025-11-23 08:56:00.15593781 +0000 UTC m=+0.210742031 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step5, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:56:00 np0005532585.localdomain podman[103378]: 2025-11-23 08:56:00.163535583 +0000 UTC m=+0.224431610 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 
17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-cron, release=1761123044, io.buildah.version=1.41.4)
Nov 23 08:56:00 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:56:00 np0005532585.localdomain podman[103380]: 2025-11-23 08:56:00.174093036 +0000 UTC m=+0.231107344 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 23 08:56:00 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:56:00 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:56:00 np0005532585.localdomain systemd[1]: tmp-crun.J0U24S.mount: Deactivated successfully.
Nov 23 08:56:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:56:03 np0005532585.localdomain podman[103476]: 2025-11-23 08:56:03.019348525 +0000 UTC m=+0.079590267 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, managed_by=tripleo_ansible)
Nov 23 08:56:03 np0005532585.localdomain podman[103476]: 2025-11-23 08:56:03.359796704 +0000 UTC m=+0.420038406 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:56:03 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:56:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:56:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:56:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:56:05 np0005532585.localdomain podman[103498]: 2025-11-23 08:56:05.016856988 +0000 UTC m=+0.075223003 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:56:05 np0005532585.localdomain podman[103499]: 2025-11-23 08:56:05.071678445 +0000 UTC m=+0.128560036 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:56:05 np0005532585.localdomain systemd[1]: tmp-crun.PtKX6d.mount: Deactivated successfully.
Nov 23 08:56:05 np0005532585.localdomain podman[103500]: 2025-11-23 08:56:05.134382365 +0000 UTC m=+0.184608821 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller)
Nov 23 08:56:05 np0005532585.localdomain podman[103499]: 2025-11-23 08:56:05.157803431 +0000 UTC m=+0.214685052 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4)
Nov 23 08:56:05 np0005532585.localdomain podman[103499]: unhealthy
Nov 23 08:56:05 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:56:05 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:56:05 np0005532585.localdomain podman[103500]: 2025-11-23 08:56:05.17248529 +0000 UTC m=+0.222711766 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64)
Nov 23 08:56:05 np0005532585.localdomain podman[103500]: unhealthy
Nov 23 08:56:05 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:56:05 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:56:05 np0005532585.localdomain podman[103498]: 2025-11-23 08:56:05.230945339 +0000 UTC m=+0.289311284 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12)
Nov 23 08:56:05 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:56:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:56:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:56:28 np0005532585.localdomain systemd[1]: tmp-crun.f4WIRj.mount: Deactivated successfully.
Nov 23 08:56:28 np0005532585.localdomain podman[103567]: 2025-11-23 08:56:28.03035922 +0000 UTC m=+0.082336671 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Nov 23 08:56:28 np0005532585.localdomain podman[103567]: 2025-11-23 08:56:28.037074885 +0000 UTC m=+0.089052336 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:56:28 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:56:28 np0005532585.localdomain podman[103568]: 2025-11-23 08:56:28.014618088 +0000 UTC m=+0.067304421 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:56:28 np0005532585.localdomain podman[103568]: 2025-11-23 08:56:28.093205393 +0000 UTC m=+0.145891756 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:56:28 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:56:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:56:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:56:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:56:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:56:31 np0005532585.localdomain systemd[1]: tmp-crun.sIrOHK.mount: Deactivated successfully.
Nov 23 08:56:31 np0005532585.localdomain podman[103603]: 2025-11-23 08:56:31.016969903 +0000 UTC m=+0.070974053 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 23 08:56:31 np0005532585.localdomain podman[103602]: 2025-11-23 08:56:31.042412122 +0000 UTC m=+0.099976131 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, release=1761123044, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true)
Nov 23 08:56:31 np0005532585.localdomain podman[103602]: 2025-11-23 08:56:31.078690782 +0000 UTC m=+0.136254801 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, name=rhosp17/openstack-cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 23 08:56:31 np0005532585.localdomain podman[103603]: 2025-11-23 08:56:31.079080014 +0000 UTC m=+0.133084204 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4)
Nov 23 08:56:31 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:56:31 np0005532585.localdomain podman[103610]: 2025-11-23 08:56:31.080488947 +0000 UTC m=+0.127836713 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4)
Nov 23 08:56:31 np0005532585.localdomain podman[103604]: 2025-11-23 08:56:31.147629551 +0000 UTC m=+0.198915798 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64)
Nov 23 08:56:31 np0005532585.localdomain podman[103610]: 2025-11-23 08:56:31.159689221 +0000 UTC m=+0.207037017 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z)
Nov 23 08:56:31 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:56:31 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:56:31 np0005532585.localdomain podman[103604]: 2025-11-23 08:56:31.216874621 +0000 UTC m=+0.268160848 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:56:31 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:56:32 np0005532585.localdomain systemd[1]: tmp-crun.YQ9Xtb.mount: Deactivated successfully.
Nov 23 08:56:32 np0005532585.localdomain sudo[103699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:56:32 np0005532585.localdomain sudo[103699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:56:32 np0005532585.localdomain sudo[103699]: pam_unix(sudo:session): session closed for user root
Nov 23 08:56:32 np0005532585.localdomain sudo[103714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 08:56:32 np0005532585.localdomain sudo[103714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:56:33 np0005532585.localdomain podman[103801]: 2025-11-23 08:56:33.135443208 +0000 UTC m=+0.092759059 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, name=rhceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 08:56:33 np0005532585.localdomain podman[103801]: 2025-11-23 08:56:33.240203554 +0000 UTC m=+0.197519365 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=)
Nov 23 08:56:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:56:33 np0005532585.localdomain podman[103851]: 2025-11-23 08:56:33.525374422 +0000 UTC m=+0.101247040 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:56:33 np0005532585.localdomain sudo[103714]: pam_unix(sudo:session): session closed for user root
Nov 23 08:56:33 np0005532585.localdomain sudo[103888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:56:33 np0005532585.localdomain sudo[103888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:56:33 np0005532585.localdomain sudo[103888]: pam_unix(sudo:session): session closed for user root
Nov 23 08:56:33 np0005532585.localdomain sudo[103904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:56:33 np0005532585.localdomain sudo[103904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:56:33 np0005532585.localdomain podman[103851]: 2025-11-23 08:56:33.89813638 +0000 UTC m=+0.474008988 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Nov 23 08:56:33 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:56:34 np0005532585.localdomain sudo[103904]: pam_unix(sudo:session): session closed for user root
Nov 23 08:56:35 np0005532585.localdomain sudo[103952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:56:35 np0005532585.localdomain sudo[103952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:56:35 np0005532585.localdomain sudo[103952]: pam_unix(sudo:session): session closed for user root
Nov 23 08:56:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:56:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:56:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:56:36 np0005532585.localdomain systemd[1]: tmp-crun.q1E3e5.mount: Deactivated successfully.
Nov 23 08:56:36 np0005532585.localdomain podman[103968]: 2025-11-23 08:56:36.047583853 +0000 UTC m=+0.096992109 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 08:56:36 np0005532585.localdomain podman[103969]: 2025-11-23 08:56:36.093682124 +0000 UTC m=+0.143863154 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, config_id=tripleo_step4, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:56:36 np0005532585.localdomain podman[103969]: 2025-11-23 08:56:36.110077206 +0000 UTC m=+0.160258216 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Nov 23 08:56:36 np0005532585.localdomain podman[103969]: unhealthy
Nov 23 08:56:36 np0005532585.localdomain podman[103968]: 2025-11-23 08:56:36.118333728 +0000 UTC m=+0.167741974 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 23 08:56:36 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:56:36 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:56:36 np0005532585.localdomain podman[103968]: unhealthy
Nov 23 08:56:36 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:56:36 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:56:36 np0005532585.localdomain podman[103967]: 2025-11-23 08:56:36.188760444 +0000 UTC m=+0.239730869 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr)
Nov 23 08:56:36 np0005532585.localdomain podman[103967]: 2025-11-23 08:56:36.391547429 +0000 UTC m=+0.442517824 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:56:36 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:56:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:56:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:56:59 np0005532585.localdomain podman[104036]: 2025-11-23 08:56:59.009631763 +0000 UTC m=+0.069401728 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:56:59 np0005532585.localdomain podman[104036]: 2025-11-23 08:56:59.024099752 +0000 UTC m=+0.083869727 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 23 08:56:59 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:56:59 np0005532585.localdomain systemd[1]: tmp-crun.EcwyoW.mount: Deactivated successfully.
Nov 23 08:56:59 np0005532585.localdomain podman[104035]: 2025-11-23 08:56:59.088212134 +0000 UTC m=+0.148600028 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 23 08:56:59 np0005532585.localdomain podman[104035]: 2025-11-23 08:56:59.126279988 +0000 UTC m=+0.186667842 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, container_name=collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true)
Nov 23 08:56:59 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:57:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:57:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:57:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:57:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:57:02 np0005532585.localdomain podman[104074]: 2025-11-23 08:57:02.012582826 +0000 UTC m=+0.068262792 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 08:57:02 np0005532585.localdomain podman[104083]: 2025-11-23 08:57:02.065702787 +0000 UTC m=+0.110959930 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.12, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 23 08:57:02 np0005532585.localdomain podman[104074]: 2025-11-23 08:57:02.099668252 +0000 UTC m=+0.155348248 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4)
Nov 23 08:57:02 np0005532585.localdomain podman[104083]: 2025-11-23 08:57:02.118439716 +0000 UTC m=+0.163696899 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:57:02 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:57:02 np0005532585.localdomain podman[104081]: 2025-11-23 08:57:02.134071002 +0000 UTC m=+0.178065365 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1)
Nov 23 08:57:02 np0005532585.localdomain podman[104081]: 2025-11-23 08:57:02.161194174 +0000 UTC m=+0.205188527 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 08:57:02 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:57:02 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:57:02 np0005532585.localdomain systemd[1]: tmp-crun.MGlDjV.mount: Deactivated successfully.
Nov 23 08:57:02 np0005532585.localdomain sshd[104155]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:57:02 np0005532585.localdomain podman[104075]: 2025-11-23 08:57:02.25311592 +0000 UTC m=+0.301485689 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute)
Nov 23 08:57:02 np0005532585.localdomain podman[104075]: 2025-11-23 08:57:02.283373471 +0000 UTC m=+0.331743230 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:57:02 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:57:02 np0005532585.localdomain sshd[104155]: Invalid user sol from 193.32.162.146 port 60656
Nov 23 08:57:02 np0005532585.localdomain sshd[104155]: Connection closed by invalid user sol 193.32.162.146 port 60656 [preauth]
Nov 23 08:57:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:57:04 np0005532585.localdomain podman[104173]: 2025-11-23 08:57:04.021342206 +0000 UTC m=+0.081275667 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:57:04 np0005532585.localdomain podman[104173]: 2025-11-23 08:57:04.357216322 +0000 UTC m=+0.417149733 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Nov 23 08:57:04 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:57:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:57:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:57:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:57:06 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:57:06 np0005532585.localdomain recover_tripleo_nova_virtqemud[104208]: 61756
Nov 23 08:57:06 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:57:06 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:57:07 np0005532585.localdomain systemd[1]: tmp-crun.pSJQai.mount: Deactivated successfully.
Nov 23 08:57:07 np0005532585.localdomain podman[104195]: 2025-11-23 08:57:07.058119197 +0000 UTC m=+0.103819437 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:57:07 np0005532585.localdomain podman[104195]: 2025-11-23 08:57:07.095333893 +0000 UTC m=+0.141034173 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, release=1761123044, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:57:07 np0005532585.localdomain podman[104195]: unhealthy
Nov 23 08:57:07 np0005532585.localdomain podman[104194]: 2025-11-23 08:57:07.102604409 +0000 UTC m=+0.152057715 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:57:07 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:57:07 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:57:07 np0005532585.localdomain podman[104196]: 2025-11-23 08:57:07.145927046 +0000 UTC m=+0.187690823 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team)
Nov 23 08:57:07 np0005532585.localdomain podman[104196]: 2025-11-23 08:57:07.165581657 +0000 UTC m=+0.207345444 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, 
release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:57:07 np0005532585.localdomain podman[104196]: unhealthy
Nov 23 08:57:07 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:57:07 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:57:07 np0005532585.localdomain podman[104194]: 2025-11-23 08:57:07.291784238 +0000 UTC m=+0.341237474 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, 
config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:57:07 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:57:16 np0005532585.localdomain sshd[104262]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:57:21 np0005532585.localdomain sshd[104266]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:57:22 np0005532585.localdomain sshd[104262]: Connection closed by 183.223.156.154 port 44148 [preauth]
Nov 23 08:57:23 np0005532585.localdomain sshd[104266]: Invalid user debian from 49.207.9.39 port 54798
Nov 23 08:57:24 np0005532585.localdomain sshd[104266]: Connection closed by invalid user debian 49.207.9.39 port 54798 [preauth]
Nov 23 08:57:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:57:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:57:30 np0005532585.localdomain podman[104268]: 2025-11-23 08:57:30.02771943 +0000 UTC m=+0.082631779 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:57:30 np0005532585.localdomain podman[104268]: 2025-11-23 08:57:30.06730626 +0000 UTC m=+0.122218589 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Nov 23 08:57:30 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:57:30 np0005532585.localdomain podman[104269]: 2025-11-23 08:57:30.077140765 +0000 UTC m=+0.130675671 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:57:30 np0005532585.localdomain podman[104269]: 2025-11-23 08:57:30.161109134 +0000 UTC m=+0.214644050 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 
iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64)
Nov 23 08:57:30 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:57:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:57:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:57:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:57:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:57:33 np0005532585.localdomain podman[104310]: 2025-11-23 08:57:33.028994449 +0000 UTC m=+0.076433395 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 23 08:57:33 np0005532585.localdomain podman[104310]: 2025-11-23 08:57:33.088535649 +0000 UTC m=+0.135974655 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:57:33 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:57:33 np0005532585.localdomain podman[104308]: 2025-11-23 08:57:33.13587818 +0000 UTC m=+0.188929521 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4)
Nov 23 08:57:33 np0005532585.localdomain podman[104311]: 2025-11-23 08:57:33.090441999 +0000 UTC m=+0.135202512 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 23 08:57:33 np0005532585.localdomain podman[104308]: 2025-11-23 08:57:33.145168299 +0000 UTC m=+0.198219580 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, 
architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:57:33 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:57:33 np0005532585.localdomain podman[104311]: 2025-11-23 08:57:33.170325501 +0000 UTC m=+0.215085954 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:36:58Z, release=1761123044, config_id=tripleo_step5, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 08:57:33 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully.
Nov 23 08:57:33 np0005532585.localdomain podman[104309]: 2025-11-23 08:57:33.194915875 +0000 UTC m=+0.244852029 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1)
Nov 23 08:57:33 np0005532585.localdomain podman[104309]: 2025-11-23 08:57:33.226334801 +0000 UTC m=+0.276270985 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-type=git, url=https://www.redhat.com, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:57:33 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:57:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:57:35 np0005532585.localdomain systemd[1]: tmp-crun.uv9ltS.mount: Deactivated successfully.
Nov 23 08:57:35 np0005532585.localdomain podman[104403]: 2025-11-23 08:57:35.005677582 +0000 UTC m=+0.065566539 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 23 08:57:35 np0005532585.localdomain sudo[104425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:57:35 np0005532585.localdomain sudo[104425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:57:35 np0005532585.localdomain sudo[104425]: pam_unix(sudo:session): session closed for user root
Nov 23 08:57:35 np0005532585.localdomain sudo[104441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:57:35 np0005532585.localdomain sudo[104441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:57:35 np0005532585.localdomain podman[104403]: 2025-11-23 08:57:35.391242012 +0000 UTC m=+0.451130959 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:57:35 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:57:35 np0005532585.localdomain sudo[104441]: pam_unix(sudo:session): session closed for user root
Nov 23 08:57:36 np0005532585.localdomain sudo[104487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:57:36 np0005532585.localdomain sudo[104487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:57:36 np0005532585.localdomain sudo[104487]: pam_unix(sudo:session): session closed for user root
Nov 23 08:57:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:57:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:57:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:57:38 np0005532585.localdomain podman[104502]: 2025-11-23 08:57:38.034853699 +0000 UTC m=+0.081744710 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 
17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:57:38 np0005532585.localdomain systemd[1]: tmp-crun.vYvEjI.mount: Deactivated successfully.
Nov 23 08:57:38 np0005532585.localdomain podman[104504]: 2025-11-23 08:57:38.102732218 +0000 UTC m=+0.146049539 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, container_name=ovn_controller, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:57:38 np0005532585.localdomain podman[104503]: 2025-11-23 08:57:38.144823926 +0000 UTC m=+0.191573903 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team)
Nov 23 08:57:38 np0005532585.localdomain podman[104503]: 2025-11-23 08:57:38.161181145 +0000 UTC m=+0.207931072 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Nov 23 08:57:38 np0005532585.localdomain podman[104503]: unhealthy
Nov 23 08:57:38 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:57:38 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:57:38 np0005532585.localdomain podman[104504]: 2025-11-23 08:57:38.173316812 +0000 UTC m=+0.216634113 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team)
Nov 23 08:57:38 np0005532585.localdomain podman[104504]: unhealthy
Nov 23 08:57:38 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:57:38 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:57:38 np0005532585.localdomain podman[104502]: 2025-11-23 08:57:38.30232175 +0000 UTC m=+0.349212711 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 23 08:57:38 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:58:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:58:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:58:01 np0005532585.localdomain podman[104567]: 2025-11-23 08:58:01.024949498 +0000 UTC m=+0.082589667 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid)
Nov 23 08:58:01 np0005532585.localdomain podman[104566]: 2025-11-23 08:58:01.062483285 +0000 UTC m=+0.122340903 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:58:01 np0005532585.localdomain podman[104566]: 2025-11-23 08:58:01.071634239 +0000 UTC m=+0.131491877 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-collectd, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 23 08:58:01 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:58:01 np0005532585.localdomain podman[104567]: 2025-11-23 08:58:01.11640226 +0000 UTC m=+0.174042409 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public)
Nov 23 08:58:01 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:58:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:58:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:58:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:58:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:58:04 np0005532585.localdomain podman[104605]: 2025-11-23 08:58:04.037064996 +0000 UTC m=+0.091387592 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 23 08:58:04 np0005532585.localdomain podman[104613]: 2025-11-23 08:58:04.09676464 +0000 UTC m=+0.137581276 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Nov 23 08:58:04 np0005532585.localdomain podman[104613]: 2025-11-23 08:58:04.14049216 +0000 UTC m=+0.181308776 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute)
Nov 23 08:58:04 np0005532585.localdomain podman[104613]: unhealthy
Nov 23 08:58:04 np0005532585.localdomain systemd[1]: tmp-crun.agXxkw.mount: Deactivated successfully.
Nov 23 08:58:04 np0005532585.localdomain podman[104606]: 2025-11-23 08:58:04.158059395 +0000 UTC m=+0.207496688 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:58:04 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:58:04 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:58:04 np0005532585.localdomain podman[104607]: 2025-11-23 08:58:04.207704288 +0000 UTC m=+0.254563961 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 23 08:58:04 np0005532585.localdomain podman[104606]: 2025-11-23 08:58:04.214326013 +0000 UTC m=+0.263763316 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:58:04 np0005532585.localdomain podman[104605]: 2025-11-23 08:58:04.222691693 +0000 UTC m=+0.277014279 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, tcib_managed=true, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044)
Nov 23 08:58:04 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:58:04 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:58:04 np0005532585.localdomain podman[104607]: 2025-11-23 08:58:04.267364142 +0000 UTC m=+0.314223795 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:58:04 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:58:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:58:06 np0005532585.localdomain podman[104700]: 2025-11-23 08:58:06.026633628 +0000 UTC m=+0.077482779 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:58:06 np0005532585.localdomain podman[104700]: 2025-11-23 08:58:06.369730659 +0000 UTC m=+0.420579780 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 23 08:58:06 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:58:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:58:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:58:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:58:09 np0005532585.localdomain systemd[1]: tmp-crun.2HryfH.mount: Deactivated successfully.
Nov 23 08:58:09 np0005532585.localdomain podman[104723]: 2025-11-23 08:58:09.037083854 +0000 UTC m=+0.092543228 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Nov 23 08:58:09 np0005532585.localdomain podman[104724]: 2025-11-23 08:58:09.087688826 +0000 UTC m=+0.140873179 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com)
Nov 23 08:58:09 np0005532585.localdomain podman[104724]: 2025-11-23 08:58:09.100856274 +0000 UTC m=+0.154040637 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, batch=17.1_20251118.1)
Nov 23 08:58:09 np0005532585.localdomain podman[104724]: unhealthy
Nov 23 08:58:09 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:58:09 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:58:09 np0005532585.localdomain podman[104725]: 2025-11-23 08:58:09.188600042 +0000 UTC m=+0.239268177 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller)
Nov 23 08:58:09 np0005532585.localdomain podman[104725]: 2025-11-23 08:58:09.205169937 +0000 UTC m=+0.255838042 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 23 08:58:09 np0005532585.localdomain podman[104725]: unhealthy
Nov 23 08:58:09 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:58:09 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:58:09 np0005532585.localdomain podman[104723]: 2025-11-23 08:58:09.252325631 +0000 UTC m=+0.307784985 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, 
config_id=tripleo_step1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 23 08:58:09 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:58:10 np0005532585.localdomain systemd[1]: tmp-crun.3W44de.mount: Deactivated successfully.
Nov 23 08:58:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:58:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:58:32 np0005532585.localdomain podman[104792]: 2025-11-23 08:58:32.026279561 +0000 UTC m=+0.080176873 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:58:32 np0005532585.localdomain podman[104792]: 2025-11-23 08:58:32.037238502 +0000 UTC m=+0.091135814 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:58:32 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:58:32 np0005532585.localdomain podman[104791]: 2025-11-23 08:58:32.122307255 +0000 UTC m=+0.178381474 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1)
Nov 23 08:58:32 np0005532585.localdomain podman[104791]: 2025-11-23 08:58:32.157283072 +0000 UTC m=+0.213357261 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 08:58:32 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:58:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:58:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:58:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:58:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: tmp-crun.fvgcfK.mount: Deactivated successfully.
Nov 23 08:58:35 np0005532585.localdomain podman[104831]: 2025-11-23 08:58:35.05258702 +0000 UTC m=+0.101791544 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: tmp-crun.QeDz2n.mount: Deactivated successfully.
Nov 23 08:58:35 np0005532585.localdomain podman[104833]: 2025-11-23 08:58:35.100087416 +0000 UTC m=+0.142415297 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 23 08:58:35 np0005532585.localdomain podman[104831]: 2025-11-23 08:58:35.133342569 +0000 UTC m=+0.182547033 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:58:35 np0005532585.localdomain podman[104830]: 2025-11-23 08:58:35.138414617 +0000 UTC m=+0.188319542 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 23 08:58:35 np0005532585.localdomain podman[104832]: 2025-11-23 08:58:35.205282435 +0000 UTC m=+0.250213917 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, version=17.1.12)
Nov 23 08:58:35 np0005532585.localdomain podman[104830]: 2025-11-23 08:58:35.219193347 +0000 UTC m=+0.269098292 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 
17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:58:35 np0005532585.localdomain podman[104832]: 2025-11-23 08:58:35.258320173 +0000 UTC m=+0.303251685 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:58:35 np0005532585.localdomain podman[104833]: 2025-11-23 08:58:35.271326887 +0000 UTC m=+0.313654768 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1)
Nov 23 08:58:35 np0005532585.localdomain podman[104833]: unhealthy
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:58:35 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:58:36 np0005532585.localdomain sudo[104926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:58:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:58:36 np0005532585.localdomain sudo[104926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:58:36 np0005532585.localdomain sudo[104926]: pam_unix(sudo:session): session closed for user root
Nov 23 08:58:36 np0005532585.localdomain sudo[104947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:58:36 np0005532585.localdomain sudo[104947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:58:36 np0005532585.localdomain systemd[1]: tmp-crun.DGio3p.mount: Deactivated successfully.
Nov 23 08:58:36 np0005532585.localdomain podman[104940]: 2025-11-23 08:58:36.902287597 +0000 UTC m=+0.102595159 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:58:37 np0005532585.localdomain podman[104940]: 2025-11-23 08:58:37.302325057 +0000 UTC m=+0.502632620 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 23 08:58:37 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:58:37 np0005532585.localdomain sudo[104947]: pam_unix(sudo:session): session closed for user root
Nov 23 08:58:38 np0005532585.localdomain sudo[105012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:58:38 np0005532585.localdomain sudo[105012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:58:38 np0005532585.localdomain sudo[105012]: pam_unix(sudo:session): session closed for user root
Nov 23 08:58:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:58:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:58:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:58:40 np0005532585.localdomain podman[105027]: 2025-11-23 08:58:40.030085899 +0000 UTC m=+0.078360606 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 08:58:40 np0005532585.localdomain podman[105028]: 2025-11-23 08:58:40.092013793 +0000 UTC m=+0.140544168 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 23 08:58:40 np0005532585.localdomain podman[105028]: 2025-11-23 08:58:40.106277026 +0000 UTC m=+0.154807421 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Nov 23 08:58:40 np0005532585.localdomain podman[105028]: unhealthy
Nov 23 08:58:40 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:58:40 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:58:40 np0005532585.localdomain systemd[1]: tmp-crun.v9VJvr.mount: Deactivated successfully.
Nov 23 08:58:40 np0005532585.localdomain podman[105029]: 2025-11-23 08:58:40.23898589 +0000 UTC m=+0.285214263 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 23 08:58:40 np0005532585.localdomain podman[105029]: 2025-11-23 08:58:40.253642716 +0000 UTC m=+0.299871029 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., container_name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:58:40 np0005532585.localdomain podman[105029]: unhealthy
Nov 23 08:58:40 np0005532585.localdomain podman[105027]: 2025-11-23 08:58:40.260680164 +0000 UTC m=+0.308954861 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:58:40 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:58:40 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:58:40 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:58:56 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 08:58:56 np0005532585.localdomain recover_tripleo_nova_virtqemud[105098]: 61756
Nov 23 08:58:56 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 08:58:56 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 08:59:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:59:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:59:03 np0005532585.localdomain podman[105100]: 2025-11-23 08:59:03.035689108 +0000 UTC m=+0.083453914 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 08:59:03 np0005532585.localdomain podman[105100]: 2025-11-23 08:59:03.076169587 +0000 UTC m=+0.123934343 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:59:03 np0005532585.localdomain systemd[1]: tmp-crun.OzLK2F.mount: Deactivated successfully.
Nov 23 08:59:03 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:59:03 np0005532585.localdomain podman[105099]: 2025-11-23 08:59:03.095740615 +0000 UTC m=+0.145943507 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 08:59:03 np0005532585.localdomain podman[105099]: 2025-11-23 08:59:03.179673703 +0000 UTC m=+0.229876625 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, container_name=collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:59:03 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:59:03 np0005532585.localdomain sshd[105138]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:59:03 np0005532585.localdomain sshd[105138]: Accepted publickey for zuul from 192.168.122.30 port 60194 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 08:59:03 np0005532585.localdomain systemd-logind[761]: New session 36 of user zuul.
Nov 23 08:59:03 np0005532585.localdomain systemd[1]: Started Session 36 of User zuul.
Nov 23 08:59:03 np0005532585.localdomain sshd[105138]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 08:59:04 np0005532585.localdomain sudo[105231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbvcpjorvauhfxtjfprytdhxuphqiyzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888343.954102-27-23459231312457/AnsiballZ_stat.py
Nov 23 08:59:04 np0005532585.localdomain sudo[105231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:04 np0005532585.localdomain python3.9[105233]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 08:59:04 np0005532585.localdomain sudo[105231]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:05 np0005532585.localdomain sudo[105325]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhnvyqkfetevfaxlyoumorwickufnoqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888344.8654423-63-278769079219804/AnsiballZ_command.py
Nov 23 08:59:05 np0005532585.localdomain sudo[105325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:59:05 np0005532585.localdomain podman[105331]: 2025-11-23 08:59:05.522544613 +0000 UTC m=+0.067037574 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 23 08:59:05 np0005532585.localdomain podman[105329]: 2025-11-23 08:59:05.588972668 +0000 UTC m=+0.134125469 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, 
architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true)
Nov 23 08:59:05 np0005532585.localdomain python3.9[105327]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 08:59:05 np0005532585.localdomain podman[105328]: 2025-11-23 08:59:05.638878607 +0000 UTC m=+0.180505559 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:32Z)
Nov 23 08:59:05 np0005532585.localdomain podman[105328]: 2025-11-23 08:59:05.646383871 +0000 UTC m=+0.188010863 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, version=17.1.12, container_name=logrotate_crond, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4)
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:59:05 np0005532585.localdomain podman[105329]: 2025-11-23 08:59:05.667025492 +0000 UTC m=+0.212178293 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:59:05 np0005532585.localdomain podman[105331]: 2025-11-23 08:59:05.669735596 +0000 UTC m=+0.214228637 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 08:59:05 np0005532585.localdomain sudo[105325]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:05 np0005532585.localdomain podman[105330]: 2025-11-23 08:59:05.567420618 +0000 UTC m=+0.112957081 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Nov 23 08:59:05 np0005532585.localdomain podman[105330]: 2025-11-23 08:59:05.74805559 +0000 UTC m=+0.293592033 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, architecture=x86_64)
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:59:05 np0005532585.localdomain podman[105331]: unhealthy
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:59:05 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:59:06 np0005532585.localdomain sudo[105513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwgceushunvjsojtituuhfkqufzhitsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888345.9128897-87-57327129624343/AnsiballZ_stat.py
Nov 23 08:59:06 np0005532585.localdomain sudo[105513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:06 np0005532585.localdomain python3.9[105515]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 08:59:06 np0005532585.localdomain sudo[105513]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:06 np0005532585.localdomain sudo[105607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsynjuhsftpabvscviegeaytcxmuteeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888346.5913508-111-24549590470067/AnsiballZ_command.py
Nov 23 08:59:06 np0005532585.localdomain sudo[105607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:07 np0005532585.localdomain python3.9[105609]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 08:59:07 np0005532585.localdomain sudo[105607]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:07 np0005532585.localdomain sudo[105700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcnipczxudxolvoinufmageuxrxqrbxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888347.411076-138-273140134534541/AnsiballZ_command.py
Nov 23 08:59:07 np0005532585.localdomain sudo[105700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:59:07 np0005532585.localdomain systemd[1]: tmp-crun.xJGjog.mount: Deactivated successfully.
Nov 23 08:59:07 np0005532585.localdomain podman[105703]: 2025-11-23 08:59:07.731795772 +0000 UTC m=+0.067741276 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:59:07 np0005532585.localdomain python3.9[105702]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 08:59:07 np0005532585.localdomain sudo[105700]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:08 np0005532585.localdomain podman[105703]: 2025-11-23 08:59:08.086345109 +0000 UTC m=+0.422290643 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 08:59:08 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:59:08 np0005532585.localdomain python3.9[105816]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 23 08:59:10 np0005532585.localdomain python3.9[105906]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 08:59:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:59:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:59:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:59:11 np0005532585.localdomain podman[105938]: 2025-11-23 08:59:11.049773954 +0000 UTC m=+0.094108896 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git)
Nov 23 08:59:11 np0005532585.localdomain podman[105937]: 2025-11-23 08:59:11.011065301 +0000 UTC m=+0.060930655 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12)
Nov 23 08:59:11 np0005532585.localdomain podman[105938]: 2025-11-23 08:59:11.094464952 +0000 UTC m=+0.138799874 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, version=17.1.12, container_name=ovn_controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 08:59:11 np0005532585.localdomain podman[105938]: unhealthy
Nov 23 08:59:11 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:59:11 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:59:11 np0005532585.localdomain podman[105936]: 2025-11-23 08:59:11.112020168 +0000 UTC m=+0.161317514 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, container_name=metrics_qdr)
Nov 23 08:59:11 np0005532585.localdomain podman[105937]: 2025-11-23 08:59:11.1449325 +0000 UTC m=+0.194797864 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Nov 23 08:59:11 np0005532585.localdomain podman[105937]: unhealthy
Nov 23 08:59:11 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:59:11 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:59:11 np0005532585.localdomain podman[105936]: 2025-11-23 08:59:11.31936071 +0000 UTC m=+0.368658036 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 08:59:11 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:59:11 np0005532585.localdomain python3.9[106065]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 23 08:59:12 np0005532585.localdomain python3.9[106155]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 08:59:13 np0005532585.localdomain python3.9[106203]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 08:59:13 np0005532585.localdomain sshd[105138]: pam_unix(sshd:session): session closed for user zuul
Nov 23 08:59:13 np0005532585.localdomain systemd[1]: session-36.scope: Deactivated successfully.
Nov 23 08:59:13 np0005532585.localdomain systemd[1]: session-36.scope: Consumed 4.717s CPU time.
Nov 23 08:59:13 np0005532585.localdomain systemd-logind[761]: Session 36 logged out. Waiting for processes to exit.
Nov 23 08:59:13 np0005532585.localdomain systemd-logind[761]: Removed session 36.
Nov 23 08:59:20 np0005532585.localdomain sshd[106219]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 08:59:20 np0005532585.localdomain sshd[106219]: Accepted publickey for zuul from 192.168.122.30 port 45020 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 08:59:20 np0005532585.localdomain systemd-logind[761]: New session 37 of user zuul.
Nov 23 08:59:20 np0005532585.localdomain systemd[1]: Started Session 37 of User zuul.
Nov 23 08:59:20 np0005532585.localdomain sshd[106219]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 08:59:21 np0005532585.localdomain sudo[106312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhabcowlhswkwxxjjddxouqkqhdxtbmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888361.0824375-24-262584111087836/AnsiballZ_systemd_service.py
Nov 23 08:59:21 np0005532585.localdomain sudo[106312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:22 np0005532585.localdomain python3.9[106314]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 08:59:22 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:59:22 np0005532585.localdomain systemd-rc-local-generator[106337]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:59:22 np0005532585.localdomain systemd-sysv-generator[106343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:59:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:59:22 np0005532585.localdomain sudo[106312]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:23 np0005532585.localdomain python3.9[106440]: ansible-ansible.builtin.service_facts Invoked
Nov 23 08:59:23 np0005532585.localdomain network[106457]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 08:59:23 np0005532585.localdomain network[106458]: 'network-scripts' will be removed from distribution in near future.
Nov 23 08:59:23 np0005532585.localdomain network[106459]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 08:59:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:59:27 np0005532585.localdomain python3.9[106656]: ansible-ansible.builtin.service_facts Invoked
Nov 23 08:59:27 np0005532585.localdomain network[106673]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 08:59:27 np0005532585.localdomain network[106674]: 'network-scripts' will be removed from distribution in near future.
Nov 23 08:59:27 np0005532585.localdomain network[106675]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 08:59:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:59:31 np0005532585.localdomain sudo[106872]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkmctedbghndnaneumbqhaplsguyieun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888370.8272278-114-67007771652297/AnsiballZ_systemd_service.py
Nov 23 08:59:31 np0005532585.localdomain sudo[106872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 08:59:31 np0005532585.localdomain python3.9[106874]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 08:59:31 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 08:59:31 np0005532585.localdomain systemd-rc-local-generator[106898]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 08:59:31 np0005532585.localdomain systemd-sysv-generator[106906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 08:59:31 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 08:59:31 np0005532585.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Nov 23 08:59:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 08:59:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 08:59:34 np0005532585.localdomain podman[106929]: 2025-11-23 08:59:34.028065327 +0000 UTC m=+0.079665516 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1)
Nov 23 08:59:34 np0005532585.localdomain podman[106928]: 2025-11-23 08:59:34.077734551 +0000 UTC m=+0.132364574 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4)
Nov 23 08:59:34 np0005532585.localdomain podman[106928]: 2025-11-23 08:59:34.0893113 +0000 UTC m=+0.143941323 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, release=1761123044, 
batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Nov 23 08:59:34 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 08:59:34 np0005532585.localdomain podman[106929]: 2025-11-23 08:59:34.142765141 +0000 UTC m=+0.194365280 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 23 08:59:34 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 08:59:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 08:59:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 08:59:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 08:59:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 08:59:36 np0005532585.localdomain podman[106969]: 2025-11-23 08:59:36.013238851 +0000 UTC m=+0.062514653 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Nov 23 08:59:36 np0005532585.localdomain podman[106968]: Error: container 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 is not running
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Main process exited, code=exited, status=125/n/a
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed with result 'exit-code'.
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: tmp-crun.wzhc5v.mount: Deactivated successfully.
Nov 23 08:59:36 np0005532585.localdomain podman[106979]: 2025-11-23 08:59:36.078976854 +0000 UTC m=+0.120163814 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 08:59:36 np0005532585.localdomain podman[106969]: 2025-11-23 08:59:36.085726194 +0000 UTC m=+0.135002066 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 23 08:59:36 np0005532585.localdomain podman[106979]: 2025-11-23 08:59:36.095182637 +0000 UTC m=+0.136369637 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, container_name=nova_compute, config_id=tripleo_step5, batch=17.1_20251118.1)
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 08:59:36 np0005532585.localdomain podman[106979]: unhealthy
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 08:59:36 np0005532585.localdomain podman[106967]: 2025-11-23 08:59:36.164652886 +0000 UTC m=+0.216844328 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 08:59:36 np0005532585.localdomain podman[106967]: 2025-11-23 08:59:36.172376466 +0000 UTC m=+0.224567998 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:59:36 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 08:59:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 08:59:38 np0005532585.localdomain podman[107048]: 2025-11-23 08:59:38.253173134 +0000 UTC m=+0.064758674 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 08:59:38 np0005532585.localdomain sudo[107070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 08:59:38 np0005532585.localdomain sudo[107070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:59:38 np0005532585.localdomain sudo[107070]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:38 np0005532585.localdomain sudo[107086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 08:59:38 np0005532585.localdomain sudo[107086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:59:38 np0005532585.localdomain podman[107048]: 2025-11-23 08:59:38.671486492 +0000 UTC m=+0.483072072 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=)
Nov 23 08:59:38 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 08:59:39 np0005532585.localdomain sudo[107086]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:39 np0005532585.localdomain sudo[107133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 08:59:39 np0005532585.localdomain sudo[107133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 08:59:39 np0005532585.localdomain sudo[107133]: pam_unix(sudo:session): session closed for user root
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: tmp-crun.0y2C73.mount: Deactivated successfully.
Nov 23 08:59:41 np0005532585.localdomain podman[107148]: 2025-11-23 08:59:41.545508559 +0000 UTC m=+0.099032039 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, version=17.1.12, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 08:59:41 np0005532585.localdomain podman[107149]: 2025-11-23 08:59:41.642586895 +0000 UTC m=+0.193864655 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 08:59:41 np0005532585.localdomain podman[107150]: 2025-11-23 08:59:41.614024167 +0000 UTC m=+0.161473168 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 08:59:41 np0005532585.localdomain podman[107149]: 2025-11-23 08:59:41.686432108 +0000 UTC m=+0.237709858 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com)
Nov 23 08:59:41 np0005532585.localdomain podman[107149]: unhealthy
Nov 23 08:59:41 np0005532585.localdomain podman[107150]: 2025-11-23 08:59:41.697337266 +0000 UTC m=+0.244786327 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 08:59:41 np0005532585.localdomain podman[107150]: unhealthy
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 08:59:41 np0005532585.localdomain podman[107148]: 2025-11-23 08:59:41.74541025 +0000 UTC m=+0.298933720 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public)
Nov 23 08:59:41 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 08:59:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2621 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B423260000000001030307) 
Nov 23 08:59:42 np0005532585.localdomain systemd[1]: tmp-crun.eT6Kbg.mount: Deactivated successfully.
Nov 23 08:59:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2622 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B427200000000001030307) 
Nov 23 08:59:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2623 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B42F200000000001030307) 
Nov 23 08:59:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10032 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4346B0000000001030307) 
Nov 23 08:59:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10033 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B438600000000001030307) 
Nov 23 08:59:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2624 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B43EE00000000001030307) 
Nov 23 08:59:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10034 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B440600000000001030307) 
Nov 23 08:59:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10035 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B450200000000001030307) 
Nov 23 08:59:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2625 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B460210000000001030307) 
Nov 23 08:59:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20543 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4692A0000000001030307) 
Nov 23 08:59:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26845 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B469DA0000000001030307) 
Nov 23 09:00:00 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20544 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B46D200000000001030307) 
Nov 23 09:00:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26846 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B46DE10000000001030307) 
Nov 23 09:00:01 np0005532585.localdomain CROND[107217]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Nov 23 09:00:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10036 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B470210000000001030307) 
Nov 23 09:00:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25363 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B472500000000001030307) 
Nov 23 09:00:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20545 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B475210000000001030307) 
Nov 23 09:00:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26847 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B475E00000000001030307) 
Nov 23 09:00:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25364 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B476610000000001030307) 
Nov 23 09:00:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 09:00:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 09:00:04 np0005532585.localdomain podman[107221]: 2025-11-23 09:00:04.295199577 +0000 UTC m=+0.082536717 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, vcs-type=git)
Nov 23 09:00:04 np0005532585.localdomain podman[107221]: 2025-11-23 09:00:04.329610646 +0000 UTC m=+0.116947796 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044)
Nov 23 09:00:04 np0005532585.localdomain podman[107220]: 2025-11-23 09:00:04.344389064 +0000 UTC m=+0.134420267 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, 
config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 23 09:00:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 09:00:04 np0005532585.localdomain podman[107220]: 2025-11-23 09:00:04.349939987 +0000 UTC m=+0.139971220 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12)
Nov 23 09:00:04 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 09:00:05 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25365 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B47E600000000001030307) 
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 09:00:06 np0005532585.localdomain podman[107261]: 2025-11-23 09:00:06.522445773 +0000 UTC m=+0.071593096 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:12:45Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Nov 23 09:00:06 np0005532585.localdomain podman[107261]: 2025-11-23 09:00:06.578290329 +0000 UTC m=+0.127437652 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com)
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: tmp-crun.YLWXYc.mount: Deactivated successfully.
Nov 23 09:00:06 np0005532585.localdomain podman[107262]: 2025-11-23 09:00:06.591089396 +0000 UTC m=+0.140892099 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4)
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully.
Nov 23 09:00:06 np0005532585.localdomain podman[107260]: Error: container 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 is not running
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Main process exited, code=exited, status=125/n/a
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed with result 'exit-code'.
Nov 23 09:00:06 np0005532585.localdomain podman[107259]: 2025-11-23 09:00:06.683177397 +0000 UTC m=+0.239603645 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, container_name=logrotate_crond)
Nov 23 09:00:06 np0005532585.localdomain podman[107262]: 2025-11-23 09:00:06.709165575 +0000 UTC m=+0.258968328 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 09:00:06 np0005532585.localdomain podman[107262]: unhealthy
Nov 23 09:00:06 np0005532585.localdomain podman[107259]: 2025-11-23 09:00:06.717593237 +0000 UTC m=+0.274019435 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron)
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 09:00:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 09:00:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20546 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B484E00000000001030307) 
Nov 23 09:00:07 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26848 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B485A00000000001030307) 
Nov 23 09:00:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 09:00:09 np0005532585.localdomain systemd[1]: tmp-crun.sbcHqp.mount: Deactivated successfully.
Nov 23 09:00:09 np0005532585.localdomain podman[107338]: 2025-11-23 09:00:09.016143121 +0000 UTC m=+0.073111623 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, 
io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Nov 23 09:00:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25366 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B48E210000000001030307) 
Nov 23 09:00:09 np0005532585.localdomain podman[107338]: 2025-11-23 09:00:09.397425968 +0000 UTC m=+0.454394450 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, 
architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Nov 23 09:00:09 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 09:00:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33852 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B498560000000001030307) 
Nov 23 09:00:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 09:00:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:00:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:00:12 np0005532585.localdomain podman[107361]: 2025-11-23 09:00:12.029484804 +0000 UTC m=+0.079823431 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Nov 23 09:00:12 np0005532585.localdomain systemd[1]: tmp-crun.vtmvH2.mount: Deactivated successfully.
Nov 23 09:00:12 np0005532585.localdomain podman[107363]: 2025-11-23 09:00:12.103453383 +0000 UTC m=+0.146630237 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 09:00:12 np0005532585.localdomain podman[107362]: 2025-11-23 09:00:12.140504534 +0000 UTC m=+0.187093865 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 23 09:00:12 np0005532585.localdomain podman[107363]: 2025-11-23 09:00:12.187286178 +0000 UTC m=+0.230463002 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, container_name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 09:00:12 np0005532585.localdomain podman[107363]: unhealthy
Nov 23 09:00:12 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:00:12 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:00:12 np0005532585.localdomain podman[107362]: 2025-11-23 09:00:12.20696985 +0000 UTC m=+0.253559141 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, 
config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 23 09:00:12 np0005532585.localdomain podman[107362]: unhealthy
Nov 23 09:00:12 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:00:12 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:00:12 np0005532585.localdomain podman[107361]: 2025-11-23 09:00:12.219638033 +0000 UTC m=+0.269976620 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 23 09:00:12 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 09:00:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33853 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B49C600000000001030307) 
Nov 23 09:00:13 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2626 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4A0200000000001030307) 
Nov 23 09:00:13 np0005532585.localdomain podman[106914]: time="2025-11-23T09:00:13Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Nov 23 09:00:13 np0005532585.localdomain systemd[1]: libpod-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope: Deactivated successfully.
Nov 23 09:00:13 np0005532585.localdomain systemd[1]: libpod-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope: Consumed 6.235s CPU time.
Nov 23 09:00:13 np0005532585.localdomain podman[106914]: 2025-11-23 09:00:13.995664431 +0000 UTC m=+42.091552544 container stop 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 09:00:14 np0005532585.localdomain podman[106914]: 2025-11-23 09:00:14.03328767 +0000 UTC m=+42.129175763 container died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=)
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: Deactivated successfully.
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: No such file or directory
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9-userdata-shm.mount: Deactivated successfully.
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-43834aabac3051c95b0bd48b6a3d7296604e45656eac8be0b6aa4803a8bc68b2-merged.mount: Deactivated successfully.
Nov 23 09:00:14 np0005532585.localdomain podman[106914]: 2025-11-23 09:00:14.138839779 +0000 UTC m=+42.234727802 container cleanup 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 09:00:14 np0005532585.localdomain podman[106914]: ceilometer_agent_compute
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: No such file or directory
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: No such file or directory
Nov 23 09:00:14 np0005532585.localdomain podman[107431]: 2025-11-23 09:00:14.153701502 +0000 UTC m=+0.143861592 container cleanup 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: libpod-conmon-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope: Deactivated successfully.
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: No such file or directory
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: No such file or directory
Nov 23 09:00:14 np0005532585.localdomain podman[107445]: 2025-11-23 09:00:14.262501293 +0000 UTC m=+0.079147201 container cleanup 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 23 09:00:14 np0005532585.localdomain podman[107445]: ceilometer_agent_compute
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.048s CPU time, no IO.
Nov 23 09:00:14 np0005532585.localdomain sudo[106872]: pam_unix(sudo:session): session closed for user root
Nov 23 09:00:14 np0005532585.localdomain sudo[107547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcublxtjlsflklhnvuktgmjnlrjdhmgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888414.4224265-114-166985223959493/AnsiballZ_systemd_service.py
Nov 23 09:00:14 np0005532585.localdomain sudo[107547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:00:14 np0005532585.localdomain python3.9[107549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:00:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33854 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4A4600000000001030307) 
Nov 23 09:00:14 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:00:15 np0005532585.localdomain systemd-sysv-generator[107579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:00:15 np0005532585.localdomain systemd-rc-local-generator[107576]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:00:15 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:00:15 np0005532585.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Nov 23 09:00:16 np0005532585.localdomain sshd[107603]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:00:16 np0005532585.localdomain sshd[107603]: Invalid user sol from 193.32.162.146 port 41554
Nov 23 09:00:16 np0005532585.localdomain sshd[107603]: Connection closed by invalid user sol 193.32.162.146 port 41554 [preauth]
Nov 23 09:00:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10037 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4B0210000000001030307) 
Nov 23 09:00:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19456 DF PROTO=TCP SPT=35234 DPT=9102 SEQ=3403046721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4C5610000000001030307) 
Nov 23 09:00:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:5b:6f:0e MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=41938 SEQ=2615091442 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Nov 23 09:00:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33856 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4D4210000000001030307) 
Nov 23 09:00:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32818 DF PROTO=TCP SPT=41184 DPT=9105 SEQ=2384712410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4DF0B0000000001030307) 
Nov 23 09:00:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14926 DF PROTO=TCP SPT=45750 DPT=9101 SEQ=1924788322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4EA610000000001030307) 
Nov 23 09:00:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 09:00:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 09:00:34 np0005532585.localdomain podman[107605]: 2025-11-23 09:00:34.528804116 +0000 UTC m=+0.084424794 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 23 09:00:34 np0005532585.localdomain podman[107606]: 2025-11-23 09:00:34.57399742 +0000 UTC m=+0.128990909 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1)
Nov 23 09:00:34 np0005532585.localdomain podman[107605]: 2025-11-23 09:00:34.595012033 +0000 UTC m=+0.150632751 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1)
Nov 23 09:00:34 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 09:00:34 np0005532585.localdomain podman[107606]: 2025-11-23 09:00:34.610304478 +0000 UTC m=+0.165297947 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, container_name=iscsid, build-date=2025-11-18T23:44:13Z, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 23 09:00:34 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully.
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 09:00:36 np0005532585.localdomain podman[107644]: Error: container d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 is not running
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Main process exited, code=exited, status=125/n/a
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed with result 'exit-code'.
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 09:00:36 np0005532585.localdomain podman[107655]: 2025-11-23 09:00:36.81924929 +0000 UTC m=+0.050560122 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond)
Nov 23 09:00:36 np0005532585.localdomain podman[107655]: 2025-11-23 09:00:36.857331142 +0000 UTC m=+0.088641994 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true)
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: tmp-crun.3EnnxA.mount: Deactivated successfully.
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 09:00:36 np0005532585.localdomain podman[107656]: 2025-11-23 09:00:36.874611549 +0000 UTC m=+0.098320226 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container)
Nov 23 09:00:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14927 DF PROTO=TCP SPT=45750 DPT=9101 SEQ=1924788322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4FA200000000001030307) 
Nov 23 09:00:36 np0005532585.localdomain podman[107656]: 2025-11-23 09:00:36.914313154 +0000 UTC m=+0.138021821 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_compute, io.buildah.version=1.41.4)
Nov 23 09:00:36 np0005532585.localdomain podman[107656]: unhealthy
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:00:36 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 09:00:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3890 DF PROTO=TCP SPT=48150 DPT=9100 SEQ=3852951719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B503610000000001030307) 
Nov 23 09:00:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 09:00:39 np0005532585.localdomain systemd[1]: tmp-crun.0GqwL5.mount: Deactivated successfully.
Nov 23 09:00:39 np0005532585.localdomain podman[107695]: 2025-11-23 09:00:39.531789778 +0000 UTC m=+0.085060925 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container)
Nov 23 09:00:39 np0005532585.localdomain podman[107695]: 2025-11-23 09:00:39.90258957 +0000 UTC m=+0.455860697 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_migration_target, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute)
Nov 23 09:00:39 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 09:00:39 np0005532585.localdomain sudo[107718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:00:39 np0005532585.localdomain sudo[107718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:00:39 np0005532585.localdomain sudo[107718]: pam_unix(sudo:session): session closed for user root
Nov 23 09:00:40 np0005532585.localdomain sudo[107733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:00:40 np0005532585.localdomain sudo[107733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:00:40 np0005532585.localdomain sudo[107733]: pam_unix(sudo:session): session closed for user root
Nov 23 09:00:41 np0005532585.localdomain sudo[107781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:00:41 np0005532585.localdomain sudo[107781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:00:41 np0005532585.localdomain sudo[107781]: pam_unix(sudo:session): session closed for user root
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:00:42 np0005532585.localdomain podman[107796]: 2025-11-23 09:00:42.796251915 +0000 UTC m=+0.099550394 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64)
Nov 23 09:00:42 np0005532585.localdomain podman[107798]: 2025-11-23 09:00:42.828456476 +0000 UTC m=+0.119275068 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: tmp-crun.o2kKJ4.mount: Deactivated successfully.
Nov 23 09:00:42 np0005532585.localdomain podman[107797]: 2025-11-23 09:00:42.861315896 +0000 UTC m=+0.161089006 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044)
Nov 23 09:00:42 np0005532585.localdomain podman[107798]: 2025-11-23 09:00:42.872315428 +0000 UTC m=+0.163133990 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 09:00:42 np0005532585.localdomain podman[107798]: unhealthy
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:00:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54848 DF PROTO=TCP SPT=33826 DPT=9882 SEQ=955523070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B511A00000000001030307) 
Nov 23 09:00:42 np0005532585.localdomain podman[107797]: 2025-11-23 09:00:42.925555543 +0000 UTC m=+0.225328643 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent)
Nov 23 09:00:42 np0005532585.localdomain podman[107797]: unhealthy
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:00:42 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:00:43 np0005532585.localdomain podman[107796]: 2025-11-23 09:00:43.005242959 +0000 UTC m=+0.308541438 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Nov 23 09:00:43 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 09:00:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54849 DF PROTO=TCP SPT=33826 DPT=9882 SEQ=955523070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B519A00000000001030307) 
Nov 23 09:00:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19458 DF PROTO=TCP SPT=35234 DPT=9102 SEQ=3403046721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B526200000000001030307) 
Nov 23 09:00:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44633 DF PROTO=TCP SPT=51672 DPT=9102 SEQ=4211352798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B53AA00000000001030307) 
Nov 23 09:00:54 np0005532585.localdomain CROND[107216]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Nov 23 09:00:54 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 09:00:54 np0005532585.localdomain recover_tripleo_nova_virtqemud[107865]: 61756
Nov 23 09:00:54 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 09:00:54 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 09:00:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54851 DF PROTO=TCP SPT=33826 DPT=9882 SEQ=955523070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B54A200000000001030307) 
Nov 23 09:00:57 np0005532585.localdomain podman[107589]: time="2025-11-23T09:00:57Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: libpod-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope: Deactivated successfully.
Nov 23 09:00:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:5b:6f:0e MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=41944 SEQ=2074507670 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: libpod-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope: Consumed 6.186s CPU time.
Nov 23 09:00:57 np0005532585.localdomain podman[107589]: 2025-11-23 09:00:57.44946856 +0000 UTC m=+42.092499030 container stop d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 23 09:00:57 np0005532585.localdomain podman[107589]: 2025-11-23 09:00:57.483054263 +0000 UTC m=+42.126084743 container died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: Deactivated successfully.
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: No such file or directory
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8-userdata-shm.mount: Deactivated successfully.
Nov 23 09:00:57 np0005532585.localdomain podman[107589]: 2025-11-23 09:00:57.591285536 +0000 UTC m=+42.234315976 container cleanup d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git)
Nov 23 09:00:57 np0005532585.localdomain podman[107589]: ceilometer_agent_ipmi
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: No such file or directory
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: No such file or directory
Nov 23 09:00:57 np0005532585.localdomain podman[107867]: 2025-11-23 09:00:57.60877318 +0000 UTC m=+0.145951396 container cleanup d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: libpod-conmon-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope: Deactivated successfully.
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: No such file or directory
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: No such file or directory
Nov 23 09:00:57 np0005532585.localdomain podman[107882]: 2025-11-23 09:00:57.713145913 +0000 UTC m=+0.069578664 container cleanup d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 23 09:00:57 np0005532585.localdomain podman[107882]: ceilometer_agent_ipmi
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Nov 23 09:00:57 np0005532585.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Nov 23 09:00:57 np0005532585.localdomain sudo[107547]: pam_unix(sudo:session): session closed for user root
Nov 23 09:00:58 np0005532585.localdomain sudo[107982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hljfigvzijvssdfltveqzmkwzraxklqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888457.8961825-114-53267435267468/AnsiballZ_systemd_service.py
Nov 23 09:00:58 np0005532585.localdomain sudo[107982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:00:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a8785be8dea5fa0361315af1fc74fe453e62e737d2ff3d773f6811b45d15cd9a-merged.mount: Deactivated successfully.
Nov 23 09:00:58 np0005532585.localdomain python3.9[107984]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:00:58 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:00:58 np0005532585.localdomain systemd-sysv-generator[108009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:00:58 np0005532585.localdomain systemd-rc-local-generator[108006]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:00:58 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:00:58 np0005532585.localdomain systemd[1]: Stopping collectd container...
Nov 23 09:00:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48175 DF PROTO=TCP SPT=38494 DPT=9105 SEQ=3740944446 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5543B0000000001030307) 
Nov 23 09:01:01 np0005532585.localdomain CROND[108037]: (root) CMD (run-parts /etc/cron.hourly)
Nov 23 09:01:01 np0005532585.localdomain run-parts[108040]: (/etc/cron.hourly) starting 0anacron
Nov 23 09:01:01 np0005532585.localdomain anacron[108048]: Anacron started on 2025-11-23
Nov 23 09:01:01 np0005532585.localdomain anacron[108048]: Will run job `cron.daily' in 18 min.
Nov 23 09:01:01 np0005532585.localdomain anacron[108048]: Will run job `cron.weekly' in 38 min.
Nov 23 09:01:01 np0005532585.localdomain anacron[108048]: Will run job `cron.monthly' in 58 min.
Nov 23 09:01:01 np0005532585.localdomain anacron[108048]: Jobs will be executed sequentially
Nov 23 09:01:01 np0005532585.localdomain run-parts[108050]: (/etc/cron.hourly) finished 0anacron
Nov 23 09:01:01 np0005532585.localdomain CROND[108036]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 23 09:01:01 np0005532585.localdomain CROND[108052]: (root) CMD (run-parts /etc/cron.hourly)
Nov 23 09:01:01 np0005532585.localdomain run-parts[108055]: (/etc/cron.hourly) starting 0anacron
Nov 23 09:01:01 np0005532585.localdomain run-parts[108061]: (/etc/cron.hourly) finished 0anacron
Nov 23 09:01:01 np0005532585.localdomain CROND[108051]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: libpod-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope: Deactivated successfully.
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: libpod-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope: Consumed 2.080s CPU time.
Nov 23 09:01:02 np0005532585.localdomain podman[108024]: 2025-11-23 09:01:02.641099993 +0000 UTC m=+3.758452891 container died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, container_name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: Deactivated successfully.
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: No such file or directory
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb-userdata-shm.mount: Deactivated successfully.
Nov 23 09:01:02 np0005532585.localdomain podman[108024]: 2025-11-23 09:01:02.759037167 +0000 UTC m=+3.876390065 container cleanup 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, distribution-scope=public, version=17.1.12, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:01:02 np0005532585.localdomain podman[108024]: collectd
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: No such file or directory
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: No such file or directory
Nov 23 09:01:02 np0005532585.localdomain podman[108063]: 2025-11-23 09:01:02.783932131 +0000 UTC m=+0.131741955 container cleanup 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: libpod-conmon-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope: Deactivated successfully.
Nov 23 09:01:02 np0005532585.localdomain podman[108096]: error opening file `/run/crun/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb/status`: No such file or directory
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: No such file or directory
Nov 23 09:01:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51060 DF PROTO=TCP SPT=50522 DPT=9101 SEQ=2870176415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B55FA00000000001030307) 
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: No such file or directory
Nov 23 09:01:02 np0005532585.localdomain podman[108085]: 2025-11-23 09:01:02.897157279 +0000 UTC m=+0.085632022 container cleanup 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 09:01:02 np0005532585.localdomain podman[108085]: collectd
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Nov 23 09:01:02 np0005532585.localdomain systemd[1]: Stopped collectd container.
Nov 23 09:01:02 np0005532585.localdomain sudo[107982]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:03 np0005532585.localdomain sudo[108190]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icnbpsrtjqaimnanwknrelccrlhtoraq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888463.0562775-114-178633020891797/AnsiballZ_systemd_service.py
Nov 23 09:01:03 np0005532585.localdomain sudo[108190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:03 np0005532585.localdomain python3.9[108192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad-merged.mount: Deactivated successfully.
Nov 23 09:01:03 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:01:03 np0005532585.localdomain systemd-rc-local-generator[108216]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:01:03 np0005532585.localdomain systemd-sysv-generator[108220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:01:03 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:01:03 np0005532585.localdomain systemd[1]: Stopping iscsid container...
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: libpod-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope: Deactivated successfully.
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: libpod-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope: Consumed 1.095s CPU time.
Nov 23 09:01:04 np0005532585.localdomain podman[108232]: 2025-11-23 09:01:04.063393518 +0000 UTC m=+0.079843613 container died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: Deactivated successfully.
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: No such file or directory
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a-userdata-shm.mount: Deactivated successfully.
Nov 23 09:01:04 np0005532585.localdomain podman[108232]: 2025-11-23 09:01:04.097878779 +0000 UTC m=+0.114328884 container cleanup 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack 
Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git)
Nov 23 09:01:04 np0005532585.localdomain podman[108232]: iscsid
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: No such file or directory
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: No such file or directory
Nov 23 09:01:04 np0005532585.localdomain podman[108244]: 2025-11-23 09:01:04.130779752 +0000 UTC m=+0.058277202 container cleanup 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, 
summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12)
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: libpod-conmon-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope: Deactivated successfully.
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: No such file or directory
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: No such file or directory
Nov 23 09:01:04 np0005532585.localdomain podman[108258]: 2025-11-23 09:01:04.215197545 +0000 UTC m=+0.052545624 container cleanup 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 09:01:04 np0005532585.localdomain podman[108258]: iscsid
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: Stopped iscsid container.
Nov 23 09:01:04 np0005532585.localdomain sudo[108190]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250-merged.mount: Deactivated successfully.
Nov 23 09:01:04 np0005532585.localdomain sudo[108360]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzjuyhnvzcpndznyacwkgkwmdahienqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888464.4195986-114-77148895588152/AnsiballZ_systemd_service.py
Nov 23 09:01:04 np0005532585.localdomain sudo[108360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:04 np0005532585.localdomain python3.9[108362]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:01:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25369 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B56C200000000001030307) 
Nov 23 09:01:06 np0005532585.localdomain systemd-sysv-generator[108396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:01:06 np0005532585.localdomain systemd-rc-local-generator[108393]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: Stopping logrotate_crond container...
Nov 23 09:01:06 np0005532585.localdomain crond[68839]: (CRON) INFO (Shutting down)
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: libpod-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.scope: Deactivated successfully.
Nov 23 09:01:06 np0005532585.localdomain podman[108404]: 2025-11-23 09:01:06.473119066 +0000 UTC m=+0.067590741 container died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: Deactivated successfully.
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: No such file or directory
Nov 23 09:01:06 np0005532585.localdomain podman[108404]: 2025-11-23 09:01:06.528646971 +0000 UTC m=+0.123118556 container cleanup 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, 
container_name=logrotate_crond, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 23 09:01:06 np0005532585.localdomain podman[108404]: logrotate_crond
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: No such file or directory
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: No such file or directory
Nov 23 09:01:06 np0005532585.localdomain podman[108416]: 2025-11-23 09:01:06.566067014 +0000 UTC m=+0.086687375 container cleanup 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1761123044, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: libpod-conmon-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.scope: Deactivated successfully.
Nov 23 09:01:06 np0005532585.localdomain podman[108444]: error opening file `/run/crun/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c/status`: No such file or directory
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: No such file or directory
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: No such file or directory
Nov 23 09:01:06 np0005532585.localdomain podman[108433]: 2025-11-23 09:01:06.673290845 +0000 UTC m=+0.075641871 container cleanup 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z)
Nov 23 09:01:06 np0005532585.localdomain podman[108433]: logrotate_crond
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Nov 23 09:01:06 np0005532585.localdomain systemd[1]: Stopped logrotate_crond container.
Nov 23 09:01:06 np0005532585.localdomain sudo[108360]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:07 np0005532585.localdomain sudo[108536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nykluqbfwothcligdddjvzpufnnihqaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888466.8827407-114-232380096672745/AnsiballZ_systemd_service.py
Nov 23 09:01:07 np0005532585.localdomain sudo[108536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 09:01:07 np0005532585.localdomain podman[108539]: 2025-11-23 09:01:07.261018708 +0000 UTC m=+0.060039256 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:01:07 np0005532585.localdomain podman[108539]: 2025-11-23 09:01:07.288281336 +0000 UTC m=+0.087301954 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 23 09:01:07 np0005532585.localdomain podman[108539]: unhealthy
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-503e987379308b9e6b9946670c4ac6382bcf032235dd53dd51b9045ac75aedc7-merged.mount: Deactivated successfully.
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c-userdata-shm.mount: Deactivated successfully.
Nov 23 09:01:07 np0005532585.localdomain python3.9[108538]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:01:07 np0005532585.localdomain systemd-sysv-generator[108588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:01:07 np0005532585.localdomain systemd-rc-local-generator[108584]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: Stopping metrics_qdr container...
Nov 23 09:01:07 np0005532585.localdomain kernel: qdrouterd[54470]: segfault at 0 ip 00007fe453f4d7cb sp 00007fff8ee61160 error 4 in libc.so.6[7fe453eea000+175000]
Nov 23 09:01:07 np0005532585.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Nov 23 09:01:07 np0005532585.localdomain systemd[1]: Started Process Core Dump (PID 108610/UID 0).
Nov 23 09:01:08 np0005532585.localdomain systemd-coredump[108611]: Resource limits disable core dumping for process 54470 (qdrouterd).
Nov 23 09:01:08 np0005532585.localdomain systemd-coredump[108611]: Process 54470 (qdrouterd) of user 42465 dumped core.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: systemd-coredump@0-108610-0.service: Deactivated successfully.
Nov 23 09:01:08 np0005532585.localdomain podman[108598]: 2025-11-23 09:01:08.090883645 +0000 UTC m=+0.240404651 container died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: libpod-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope: Deactivated successfully.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: libpod-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope: Consumed 27.405s CPU time.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: Deactivated successfully.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: No such file or directory
Nov 23 09:01:08 np0005532585.localdomain podman[108598]: 2025-11-23 09:01:08.141765497 +0000 UTC m=+0.291286473 container cleanup 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public)
Nov 23 09:01:08 np0005532585.localdomain podman[108598]: metrics_qdr
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: No such file or directory
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: No such file or directory
Nov 23 09:01:08 np0005532585.localdomain podman[108615]: 2025-11-23 09:01:08.163111829 +0000 UTC m=+0.061783390 container cleanup 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: libpod-conmon-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope: Deactivated successfully.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: No such file or directory
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: No such file or directory
Nov 23 09:01:08 np0005532585.localdomain podman[108627]: 2025-11-23 09:01:08.265304955 +0000 UTC m=+0.067927302 container cleanup 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, container_name=metrics_qdr, release=1761123044, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 23 09:01:08 np0005532585.localdomain podman[108627]: metrics_qdr
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: Stopped metrics_qdr container.
Nov 23 09:01:08 np0005532585.localdomain sudo[108536]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3-merged.mount: Deactivated successfully.
Nov 23 09:01:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2-userdata-shm.mount: Deactivated successfully.
Nov 23 09:01:08 np0005532585.localdomain sudo[108729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tawwzkoblgyuycrlukywvkymdtxmqrob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888468.4272678-114-100573433770125/AnsiballZ_systemd_service.py
Nov 23 09:01:08 np0005532585.localdomain sudo[108729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:09 np0005532585.localdomain python3.9[108731]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:09 np0005532585.localdomain sudo[108729]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47292 DF PROTO=TCP SPT=53018 DPT=9100 SEQ=1986790061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B578600000000001030307) 
Nov 23 09:01:09 np0005532585.localdomain sudo[108822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnfqwpxujmtqlwjonmyarovombiylkif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888469.1656806-114-2974546724819/AnsiballZ_systemd_service.py
Nov 23 09:01:09 np0005532585.localdomain sudo[108822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:09 np0005532585.localdomain python3.9[108824]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:09 np0005532585.localdomain sudo[108822]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 09:01:10 np0005532585.localdomain podman[108872]: 2025-11-23 09:01:10.027150242 +0000 UTC m=+0.070472152 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z)
Nov 23 09:01:10 np0005532585.localdomain sudo[108936]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pulzsplnmalyhmvauaporvphrgaugpgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888469.882021-114-144529651008708/AnsiballZ_systemd_service.py
Nov 23 09:01:10 np0005532585.localdomain sudo[108936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:10 np0005532585.localdomain podman[108872]: 2025-11-23 09:01:10.424190879 +0000 UTC m=+0.467512739 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container)
Nov 23 09:01:10 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 09:01:10 np0005532585.localdomain python3.9[108938]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:10 np0005532585.localdomain sudo[108936]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:10 np0005532585.localdomain sudo[109030]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnpzdptrxawduscvtkqlmygkxyewhajv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888470.615342-114-201890028120056/AnsiballZ_systemd_service.py
Nov 23 09:01:10 np0005532585.localdomain sudo[109030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:11 np0005532585.localdomain python3.9[109032]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:11 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:01:11 np0005532585.localdomain systemd-rc-local-generator[109060]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:01:11 np0005532585.localdomain systemd-sysv-generator[109064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:01:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:01:11 np0005532585.localdomain systemd[1]: Stopping nova_compute container...
Nov 23 09:01:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63102 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B586A10000000001030307) 
Nov 23 09:01:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:01:13 np0005532585.localdomain systemd[1]: tmp-crun.eHVT1m.mount: Deactivated successfully.
Nov 23 09:01:13 np0005532585.localdomain podman[109085]: 2025-11-23 09:01:13.026143631 +0000 UTC m=+0.081967478 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Nov 23 09:01:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:01:13 np0005532585.localdomain podman[109085]: 2025-11-23 09:01:13.05249083 +0000 UTC m=+0.108314647 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, 
build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 09:01:13 np0005532585.localdomain podman[109085]: unhealthy
Nov 23 09:01:13 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:01:13 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:01:13 np0005532585.localdomain podman[109105]: 2025-11-23 09:01:13.139013628 +0000 UTC m=+0.084947870 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:01:13 np0005532585.localdomain podman[109105]: 2025-11-23 09:01:13.161518657 +0000 UTC m=+0.107452949 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:01:13 np0005532585.localdomain podman[109105]: unhealthy
Nov 23 09:01:13 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:01:13 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:01:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63103 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B58EA10000000001030307) 
Nov 23 09:01:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63104 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B59E600000000001030307) 
Nov 23 09:01:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1500 DF PROTO=TCP SPT=36602 DPT=9102 SEQ=2053157803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5AFE10000000001030307) 
Nov 23 09:01:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63105 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5BE200000000001030307) 
Nov 23 09:01:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26952 DF PROTO=TCP SPT=37260 DPT=9101 SEQ=742382203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5C8BA0000000001030307) 
Nov 23 09:01:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37594 DF PROTO=TCP SPT=52720 DPT=9105 SEQ=4006679888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5C96B0000000001030307) 
Nov 23 09:01:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26954 DF PROTO=TCP SPT=37260 DPT=9101 SEQ=742382203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5D4E10000000001030307) 
Nov 23 09:01:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3893 DF PROTO=TCP SPT=48150 DPT=9100 SEQ=3852951719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5E2200000000001030307) 
Nov 23 09:01:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 09:01:37 np0005532585.localdomain podman[109127]: Error: container e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce is not running
Nov 23 09:01:37 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=125/n/a
Nov 23 09:01:37 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 09:01:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5753 DF PROTO=TCP SPT=39194 DPT=9100 SEQ=359370144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5EDA00000000001030307) 
Nov 23 09:01:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 09:01:41 np0005532585.localdomain systemd[1]: tmp-crun.y0IJaQ.mount: Deactivated successfully.
Nov 23 09:01:41 np0005532585.localdomain podman[109140]: 2025-11-23 09:01:41.020411024 +0000 UTC m=+0.078132389 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 09:01:41 np0005532585.localdomain podman[109140]: 2025-11-23 09:01:41.379232734 +0000 UTC m=+0.436954039 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044)
Nov 23 09:01:41 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 09:01:41 np0005532585.localdomain sudo[109163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:01:41 np0005532585.localdomain sudo[109163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:01:41 np0005532585.localdomain sudo[109163]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:41 np0005532585.localdomain sudo[109178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:01:41 np0005532585.localdomain sudo[109178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:01:42 np0005532585.localdomain sudo[109178]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8215 DF PROTO=TCP SPT=55658 DPT=9882 SEQ=874165632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5FBE00000000001030307) 
Nov 23 09:01:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:01:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:01:43 np0005532585.localdomain sudo[109225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:01:43 np0005532585.localdomain sudo[109225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:01:43 np0005532585.localdomain sudo[109225]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:43 np0005532585.localdomain podman[109240]: 2025-11-23 09:01:43.283785155 +0000 UTC m=+0.081536474 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, url=https://www.redhat.com, architecture=x86_64)
Nov 23 09:01:43 np0005532585.localdomain podman[109240]: 2025-11-23 09:01:43.298049598 +0000 UTC m=+0.095800927 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 09:01:43 np0005532585.localdomain podman[109240]: unhealthy
Nov 23 09:01:43 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:01:43 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:01:43 np0005532585.localdomain podman[109239]: 2025-11-23 09:01:43.388913583 +0000 UTC m=+0.189693466 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, 
Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible)
Nov 23 09:01:43 np0005532585.localdomain podman[109239]: 2025-11-23 09:01:43.405168217 +0000 UTC m=+0.205948130 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:01:43 np0005532585.localdomain podman[109239]: unhealthy
Nov 23 09:01:43 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:01:43 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:01:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8216 DF PROTO=TCP SPT=55658 DPT=9882 SEQ=874165632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B603E00000000001030307) 
Nov 23 09:01:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1502 DF PROTO=TCP SPT=36602 DPT=9102 SEQ=2053157803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B610200000000001030307) 
Nov 23 09:01:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38711 DF PROTO=TCP SPT=52154 DPT=9102 SEQ=4065124393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B624E00000000001030307) 
Nov 23 09:01:53 np0005532585.localdomain podman[109072]: time="2025-11-23T09:01:53Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: tmp-crun.CnmSQH.mount: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: libpod-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: libpod-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope: Consumed 35.074s CPU time.
Nov 23 09:01:53 np0005532585.localdomain podman[109072]: 2025-11-23 09:01:53.617773977 +0000 UTC m=+42.108239073 container died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public)
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: No such file or directory
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: tmp-crun.6IpaAq.mount: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain podman[109072]: 2025-11-23 09:01:53.71634472 +0000 UTC m=+42.206809816 container cleanup e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, architecture=x86_64, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 09:01:53 np0005532585.localdomain podman[109072]: nova_compute
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: No such file or directory
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: No such file or directory
Nov 23 09:01:53 np0005532585.localdomain podman[109280]: 2025-11-23 09:01:53.756958662 +0000 UTC m=+0.123005343 container cleanup e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: libpod-conmon-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: No such file or directory
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: No such file or directory
Nov 23 09:01:53 np0005532585.localdomain podman[109295]: 2025-11-23 09:01:53.833293454 +0000 UTC m=+0.048444636 container cleanup e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, config_id=tripleo_step5, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git)
Nov 23 09:01:53 np0005532585.localdomain podman[109295]: nova_compute
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: Stopped nova_compute container.
Nov 23 09:01:53 np0005532585.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.088s CPU time, no IO.
Nov 23 09:01:53 np0005532585.localdomain sudo[109030]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:54 np0005532585.localdomain sudo[109396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuvtwfuevuhnxhvvqumingtcqjyfsxzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888513.9816654-114-237953840359157/AnsiballZ_systemd_service.py
Nov 23 09:01:54 np0005532585.localdomain sudo[109396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829-merged.mount: Deactivated successfully.
Nov 23 09:01:54 np0005532585.localdomain python3.9[109398]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:01:54 np0005532585.localdomain systemd-sysv-generator[109429]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:01:54 np0005532585.localdomain systemd-rc-local-generator[109425]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:01:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: Stopping nova_migration_target container...
Nov 23 09:01:55 np0005532585.localdomain sshd[69165]: Received signal 15; terminating.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: libpod-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: libpod-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope: Consumed 34.442s CPU time.
Nov 23 09:01:55 np0005532585.localdomain podman[109439]: 2025-11-23 09:01:55.089667284 +0000 UTC m=+0.078474550 container died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target)
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: No such file or directory
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: tmp-crun.Td4eE0.mount: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain podman[109439]: 2025-11-23 09:01:55.146634434 +0000 UTC m=+0.135441680 container cleanup e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_migration_target)
Nov 23 09:01:55 np0005532585.localdomain podman[109439]: nova_migration_target
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: No such file or directory
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: No such file or directory
Nov 23 09:01:55 np0005532585.localdomain podman[109451]: 2025-11-23 09:01:55.172590861 +0000 UTC m=+0.072076691 container cleanup e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: libpod-conmon-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: No such file or directory
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: No such file or directory
Nov 23 09:01:55 np0005532585.localdomain podman[109463]: 2025-11-23 09:01:55.264356032 +0000 UTC m=+0.065752484 container cleanup e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Nov 23 09:01:55 np0005532585.localdomain podman[109463]: nova_migration_target
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: Stopped nova_migration_target container.
Nov 23 09:01:55 np0005532585.localdomain sudo[109396]: pam_unix(sudo:session): session closed for user root
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5-merged.mount: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7-userdata-shm.mount: Deactivated successfully.
Nov 23 09:01:55 np0005532585.localdomain sudo[109564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpmuiujwyycnlgawgptfzmpykwjwfixj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888515.4235294-114-2066117560356/AnsiballZ_systemd_service.py
Nov 23 09:01:55 np0005532585.localdomain sudo[109564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:01:56 np0005532585.localdomain python3.9[109566]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:01:56 np0005532585.localdomain systemd-rc-local-generator[109594]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:01:56 np0005532585.localdomain systemd-sysv-generator[109598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: libpod-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11.scope: Deactivated successfully.
Nov 23 09:01:56 np0005532585.localdomain podman[109606]: 2025-11-23 09:01:56.487454748 +0000 UTC m=+0.062802162 container died 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, container_name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:01:56 np0005532585.localdomain podman[109606]: 2025-11-23 09:01:56.528087351 +0000 UTC m=+0.103434745 container cleanup 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 23 09:01:56 np0005532585.localdomain podman[109606]: nova_virtlogd_wrapper
Nov 23 09:01:56 np0005532585.localdomain podman[109620]: 2025-11-23 09:01:56.550952131 +0000 UTC m=+0.052471502 container cleanup 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: tmp-crun.A4xUYe.mount: Deactivated successfully.
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11-merged.mount: Deactivated successfully.
Nov 23 09:01:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11-userdata-shm.mount: Deactivated successfully.
Nov 23 09:01:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8218 DF PROTO=TCP SPT=55658 DPT=9882 SEQ=874165632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B634200000000001030307) 
Nov 23 09:01:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14724 DF PROTO=TCP SPT=53910 DPT=9101 SEQ=3315328708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B63DEA0000000001030307) 
Nov 23 09:01:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19195 DF PROTO=TCP SPT=34468 DPT=9105 SEQ=346067449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B63E9B0000000001030307) 
Nov 23 09:02:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:02:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:02:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19197 DF PROTO=TCP SPT=34468 DPT=9105 SEQ=346067449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B64AA10000000001030307) 
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Activating special unit Exit the Session...
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Removed slice User Background Tasks Slice.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped target Main User Target.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped target Basic System.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped target Paths.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped target Sockets.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped target Timers.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Closed D-Bus User Message Bus Socket.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Stopped Create User's Volatile Files and Directories.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Removed slice User Application Slice.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Reached target Shutdown.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Finished Exit the Session.
Nov 23 09:02:03 np0005532585.localdomain systemd[84232]: Reached target Exit the Session.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: user@0.service: Consumed 4.298s CPU time, no IO.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 23 09:02:03 np0005532585.localdomain systemd[1]: user-0.slice: Consumed 5.280s CPU time.
Nov 23 09:02:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:02:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:02:05 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47295 DF PROTO=TCP SPT=53018 DPT=9100 SEQ=1986790061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B656200000000001030307) 
Nov 23 09:02:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=840 DF PROTO=TCP SPT=58022 DPT=9100 SEQ=3884394366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B662E00000000001030307) 
Nov 23 09:02:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35584 DF PROTO=TCP SPT=48126 DPT=9882 SEQ=854778408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B671210000000001030307) 
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:02:13 np0005532585.localdomain podman[109640]: 2025-11-23 09:02:13.533529915 +0000 UTC m=+0.085813056 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Nov 23 09:02:13 np0005532585.localdomain podman[109640]: 2025-11-23 09:02:13.574970453 +0000 UTC m=+0.127253564 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container)
Nov 23 09:02:13 np0005532585.localdomain podman[109640]: unhealthy
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: tmp-crun.BZ5Zpd.mount: Deactivated successfully.
Nov 23 09:02:13 np0005532585.localdomain podman[109639]: 2025-11-23 09:02:13.596347978 +0000 UTC m=+0.145820012 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044)
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:02:13 np0005532585.localdomain podman[109639]: 2025-11-23 09:02:13.619437625 +0000 UTC m=+0.168909649 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent)
Nov 23 09:02:13 np0005532585.localdomain podman[109639]: unhealthy
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:02:13 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:02:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35585 DF PROTO=TCP SPT=48126 DPT=9882 SEQ=854778408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B679200000000001030307) 
Nov 23 09:02:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38713 DF PROTO=TCP SPT=52154 DPT=9102 SEQ=4065124393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B686200000000001030307) 
Nov 23 09:02:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58598 DF PROTO=TCP SPT=51150 DPT=9102 SEQ=102310858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B69A200000000001030307) 
Nov 23 09:02:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35587 DF PROTO=TCP SPT=48126 DPT=9882 SEQ=854778408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6AA200000000001030307) 
Nov 23 09:02:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26604 DF PROTO=TCP SPT=56220 DPT=9101 SEQ=3970731790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6B31A0000000001030307) 
Nov 23 09:02:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45399 DF PROTO=TCP SPT=38422 DPT=9105 SEQ=592499428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6B3CC0000000001030307) 
Nov 23 09:02:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26606 DF PROTO=TCP SPT=56220 DPT=9101 SEQ=3970731790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6BF200000000001030307) 
Nov 23 09:02:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5756 DF PROTO=TCP SPT=39194 DPT=9100 SEQ=359370144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6CC200000000001030307) 
Nov 23 09:02:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49384 DF PROTO=TCP SPT=46730 DPT=9100 SEQ=505397214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6D8200000000001030307) 
Nov 23 09:02:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53432 DF PROTO=TCP SPT=60278 DPT=9882 SEQ=2084944691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6E6600000000001030307) 
Nov 23 09:02:43 np0005532585.localdomain sudo[109677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:02:43 np0005532585.localdomain sudo[109677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:02:43 np0005532585.localdomain sudo[109677]: pam_unix(sudo:session): session closed for user root
Nov 23 09:02:43 np0005532585.localdomain sudo[109692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:02:43 np0005532585.localdomain sudo[109692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:02:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:02:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:02:43 np0005532585.localdomain podman[109715]: 2025-11-23 09:02:43.757111666 +0000 UTC m=+0.060453920 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T23:34:05Z, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 23 09:02:43 np0005532585.localdomain podman[109715]: 2025-11-23 09:02:43.799800773 +0000 UTC m=+0.103143037 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044)
Nov 23 09:02:43 np0005532585.localdomain podman[109715]: unhealthy
Nov 23 09:02:43 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:02:43 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:02:43 np0005532585.localdomain podman[109709]: 2025-11-23 09:02:43.81097223 +0000 UTC m=+0.114223031 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 23 09:02:43 np0005532585.localdomain podman[109709]: 2025-11-23 09:02:43.843101219 +0000 UTC m=+0.146352000 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 23 09:02:43 np0005532585.localdomain podman[109709]: unhealthy
Nov 23 09:02:43 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:02:43 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:02:43 np0005532585.localdomain sudo[109692]: pam_unix(sudo:session): session closed for user root
Nov 23 09:02:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53433 DF PROTO=TCP SPT=60278 DPT=9882 SEQ=2084944691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6EE600000000001030307) 
Nov 23 09:02:46 np0005532585.localdomain sudo[109779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:02:46 np0005532585.localdomain sudo[109779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:02:46 np0005532585.localdomain sudo[109779]: pam_unix(sudo:session): session closed for user root
Nov 23 09:02:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58600 DF PROTO=TCP SPT=51150 DPT=9102 SEQ=102310858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6FA210000000001030307) 
Nov 23 09:02:51 np0005532585.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 23 09:02:51 np0005532585.localdomain recover_tripleo_nova_virtqemud[109795]: 61756
Nov 23 09:02:51 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 23 09:02:51 np0005532585.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 23 09:02:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22741 DF PROTO=TCP SPT=59924 DPT=9102 SEQ=3480946583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B70F600000000001030307) 
Nov 23 09:02:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53435 DF PROTO=TCP SPT=60278 DPT=9882 SEQ=2084944691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B71E200000000001030307) 
Nov 23 09:02:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24647 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=3931588753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7284A0000000001030307) 
Nov 23 09:02:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2008 DF PROTO=TCP SPT=50106 DPT=9105 SEQ=2647242858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B728FB0000000001030307) 
Nov 23 09:03:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24649 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=3931588753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B734610000000001030307) 
Nov 23 09:03:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=843 DF PROTO=TCP SPT=58022 DPT=9100 SEQ=3884394366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B742200000000001030307) 
Nov 23 09:03:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23464 DF PROTO=TCP SPT=57104 DPT=9100 SEQ=3587821903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B74D210000000001030307) 
Nov 23 09:03:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58452 DF PROTO=TCP SPT=47578 DPT=9882 SEQ=4215290881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B75B600000000001030307) 
Nov 23 09:03:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:03:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:03:14 np0005532585.localdomain podman[109797]: 2025-11-23 09:03:14.031824784 +0000 UTC m=+0.085527040 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, 
tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com)
Nov 23 09:03:14 np0005532585.localdomain podman[109796]: 2025-11-23 09:03:14.083012374 +0000 UTC m=+0.137732841 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 09:03:14 np0005532585.localdomain podman[109797]: 2025-11-23 09:03:14.099957151 +0000 UTC m=+0.153659437 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 09:03:14 np0005532585.localdomain podman[109797]: unhealthy
Nov 23 09:03:14 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:03:14 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:03:14 np0005532585.localdomain podman[109796]: 2025-11-23 09:03:14.119810727 +0000 UTC m=+0.174531154 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 09:03:14 np0005532585.localdomain podman[109796]: unhealthy
Nov 23 09:03:14 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:03:14 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:03:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58453 DF PROTO=TCP SPT=47578 DPT=9882 SEQ=4215290881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B763600000000001030307) 
Nov 23 09:03:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22743 DF PROTO=TCP SPT=59924 DPT=9102 SEQ=3480946583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B770200000000001030307) 
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 60965 (conmon) with signal SIGKILL.
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: libpod-conmon-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11.scope: Deactivated successfully.
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: tmp-crun.pORbxB.mount: Deactivated successfully.
Nov 23 09:03:20 np0005532585.localdomain podman[109850]: error opening file `/run/crun/11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11/status`: No such file or directory
Nov 23 09:03:20 np0005532585.localdomain podman[109838]: 2025-11-23 09:03:20.76639882 +0000 UTC m=+0.075678812 container cleanup 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 09:03:20 np0005532585.localdomain podman[109838]: nova_virtlogd_wrapper
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Nov 23 09:03:20 np0005532585.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Nov 23 09:03:20 np0005532585.localdomain sudo[109564]: pam_unix(sudo:session): session closed for user root
Nov 23 09:03:21 np0005532585.localdomain sudo[109941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdoxvckxrsbjsuyyjidbcyqrllopcsmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888600.9365654-114-77978680002054/AnsiballZ_systemd_service.py
Nov 23 09:03:21 np0005532585.localdomain sudo[109941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:03:21 np0005532585.localdomain python3.9[109943]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:03:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:03:21 np0005532585.localdomain systemd-rc-local-generator[109972]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:03:21 np0005532585.localdomain systemd-sysv-generator[109977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:03:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:03:21 np0005532585.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Nov 23 09:03:21 np0005532585.localdomain systemd[1]: libpod-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope: Deactivated successfully.
Nov 23 09:03:21 np0005532585.localdomain systemd[1]: libpod-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope: Consumed 1.471s CPU time.
Nov 23 09:03:22 np0005532585.localdomain podman[109985]: 2025-11-23 09:03:22.000997184 +0000 UTC m=+0.059594904 container died aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:35:22Z, version=17.1.12, architecture=x86_64)
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: tmp-crun.n2Pgn8.mount: Deactivated successfully.
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: tmp-crun.RuIsjj.mount: Deactivated successfully.
Nov 23 09:03:22 np0005532585.localdomain podman[109985]: 2025-11-23 09:03:22.055176557 +0000 UTC m=+0.113774197 container cleanup aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, container_name=nova_virtnodedevd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Nov 23 09:03:22 np0005532585.localdomain podman[109985]: nova_virtnodedevd
Nov 23 09:03:22 np0005532585.localdomain podman[109999]: 2025-11-23 09:03:22.120257059 +0000 UTC m=+0.109399201 container cleanup aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible)
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: libpod-conmon-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope: Deactivated successfully.
Nov 23 09:03:22 np0005532585.localdomain podman[110026]: error opening file `/run/crun/aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd/status`: No such file or directory
Nov 23 09:03:22 np0005532585.localdomain podman[110014]: 2025-11-23 09:03:22.235381076 +0000 UTC m=+0.073035760 container cleanup aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, container_name=nova_virtnodedevd)
Nov 23 09:03:22 np0005532585.localdomain podman[110014]: nova_virtnodedevd
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Nov 23 09:03:22 np0005532585.localdomain sudo[109941]: pam_unix(sudo:session): session closed for user root
Nov 23 09:03:22 np0005532585.localdomain sshd[110112]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:03:22 np0005532585.localdomain sudo[110118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkpcqpopznldppradlxqwdecmfszmmgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888602.401578-114-18332369599923/AnsiballZ_systemd_service.py
Nov 23 09:03:22 np0005532585.localdomain sudo[110118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:03:22 np0005532585.localdomain python3.9[110121]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af-merged.mount: Deactivated successfully.
Nov 23 09:03:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd-userdata-shm.mount: Deactivated successfully.
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:03:23 np0005532585.localdomain systemd-rc-local-generator[110149]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:03:23 np0005532585.localdomain systemd-sysv-generator[110152]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:03:23 np0005532585.localdomain sshd[110112]: Invalid user solana from 193.32.162.146 port 50700
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: Stopping nova_virtproxyd container...
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: libpod-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08.scope: Deactivated successfully.
Nov 23 09:03:23 np0005532585.localdomain podman[110162]: 2025-11-23 09:03:23.426884281 +0000 UTC m=+0.070668848 container died 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 23 09:03:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50832 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=2341202835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B784A00000000001030307) 
Nov 23 09:03:23 np0005532585.localdomain podman[110162]: 2025-11-23 09:03:23.470077532 +0000 UTC m=+0.113862049 container cleanup 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, version=17.1.12, distribution-scope=public, container_name=nova_virtproxyd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt)
Nov 23 09:03:23 np0005532585.localdomain podman[110162]: nova_virtproxyd
Nov 23 09:03:23 np0005532585.localdomain podman[110176]: 2025-11-23 09:03:23.499658812 +0000 UTC m=+0.063745542 container cleanup 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_virtproxyd, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: libpod-conmon-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08.scope: Deactivated successfully.
Nov 23 09:03:23 np0005532585.localdomain sshd[110112]: Connection closed by invalid user solana 193.32.162.146 port 50700 [preauth]
Nov 23 09:03:23 np0005532585.localdomain podman[110206]: error opening file `/run/crun/108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08/status`: No such file or directory
Nov 23 09:03:23 np0005532585.localdomain podman[110193]: 2025-11-23 09:03:23.582393512 +0000 UTC m=+0.057194187 container cleanup 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, container_name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 09:03:23 np0005532585.localdomain podman[110193]: nova_virtproxyd
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: Stopped nova_virtproxyd container.
Nov 23 09:03:23 np0005532585.localdomain sudo[110118]: pam_unix(sudo:session): session closed for user root
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f-merged.mount: Deactivated successfully.
Nov 23 09:03:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08-userdata-shm.mount: Deactivated successfully.
Nov 23 09:03:23 np0005532585.localdomain sudo[110297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewoeqgtbmkuayhtiapcfrjygmcuybrhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888603.724273-114-167972927818013/AnsiballZ_systemd_service.py
Nov 23 09:03:24 np0005532585.localdomain sudo[110297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:03:24 np0005532585.localdomain python3.9[110299]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:03:24 np0005532585.localdomain systemd-rc-local-generator[110322]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:03:24 np0005532585.localdomain systemd-sysv-generator[110327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: Stopping nova_virtqemud container...
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: libpod-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope: Deactivated successfully.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: libpod-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope: Consumed 2.641s CPU time.
Nov 23 09:03:24 np0005532585.localdomain podman[110339]: 2025-11-23 09:03:24.765347061 +0000 UTC m=+0.080623216 container died 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=nova_virtqemud, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 23 09:03:24 np0005532585.localdomain podman[110339]: 2025-11-23 09:03:24.805995234 +0000 UTC m=+0.121271349 container cleanup 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, container_name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 09:03:24 np0005532585.localdomain podman[110339]: nova_virtqemud
Nov 23 09:03:24 np0005532585.localdomain podman[110354]: 2025-11-23 09:03:24.85156151 +0000 UTC m=+0.072049320 container cleanup 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_virtqemud, release=1761123044, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64)
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: tmp-crun.AsmWbu.mount: Deactivated successfully.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b-merged.mount: Deactivated successfully.
Nov 23 09:03:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8-userdata-shm.mount: Deactivated successfully.
Nov 23 09:03:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58455 DF PROTO=TCP SPT=47578 DPT=9882 SEQ=4215290881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B794210000000001030307) 
Nov 23 09:03:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51931 DF PROTO=TCP SPT=36278 DPT=9101 SEQ=2790590587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B79D7B0000000001030307) 
Nov 23 09:03:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33943 DF PROTO=TCP SPT=58484 DPT=9105 SEQ=1586031484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B79E2B0000000001030307) 
Nov 23 09:03:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51933 DF PROTO=TCP SPT=36278 DPT=9101 SEQ=2790590587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7A9A10000000001030307) 
Nov 23 09:03:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49387 DF PROTO=TCP SPT=46730 DPT=9100 SEQ=505397214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7B6210000000001030307) 
Nov 23 09:03:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59506 DF PROTO=TCP SPT=49440 DPT=9100 SEQ=3861658067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7C2600000000001030307) 
Nov 23 09:03:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27534 DF PROTO=TCP SPT=58332 DPT=9882 SEQ=4196375563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7D0A00000000001030307) 
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:03:44 np0005532585.localdomain podman[110371]: 2025-11-23 09:03:44.537001994 +0000 UTC m=+0.091159133 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: tmp-crun.bwRefP.mount: Deactivated successfully.
Nov 23 09:03:44 np0005532585.localdomain podman[110372]: 2025-11-23 09:03:44.600066134 +0000 UTC m=+0.152228222 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 09:03:44 np0005532585.localdomain podman[110371]: 2025-11-23 09:03:44.603954684 +0000 UTC m=+0.158111863 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 09:03:44 np0005532585.localdomain podman[110371]: unhealthy
Nov 23 09:03:44 np0005532585.localdomain podman[110372]: 2025-11-23 09:03:44.612786489 +0000 UTC m=+0.164948587 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:03:44 np0005532585.localdomain podman[110372]: unhealthy
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:03:44 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:03:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27535 DF PROTO=TCP SPT=58332 DPT=9882 SEQ=4196375563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7D8A00000000001030307) 
Nov 23 09:03:46 np0005532585.localdomain sudo[110410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:03:46 np0005532585.localdomain sudo[110410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:03:46 np0005532585.localdomain sudo[110410]: pam_unix(sudo:session): session closed for user root
Nov 23 09:03:46 np0005532585.localdomain sudo[110425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:03:46 np0005532585.localdomain sudo[110425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:03:47 np0005532585.localdomain sudo[110425]: pam_unix(sudo:session): session closed for user root
Nov 23 09:03:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50834 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=2341202835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7E4210000000001030307) 
Nov 23 09:03:48 np0005532585.localdomain sudo[110472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:03:48 np0005532585.localdomain sudo[110472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:03:48 np0005532585.localdomain sudo[110472]: pam_unix(sudo:session): session closed for user root
Nov 23 09:03:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37698 DF PROTO=TCP SPT=43254 DPT=9102 SEQ=114999264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7F9A00000000001030307) 
Nov 23 09:03:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27537 DF PROTO=TCP SPT=58332 DPT=9882 SEQ=4196375563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B808200000000001030307) 
Nov 23 09:03:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52983 DF PROTO=TCP SPT=33798 DPT=9101 SEQ=4110551325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B812A90000000001030307) 
Nov 23 09:03:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16327 DF PROTO=TCP SPT=36902 DPT=9105 SEQ=1191719041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8135B0000000001030307) 
Nov 23 09:04:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16329 DF PROTO=TCP SPT=36902 DPT=9105 SEQ=1191719041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B81F610000000001030307) 
Nov 23 09:04:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23467 DF PROTO=TCP SPT=57104 DPT=9100 SEQ=3587821903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B82C200000000001030307) 
Nov 23 09:04:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5977 DF PROTO=TCP SPT=51382 DPT=9100 SEQ=1610855476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B837A10000000001030307) 
Nov 23 09:04:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32097 DF PROTO=TCP SPT=48962 DPT=9882 SEQ=3463408620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B845E10000000001030307) 
Nov 23 09:04:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:04:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:04:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32098 DF PROTO=TCP SPT=48962 DPT=9882 SEQ=3463408620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B84DE00000000001030307) 
Nov 23 09:04:15 np0005532585.localdomain systemd[1]: tmp-crun.Nh8WFz.mount: Deactivated successfully.
Nov 23 09:04:15 np0005532585.localdomain podman[110487]: 2025-11-23 09:04:15.036717606 +0000 UTC m=+0.092474075 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public)
Nov 23 09:04:15 np0005532585.localdomain podman[110487]: 2025-11-23 09:04:15.051246517 +0000 UTC m=+0.107002946 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 09:04:15 np0005532585.localdomain podman[110487]: unhealthy
Nov 23 09:04:15 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:04:15 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:04:15 np0005532585.localdomain podman[110488]: 2025-11-23 09:04:15.132859343 +0000 UTC m=+0.187507587 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4)
Nov 23 09:04:15 np0005532585.localdomain podman[110488]: 2025-11-23 09:04:15.176288302 +0000 UTC m=+0.230936536 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, container_name=ovn_controller)
Nov 23 09:04:15 np0005532585.localdomain podman[110488]: unhealthy
Nov 23 09:04:15 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:04:15 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:04:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37700 DF PROTO=TCP SPT=43254 DPT=9102 SEQ=114999264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B85A210000000001030307) 
Nov 23 09:04:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12188 DF PROTO=TCP SPT=56828 DPT=9102 SEQ=235745335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B86EE00000000001030307) 
Nov 23 09:04:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32100 DF PROTO=TCP SPT=48962 DPT=9882 SEQ=3463408620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B87E200000000001030307) 
Nov 23 09:04:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62713 DF PROTO=TCP SPT=57822 DPT=9101 SEQ=3255914638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B887DA0000000001030307) 
Nov 23 09:04:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45979 DF PROTO=TCP SPT=33364 DPT=9105 SEQ=1400042007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8888E0000000001030307) 
Nov 23 09:04:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62715 DF PROTO=TCP SPT=57822 DPT=9101 SEQ=3255914638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B893E00000000001030307) 
Nov 23 09:04:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59509 DF PROTO=TCP SPT=49440 DPT=9100 SEQ=3861658067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8A0200000000001030307) 
Nov 23 09:04:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39162 DF PROTO=TCP SPT=60912 DPT=9100 SEQ=4066548656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8ACE10000000001030307) 
Nov 23 09:04:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18765 DF PROTO=TCP SPT=51360 DPT=9882 SEQ=4264641768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8BB200000000001030307) 
Nov 23 09:04:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18766 DF PROTO=TCP SPT=51360 DPT=9882 SEQ=4264641768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8C3200000000001030307) 
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: tmp-crun.AF0ZrZ.mount: Deactivated successfully.
Nov 23 09:04:45 np0005532585.localdomain podman[110526]: 2025-11-23 09:04:45.267648882 +0000 UTC m=+0.077004124 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:04:45 np0005532585.localdomain podman[110526]: 2025-11-23 09:04:45.306980994 +0000 UTC m=+0.116336236 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public)
Nov 23 09:04:45 np0005532585.localdomain podman[110526]: unhealthy
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: tmp-crun.Gf8B7r.mount: Deactivated successfully.
Nov 23 09:04:45 np0005532585.localdomain podman[110545]: 2025-11-23 09:04:45.380974553 +0000 UTC m=+0.087839880 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 23 09:04:45 np0005532585.localdomain podman[110545]: 2025-11-23 09:04:45.396163976 +0000 UTC m=+0.103029263 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, distribution-scope=public, vcs-type=git)
Nov 23 09:04:45 np0005532585.localdomain podman[110545]: unhealthy
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:04:45 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'.
Nov 23 09:04:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12190 DF PROTO=TCP SPT=56828 DPT=9102 SEQ=235745335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8D0200000000001030307) 
Nov 23 09:04:48 np0005532585.localdomain sudo[110563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:04:48 np0005532585.localdomain sudo[110563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:04:48 np0005532585.localdomain sudo[110563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:48 np0005532585.localdomain sudo[110578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 09:04:48 np0005532585.localdomain sudo[110578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud.service: Killing process 61752 (conmon) with signal SIGKILL.
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: libpod-conmon-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope: Deactivated successfully.
Nov 23 09:04:48 np0005532585.localdomain sudo[110578]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:48 np0005532585.localdomain podman[110624]: error opening file `/run/crun/80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8/status`: No such file or directory
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: tmp-crun.cgSfKs.mount: Deactivated successfully.
Nov 23 09:04:48 np0005532585.localdomain podman[110611]: 2025-11-23 09:04:48.915718349 +0000 UTC m=+0.066325862 container cleanup 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_virtqemud, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 23 09:04:48 np0005532585.localdomain podman[110611]: nova_virtqemud
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Nov 23 09:04:48 np0005532585.localdomain systemd[1]: Stopped nova_virtqemud container.
Nov 23 09:04:48 np0005532585.localdomain sudo[110297]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:49 np0005532585.localdomain sudo[110626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:04:49 np0005532585.localdomain sudo[110626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:04:49 np0005532585.localdomain sudo[110626]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:49 np0005532585.localdomain sudo[110655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:04:49 np0005532585.localdomain sudo[110655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:04:49 np0005532585.localdomain sudo[110745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnwtdkqjzswljxthpuawowgitrjkhipq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888689.0794618-114-74514097724599/AnsiballZ_systemd_service.py
Nov 23 09:04:49 np0005532585.localdomain sudo[110745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:04:49 np0005532585.localdomain python3.9[110747]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:04:49 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:04:49 np0005532585.localdomain sudo[110655]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:49 np0005532585.localdomain systemd-rc-local-generator[110806]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:04:49 np0005532585.localdomain systemd-sysv-generator[110809]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:04:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:04:49 np0005532585.localdomain sudo[110745]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:50 np0005532585.localdomain sudo[110856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:04:50 np0005532585.localdomain sudo[110856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:04:50 np0005532585.localdomain sudo[110856]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:50 np0005532585.localdomain sudo[110922]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcndwuqxzqckokoopljstazucpkjplff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888690.1617477-114-206510700742981/AnsiballZ_systemd_service.py
Nov 23 09:04:50 np0005532585.localdomain sudo[110922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:04:50 np0005532585.localdomain python3.9[110924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:04:50 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:04:50 np0005532585.localdomain systemd-sysv-generator[110956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:04:50 np0005532585.localdomain systemd-rc-local-generator[110951]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: Stopping nova_virtsecretd container...
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: tmp-crun.ECVk4b.mount: Deactivated successfully.
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: libpod-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03.scope: Deactivated successfully.
Nov 23 09:04:51 np0005532585.localdomain podman[110965]: 2025-11-23 09:04:51.29303389 +0000 UTC m=+0.091205125 container died a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, container_name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 09:04:51 np0005532585.localdomain podman[110965]: 2025-11-23 09:04:51.330417011 +0000 UTC m=+0.128588196 container cleanup a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team)
Nov 23 09:04:51 np0005532585.localdomain podman[110965]: nova_virtsecretd
Nov 23 09:04:51 np0005532585.localdomain podman[110978]: 2025-11-23 09:04:51.369562338 +0000 UTC m=+0.066429226 container cleanup a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtsecretd, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: libpod-conmon-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03.scope: Deactivated successfully.
Nov 23 09:04:51 np0005532585.localdomain podman[111009]: error opening file `/run/crun/a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03/status`: No such file or directory
Nov 23 09:04:51 np0005532585.localdomain podman[110997]: 2025-11-23 09:04:51.456401136 +0000 UTC m=+0.056760914 container cleanup a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, container_name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=)
Nov 23 09:04:51 np0005532585.localdomain podman[110997]: nova_virtsecretd
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Nov 23 09:04:51 np0005532585.localdomain systemd[1]: Stopped nova_virtsecretd container.
Nov 23 09:04:51 np0005532585.localdomain sudo[110922]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:51 np0005532585.localdomain sudo[111100]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bixxhijlaoaqsgcvddkgbztedawetwug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888691.6422913-114-217905714533586/AnsiballZ_systemd_service.py
Nov 23 09:04:51 np0005532585.localdomain sudo[111100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:04:52 np0005532585.localdomain python3.9[111102]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb-merged.mount: Deactivated successfully.
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03-userdata-shm.mount: Deactivated successfully.
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:04:52 np0005532585.localdomain systemd-rc-local-generator[111125]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:04:52 np0005532585.localdomain systemd-sysv-generator[111129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: Stopping nova_virtstoraged container...
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: libpod-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a.scope: Deactivated successfully.
Nov 23 09:04:52 np0005532585.localdomain podman[111143]: 2025-11-23 09:04:52.687439039 +0000 UTC m=+0.075707794 container died 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtstoraged, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Nov 23 09:04:52 np0005532585.localdomain podman[111143]: 2025-11-23 09:04:52.72094689 +0000 UTC m=+0.109215605 container cleanup 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_virtstoraged, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:04:52 np0005532585.localdomain podman[111143]: nova_virtstoraged
Nov 23 09:04:52 np0005532585.localdomain podman[111158]: 2025-11-23 09:04:52.766588838 +0000 UTC m=+0.068536831 container cleanup 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: libpod-conmon-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a.scope: Deactivated successfully.
Nov 23 09:04:52 np0005532585.localdomain podman[111186]: error opening file `/run/crun/33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a/status`: No such file or directory
Nov 23 09:04:52 np0005532585.localdomain podman[111175]: 2025-11-23 09:04:52.846413028 +0000 UTC m=+0.048025253 container cleanup 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container)
Nov 23 09:04:52 np0005532585.localdomain podman[111175]: nova_virtstoraged
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Nov 23 09:04:52 np0005532585.localdomain systemd[1]: Stopped nova_virtstoraged container.
Nov 23 09:04:52 np0005532585.localdomain sudo[111100]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:53 np0005532585.localdomain sudo[111278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgxzbxuoknulxznwgjbvczlbsxufapqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888693.0037642-114-249803399812324/AnsiballZ_systemd_service.py
Nov 23 09:04:53 np0005532585.localdomain sudo[111278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:04:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89-merged.mount: Deactivated successfully.
Nov 23 09:04:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a-userdata-shm.mount: Deactivated successfully.
Nov 23 09:04:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46878 DF PROTO=TCP SPT=48896 DPT=9102 SEQ=188186518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8E4200000000001030307) 
Nov 23 09:04:53 np0005532585.localdomain python3.9[111280]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:04:53 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:04:53 np0005532585.localdomain systemd-rc-local-generator[111306]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:04:53 np0005532585.localdomain systemd-sysv-generator[111312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:04:53 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:04:53 np0005532585.localdomain systemd[1]: Stopping ovn_controller container...
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: libpod-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope: Deactivated successfully.
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: libpod-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope: Consumed 2.507s CPU time.
Nov 23 09:04:54 np0005532585.localdomain podman[111321]: 2025-11-23 09:04:54.02305407 +0000 UTC m=+0.076316572 container died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller)
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: Deactivated successfully.
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: No such file or directory
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23-userdata-shm.mount: Deactivated successfully.
Nov 23 09:04:54 np0005532585.localdomain podman[111321]: 2025-11-23 09:04:54.062346571 +0000 UTC m=+0.115609033 container cleanup 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Nov 23 09:04:54 np0005532585.localdomain podman[111321]: ovn_controller
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: No such file or directory
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: No such file or directory
Nov 23 09:04:54 np0005532585.localdomain podman[111335]: 2025-11-23 09:04:54.099563108 +0000 UTC m=+0.065057923 container cleanup 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., container_name=ovn_controller, 
build-date=2025-11-18T23:34:05Z, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true)
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: libpod-conmon-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope: Deactivated successfully.
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: No such file or directory
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: No such file or directory
Nov 23 09:04:54 np0005532585.localdomain podman[111349]: 2025-11-23 09:04:54.188849092 +0000 UTC m=+0.049625592 container cleanup 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, release=1761123044, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 23 09:04:54 np0005532585.localdomain podman[111349]: ovn_controller
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: Stopped ovn_controller container.
Nov 23 09:04:54 np0005532585.localdomain sudo[111278]: pam_unix(sudo:session): session closed for user root
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199-merged.mount: Deactivated successfully.
Nov 23 09:04:54 np0005532585.localdomain sudo[111451]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jknvohuzgszkhwxuhkuautmzmuuutakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888694.3582406-114-193935329178924/AnsiballZ_systemd_service.py
Nov 23 09:04:54 np0005532585.localdomain sudo[111451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:04:54 np0005532585.localdomain python3.9[111453]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:04:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:04:55 np0005532585.localdomain systemd-rc-local-generator[111476]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:04:55 np0005532585.localdomain systemd-sysv-generator[111479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:04:55 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:04:55 np0005532585.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: libpod-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope: Deactivated successfully.
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: libpod-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope: Consumed 10.886s CPU time.
Nov 23 09:04:56 np0005532585.localdomain podman[111494]: 2025-11-23 09:04:56.018530037 +0000 UTC m=+0.732985618 container died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: Deactivated successfully.
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: No such file or directory
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: tmp-crun.OfKjgQ.mount: Deactivated successfully.
Nov 23 09:04:56 np0005532585.localdomain podman[111494]: 2025-11-23 09:04:56.09332991 +0000 UTC m=+0.807785441 container cleanup 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 09:04:56 np0005532585.localdomain podman[111494]: ovn_metadata_agent
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: No such file or directory
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: No such file or directory
Nov 23 09:04:56 np0005532585.localdomain podman[111506]: 2025-11-23 09:04:56.152775508 +0000 UTC m=+0.124184730 container cleanup 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_id=tripleo_step4)
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9-merged.mount: Deactivated successfully.
Nov 23 09:04:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f-userdata-shm.mount: Deactivated successfully.
Nov 23 09:04:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18768 DF PROTO=TCP SPT=51360 DPT=9882 SEQ=4264641768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8F4200000000001030307) 
Nov 23 09:04:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45133 DF PROTO=TCP SPT=59240 DPT=9101 SEQ=2782383458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8FD0B0000000001030307) 
Nov 23 09:04:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46737 DF PROTO=TCP SPT=60328 DPT=9105 SEQ=1388543733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8FDBB0000000001030307) 
Nov 23 09:05:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45135 DF PROTO=TCP SPT=59240 DPT=9101 SEQ=2782383458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B909210000000001030307) 
Nov 23 09:05:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5980 DF PROTO=TCP SPT=51382 DPT=9100 SEQ=1610855476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B916210000000001030307) 
Nov 23 09:05:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8883 DF PROTO=TCP SPT=44354 DPT=9100 SEQ=438120719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B921E10000000001030307) 
Nov 23 09:05:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49268 DF PROTO=TCP SPT=43182 DPT=9882 SEQ=4264056996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B930200000000001030307) 
Nov 23 09:05:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49269 DF PROTO=TCP SPT=43182 DPT=9882 SEQ=4264056996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B938210000000001030307) 
Nov 23 09:05:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46880 DF PROTO=TCP SPT=48896 DPT=9102 SEQ=188186518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B944200000000001030307) 
Nov 23 09:05:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36866 DF PROTO=TCP SPT=40230 DPT=9102 SEQ=3651957829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B959600000000001030307) 
Nov 23 09:05:24 np0005532585.localdomain sshd[111524]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:05:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49271 DF PROTO=TCP SPT=43182 DPT=9882 SEQ=4264056996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B968200000000001030307) 
Nov 23 09:05:27 np0005532585.localdomain sshd[111524]: Connection closed by authenticating user root 185.156.73.233 port 52018 [preauth]
Nov 23 09:05:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19635 DF PROTO=TCP SPT=43478 DPT=9101 SEQ=3238666304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9723A0000000001030307) 
Nov 23 09:05:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31075 DF PROTO=TCP SPT=50952 DPT=9105 SEQ=3111224148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B972EB0000000001030307) 
Nov 23 09:05:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19637 DF PROTO=TCP SPT=43478 DPT=9101 SEQ=3238666304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B97E600000000001030307) 
Nov 23 09:05:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39165 DF PROTO=TCP SPT=60912 DPT=9100 SEQ=4066548656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B98C200000000001030307) 
Nov 23 09:05:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42724 DF PROTO=TCP SPT=51646 DPT=9100 SEQ=1282188036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B997200000000001030307) 
Nov 23 09:05:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31170 DF PROTO=TCP SPT=35550 DPT=9882 SEQ=811106714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9A5600000000001030307) 
Nov 23 09:05:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31171 DF PROTO=TCP SPT=35550 DPT=9882 SEQ=811106714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9AD600000000001030307) 
Nov 23 09:05:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36868 DF PROTO=TCP SPT=40230 DPT=9102 SEQ=3651957829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9BA210000000001030307) 
Nov 23 09:05:50 np0005532585.localdomain sudo[111526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:05:50 np0005532585.localdomain sudo[111526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:05:50 np0005532585.localdomain sudo[111526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:05:50 np0005532585.localdomain sudo[111541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:05:50 np0005532585.localdomain sudo[111541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:05:51 np0005532585.localdomain sudo[111541]: pam_unix(sudo:session): session closed for user root
Nov 23 09:05:51 np0005532585.localdomain sudo[111587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:05:51 np0005532585.localdomain sudo[111587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:05:51 np0005532585.localdomain sudo[111587]: pam_unix(sudo:session): session closed for user root
Nov 23 09:05:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56007 DF PROTO=TCP SPT=46236 DPT=9102 SEQ=420763077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9CE600000000001030307) 
Nov 23 09:05:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31173 DF PROTO=TCP SPT=35550 DPT=9882 SEQ=811106714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9DE200000000001030307) 
Nov 23 09:05:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47725 DF PROTO=TCP SPT=35522 DPT=9101 SEQ=2861703325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9E76A0000000001030307) 
Nov 23 09:05:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8950 DF PROTO=TCP SPT=50774 DPT=9105 SEQ=3687115265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9E81B0000000001030307) 
Nov 23 09:06:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8952 DF PROTO=TCP SPT=50774 DPT=9105 SEQ=3687115265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9F4200000000001030307) 
Nov 23 09:06:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8886 DF PROTO=TCP SPT=44354 DPT=9100 SEQ=438120719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA00210000000001030307) 
Nov 23 09:06:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14469 DF PROTO=TCP SPT=56816 DPT=9100 SEQ=2930898796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA0C600000000001030307) 
Nov 23 09:06:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32143 DF PROTO=TCP SPT=60372 DPT=9882 SEQ=1472291617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA1AA10000000001030307) 
Nov 23 09:06:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32144 DF PROTO=TCP SPT=60372 DPT=9882 SEQ=1472291617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA22A10000000001030307) 
Nov 23 09:06:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56009 DF PROTO=TCP SPT=46236 DPT=9102 SEQ=420763077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA2E200000000001030307) 
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 69516 (conmon) with signal SIGKILL.
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: libpod-conmon-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope: Deactivated successfully.
Nov 23 09:06:20 np0005532585.localdomain podman[111615]: error opening file `/run/crun/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f/status`: No such file or directory
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: No such file or directory
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: No such file or directory
Nov 23 09:06:20 np0005532585.localdomain podman[111602]: 2025-11-23 09:06:20.279561946 +0000 UTC m=+0.085577426 container cleanup 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 09:06:20 np0005532585.localdomain podman[111602]: ovn_metadata_agent
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Nov 23 09:06:20 np0005532585.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Nov 23 09:06:20 np0005532585.localdomain sudo[111451]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:20 np0005532585.localdomain sudo[111708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxdqmndymtxwhyqrdjuylfuzxkdrgcdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888780.4475465-114-179685923788330/AnsiballZ_systemd_service.py
Nov 23 09:06:20 np0005532585.localdomain sudo[111708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:20 np0005532585.localdomain python3.9[111710]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:06:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:06:21 np0005532585.localdomain systemd-sysv-generator[111740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:06:21 np0005532585.localdomain systemd-rc-local-generator[111736]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:06:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:06:21 np0005532585.localdomain sudo[111708]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:22 np0005532585.localdomain sudo[111838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnvoswsunibggemmmiopjymvglwnbbkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888782.2413714-564-26307199620609/AnsiballZ_file.py
Nov 23 09:06:22 np0005532585.localdomain sudo[111838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:22 np0005532585.localdomain python3.9[111840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:22 np0005532585.localdomain sudo[111838]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:23 np0005532585.localdomain sudo[111930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksqgeqliqgtutzubecucyanjxaudzcca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888783.0837753-564-106749804583277/AnsiballZ_file.py
Nov 23 09:06:23 np0005532585.localdomain sudo[111930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4660 DF PROTO=TCP SPT=60044 DPT=9102 SEQ=3991955285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA43A00000000001030307) 
Nov 23 09:06:23 np0005532585.localdomain python3.9[111932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:23 np0005532585.localdomain sudo[111930]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:23 np0005532585.localdomain sudo[112022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lisfbckfzmqlytajbarhftevqixpeybw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888783.59724-564-123256505572731/AnsiballZ_file.py
Nov 23 09:06:23 np0005532585.localdomain sudo[112022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:24 np0005532585.localdomain python3.9[112024]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:24 np0005532585.localdomain sudo[112022]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:24 np0005532585.localdomain sudo[112114]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyucyvbkebljqqyrxpaewmpkmqltlvpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888784.1679714-564-212449474092255/AnsiballZ_file.py
Nov 23 09:06:24 np0005532585.localdomain sudo[112114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:24 np0005532585.localdomain python3.9[112116]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:24 np0005532585.localdomain sudo[112114]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:25 np0005532585.localdomain sudo[112206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptbahwszpjfporomqphxqjhblocqwzrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888784.7796726-564-191908845788620/AnsiballZ_file.py
Nov 23 09:06:25 np0005532585.localdomain sudo[112206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:25 np0005532585.localdomain python3.9[112208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:25 np0005532585.localdomain sudo[112206]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:25 np0005532585.localdomain sudo[112298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiksmvjwxgweurcgccxzrrbmqkqkkyyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888785.3505678-564-261504171075471/AnsiballZ_file.py
Nov 23 09:06:25 np0005532585.localdomain sudo[112298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:25 np0005532585.localdomain python3.9[112300]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:25 np0005532585.localdomain sudo[112298]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:26 np0005532585.localdomain sudo[112390]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itoxmadpolorppcdljysasbbjifgnide ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888786.1037107-564-55608214739120/AnsiballZ_file.py
Nov 23 09:06:26 np0005532585.localdomain sudo[112390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:26 np0005532585.localdomain python3.9[112392]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:26 np0005532585.localdomain sudo[112390]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:26 np0005532585.localdomain sudo[112482]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmcqcvygipduharbmulymudqksosojar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888786.6900558-564-182053445117530/AnsiballZ_file.py
Nov 23 09:06:26 np0005532585.localdomain sudo[112482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32146 DF PROTO=TCP SPT=60372 DPT=9882 SEQ=1472291617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA52210000000001030307) 
Nov 23 09:06:27 np0005532585.localdomain python3.9[112484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:27 np0005532585.localdomain sudo[112482]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:27 np0005532585.localdomain sudo[112574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggbaujfgaxvnchwpplhjaapbpddtechd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888787.3218734-564-274092939519975/AnsiballZ_file.py
Nov 23 09:06:27 np0005532585.localdomain sudo[112574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:27 np0005532585.localdomain python3.9[112576]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:27 np0005532585.localdomain sudo[112574]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:28 np0005532585.localdomain sudo[112666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bczdokmenqutpzlrmvfzrukdntxhtkjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888787.8904185-564-162024292341274/AnsiballZ_file.py
Nov 23 09:06:28 np0005532585.localdomain sudo[112666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:28 np0005532585.localdomain python3.9[112668]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:28 np0005532585.localdomain sudo[112666]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:28 np0005532585.localdomain sshd[112728]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:06:28 np0005532585.localdomain sudo[112760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zisbwfefswrtbfkrkrzhzoiewcrpsuud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888788.462312-564-46662336315784/AnsiballZ_file.py
Nov 23 09:06:28 np0005532585.localdomain sudo[112760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:28 np0005532585.localdomain python3.9[112762]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:28 np0005532585.localdomain sudo[112760]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:29 np0005532585.localdomain sshd[112728]: Invalid user solana from 193.32.162.146 port 59852
Nov 23 09:06:29 np0005532585.localdomain sshd[112728]: Connection closed by invalid user solana 193.32.162.146 port 59852 [preauth]
Nov 23 09:06:29 np0005532585.localdomain sudo[112852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frbuxdiltmgbfohuvjjrumywpnhdqmpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888789.0713556-564-156247665355149/AnsiballZ_file.py
Nov 23 09:06:29 np0005532585.localdomain sudo[112852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:29 np0005532585.localdomain python3.9[112854]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:29 np0005532585.localdomain sudo[112852]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27570 DF PROTO=TCP SPT=58632 DPT=9101 SEQ=1385897793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA5C9B0000000001030307) 
Nov 23 09:06:29 np0005532585.localdomain sudo[112944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onkdslwcybtghyongzedzidglkpjmwzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888789.6128085-564-235485220736602/AnsiballZ_file.py
Nov 23 09:06:29 np0005532585.localdomain sudo[112944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45440 DF PROTO=TCP SPT=58254 DPT=9105 SEQ=3375843354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA5D4B0000000001030307) 
Nov 23 09:06:30 np0005532585.localdomain python3.9[112946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:30 np0005532585.localdomain sudo[112944]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:30 np0005532585.localdomain sudo[113036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzhrelfyhrfpxnhkmjtmoiqcnjspkzun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888790.2267923-564-204342369297021/AnsiballZ_file.py
Nov 23 09:06:30 np0005532585.localdomain sudo[113036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:30 np0005532585.localdomain python3.9[113038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:30 np0005532585.localdomain sudo[113036]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:31 np0005532585.localdomain sudo[113128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyjvgdwlvmqtlrfbodivpqmjtyvgvjdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888790.809832-564-255314094271551/AnsiballZ_file.py
Nov 23 09:06:31 np0005532585.localdomain sudo[113128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:31 np0005532585.localdomain python3.9[113130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:31 np0005532585.localdomain sudo[113128]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:31 np0005532585.localdomain sudo[113220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uscysrzpzmvqmhjwvsqbqqbeppyesnnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888791.6083481-564-120286036199258/AnsiballZ_file.py
Nov 23 09:06:31 np0005532585.localdomain sudo[113220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:32 np0005532585.localdomain python3.9[113222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:32 np0005532585.localdomain sudo[113220]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:32 np0005532585.localdomain sudo[113312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsvhtxmdkdogkhklzeoiabkhahzjhbdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888792.1982498-564-96632225765930/AnsiballZ_file.py
Nov 23 09:06:32 np0005532585.localdomain sudo[113312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:32 np0005532585.localdomain python3.9[113314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:32 np0005532585.localdomain sudo[113312]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27572 DF PROTO=TCP SPT=58632 DPT=9101 SEQ=1385897793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA68A10000000001030307) 
Nov 23 09:06:32 np0005532585.localdomain sudo[113404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iodhktyynwpvjavkumgpueomlfcvwnib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888792.7000275-564-78123425917304/AnsiballZ_file.py
Nov 23 09:06:32 np0005532585.localdomain sudo[113404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:33 np0005532585.localdomain python3.9[113406]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:33 np0005532585.localdomain sudo[113404]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:33 np0005532585.localdomain sudo[113496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pytwnvoyrpmuaqbfjsikeljaujqjcihs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888793.2126682-564-181898273724201/AnsiballZ_file.py
Nov 23 09:06:33 np0005532585.localdomain sudo[113496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:33 np0005532585.localdomain python3.9[113498]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:33 np0005532585.localdomain sudo[113496]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:33 np0005532585.localdomain sudo[113588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahdvojettxnriurhqqzdnyyouogksuen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888793.7396154-564-155338748535139/AnsiballZ_file.py
Nov 23 09:06:33 np0005532585.localdomain sudo[113588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:34 np0005532585.localdomain python3.9[113590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:34 np0005532585.localdomain sudo[113588]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:34 np0005532585.localdomain sudo[113680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqkbiazmrcxhlqagnzofzsyvtwbrliex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888794.374387-564-51795299696504/AnsiballZ_file.py
Nov 23 09:06:34 np0005532585.localdomain sudo[113680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:34 np0005532585.localdomain python3.9[113682]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:34 np0005532585.localdomain sudo[113680]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:36 np0005532585.localdomain sudo[113772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oarwktyuvcmzbdvrohcaqxygrgojiyay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888795.7414904-1014-49000304710034/AnsiballZ_file.py
Nov 23 09:06:36 np0005532585.localdomain sudo[113772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:36 np0005532585.localdomain python3.9[113774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:36 np0005532585.localdomain sudo[113772]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42727 DF PROTO=TCP SPT=51646 DPT=9100 SEQ=1282188036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA76200000000001030307) 
Nov 23 09:06:36 np0005532585.localdomain sudo[113864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfveeemfcazcbwxlvdxscuuysocpdtac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888796.3320935-1014-98630299077321/AnsiballZ_file.py
Nov 23 09:06:36 np0005532585.localdomain sudo[113864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:36 np0005532585.localdomain python3.9[113866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:36 np0005532585.localdomain sudo[113864]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:37 np0005532585.localdomain sudo[113956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyxswicojionhyqnbxadqdyyegijmxdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888797.1128852-1014-215038711111437/AnsiballZ_file.py
Nov 23 09:06:37 np0005532585.localdomain sudo[113956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:37 np0005532585.localdomain python3.9[113958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:37 np0005532585.localdomain sudo[113956]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:37 np0005532585.localdomain sudo[114048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hswtplxqnfokyqhbrtjupzqadfazvsqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888797.666556-1014-265590679793423/AnsiballZ_file.py
Nov 23 09:06:37 np0005532585.localdomain sudo[114048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:38 np0005532585.localdomain python3.9[114050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:38 np0005532585.localdomain sudo[114048]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:38 np0005532585.localdomain sudo[114140]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nayazosibnafflmbnynpiazmbavmqqaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888798.2260458-1014-38483359981965/AnsiballZ_file.py
Nov 23 09:06:38 np0005532585.localdomain sudo[114140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:38 np0005532585.localdomain python3.9[114142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:38 np0005532585.localdomain sudo[114140]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:39 np0005532585.localdomain sudo[114232]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tedwngpzlajizbioisqzzitxlkpwscit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888798.7713406-1014-42431767576587/AnsiballZ_file.py
Nov 23 09:06:39 np0005532585.localdomain sudo[114232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:39 np0005532585.localdomain python3.9[114234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:39 np0005532585.localdomain sudo[114232]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42173 DF PROTO=TCP SPT=36532 DPT=9100 SEQ=1382933724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA81A00000000001030307) 
Nov 23 09:06:39 np0005532585.localdomain sudo[114324]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydjcscahpzpzwwctzrpjvmqvzgtbrnwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888799.328834-1014-209946576194900/AnsiballZ_file.py
Nov 23 09:06:39 np0005532585.localdomain sudo[114324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:39 np0005532585.localdomain python3.9[114326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:39 np0005532585.localdomain sudo[114324]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:40 np0005532585.localdomain sudo[114416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsqbytsjuqwdazvqrqlxqshwhafyqglk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888799.900073-1014-49138432162561/AnsiballZ_file.py
Nov 23 09:06:40 np0005532585.localdomain sudo[114416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:40 np0005532585.localdomain python3.9[114418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:40 np0005532585.localdomain sudo[114416]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:40 np0005532585.localdomain sudo[114508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rouuoanwqqryzmthkfklgkigtavbhvxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888800.503405-1014-270682172293908/AnsiballZ_file.py
Nov 23 09:06:40 np0005532585.localdomain sudo[114508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:40 np0005532585.localdomain python3.9[114510]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:40 np0005532585.localdomain sudo[114508]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:41 np0005532585.localdomain sudo[114600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szfyunywkmhyknhybbnyzjxjphzxfehs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888801.0733523-1014-67044261903548/AnsiballZ_file.py
Nov 23 09:06:41 np0005532585.localdomain sudo[114600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:41 np0005532585.localdomain python3.9[114602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:41 np0005532585.localdomain sudo[114600]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:41 np0005532585.localdomain sudo[114692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqwmvbzxzryctvbzwqkcmnapxsqlqtnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888801.6505263-1014-189890849114071/AnsiballZ_file.py
Nov 23 09:06:41 np0005532585.localdomain sudo[114692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:42 np0005532585.localdomain python3.9[114694]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:42 np0005532585.localdomain sudo[114692]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:42 np0005532585.localdomain sudo[114784]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hytbbelkjhlhgiauzpusfenydbmknctp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888802.492435-1014-211686646965297/AnsiballZ_file.py
Nov 23 09:06:42 np0005532585.localdomain sudo[114784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2509 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4072214415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA8FE00000000001030307) 
Nov 23 09:06:42 np0005532585.localdomain python3.9[114786]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:42 np0005532585.localdomain sudo[114784]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:43 np0005532585.localdomain sudo[114876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-douoaxykwkiceqivpeccsejiofzehjdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888803.038506-1014-224474549940094/AnsiballZ_file.py
Nov 23 09:06:43 np0005532585.localdomain sudo[114876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:43 np0005532585.localdomain python3.9[114878]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:43 np0005532585.localdomain sudo[114876]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:43 np0005532585.localdomain sudo[114968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-legldznwocwvpclejptucntjvjmwaqoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888803.599422-1014-85247570639530/AnsiballZ_file.py
Nov 23 09:06:43 np0005532585.localdomain sudo[114968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:44 np0005532585.localdomain python3.9[114970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:44 np0005532585.localdomain sudo[114968]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:44 np0005532585.localdomain sudo[115060]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmtftjchwjmprjofdqjlwrznixpkbspw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888804.1643698-1014-124507450502091/AnsiballZ_file.py
Nov 23 09:06:44 np0005532585.localdomain sudo[115060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:44 np0005532585.localdomain python3.9[115062]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:44 np0005532585.localdomain sudo[115060]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2510 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4072214415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA97E10000000001030307) 
Nov 23 09:06:44 np0005532585.localdomain sudo[115152]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hognjimydrtcfcmyidmgptulmirfhgez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888804.7278442-1014-179753188856243/AnsiballZ_file.py
Nov 23 09:06:45 np0005532585.localdomain sudo[115152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:45 np0005532585.localdomain python3.9[115154]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:45 np0005532585.localdomain sudo[115152]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:45 np0005532585.localdomain sudo[115244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpqcwhjxcxurpgnksjskjdgccbsgpjvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888805.3624253-1014-200839484608066/AnsiballZ_file.py
Nov 23 09:06:45 np0005532585.localdomain sudo[115244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:45 np0005532585.localdomain python3.9[115246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:45 np0005532585.localdomain sudo[115244]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:46 np0005532585.localdomain sudo[115336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgctrzimbvrhnzazdwsfculyaptenizo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888805.925296-1014-110805039758104/AnsiballZ_file.py
Nov 23 09:06:46 np0005532585.localdomain sudo[115336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:46 np0005532585.localdomain python3.9[115338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:46 np0005532585.localdomain sudo[115336]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:46 np0005532585.localdomain sudo[115428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfqobkqxmrvdgqdcjmbdelpjyqocwlzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888806.4906983-1014-96967135696370/AnsiballZ_file.py
Nov 23 09:06:46 np0005532585.localdomain sudo[115428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:46 np0005532585.localdomain python3.9[115430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:46 np0005532585.localdomain sudo[115428]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:47 np0005532585.localdomain sudo[115520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnnctydmcssaczosfgvsuaqemhailekk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888807.0386214-1014-239219270602921/AnsiballZ_file.py
Nov 23 09:06:47 np0005532585.localdomain sudo[115520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:47 np0005532585.localdomain python3.9[115522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:47 np0005532585.localdomain sudo[115520]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:48 np0005532585.localdomain sudo[115612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obmhfvnslsbgfhhsyijkmhbemdomywex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888807.7334363-1014-99609859984101/AnsiballZ_file.py
Nov 23 09:06:48 np0005532585.localdomain sudo[115612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4662 DF PROTO=TCP SPT=60044 DPT=9102 SEQ=3991955285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAA4210000000001030307) 
Nov 23 09:06:48 np0005532585.localdomain python3.9[115614]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:06:48 np0005532585.localdomain sudo[115612]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:50 np0005532585.localdomain sudo[115704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqdyigvlkmsoxbmokeyhrwrtklxydhxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888809.7644546-1461-130615502792282/AnsiballZ_command.py
Nov 23 09:06:50 np0005532585.localdomain sudo[115704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:50 np0005532585.localdomain python3.9[115706]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:50 np0005532585.localdomain sudo[115704]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:51 np0005532585.localdomain python3.9[115798]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:06:51 np0005532585.localdomain sudo[115888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibgkgkvacepldcvvchhgfddgizmsdzou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888811.540767-1515-205972027944604/AnsiballZ_systemd_service.py
Nov 23 09:06:51 np0005532585.localdomain sudo[115888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:51 np0005532585.localdomain sudo[115891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:06:51 np0005532585.localdomain sudo[115891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:06:51 np0005532585.localdomain sudo[115891]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:52 np0005532585.localdomain sudo[115906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:06:52 np0005532585.localdomain sudo[115906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:06:52 np0005532585.localdomain python3.9[115890]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:06:52 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:06:52 np0005532585.localdomain systemd-rc-local-generator[115942]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:06:52 np0005532585.localdomain systemd-sysv-generator[115946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:06:52 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:06:52 np0005532585.localdomain sudo[115888]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:52 np0005532585.localdomain systemd[1]: tmp-crun.Ptevch.mount: Deactivated successfully.
Nov 23 09:06:52 np0005532585.localdomain podman[116066]: 2025-11-23 09:06:52.885326377 +0000 UTC m=+0.104343827 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, distribution-scope=public, release=553, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main)
Nov 23 09:06:53 np0005532585.localdomain podman[116066]: 2025-11-23 09:06:53.00630687 +0000 UTC m=+0.225324380 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, RELEASE=main)
Nov 23 09:06:53 np0005532585.localdomain sudo[116148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfvxqkimzogbwdakyaemxtddtoenfsbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888812.733268-1539-56520352523653/AnsiballZ_command.py
Nov 23 09:06:53 np0005532585.localdomain sudo[116148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:53 np0005532585.localdomain python3.9[116154]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:53 np0005532585.localdomain sudo[116148]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:53 np0005532585.localdomain sudo[115906]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:53 np0005532585.localdomain sudo[116201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:06:53 np0005532585.localdomain sudo[116201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:06:53 np0005532585.localdomain sudo[116201]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:53 np0005532585.localdomain sudo[116220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:06:53 np0005532585.localdomain sudo[116220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:06:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54890 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=2193620370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAB8E00000000001030307) 
Nov 23 09:06:53 np0005532585.localdomain sudo[116306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfrhvbpyaoafjvifijkfbuflcergbrlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888813.3980398-1539-236512783467477/AnsiballZ_command.py
Nov 23 09:06:53 np0005532585.localdomain sudo[116306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:53 np0005532585.localdomain python3.9[116308]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:53 np0005532585.localdomain sudo[116306]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:54 np0005532585.localdomain sudo[116220]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:54 np0005532585.localdomain sudo[116432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msxiavkiacyyhcqyhqnoquorovctrzqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888813.980862-1539-185207821421860/AnsiballZ_command.py
Nov 23 09:06:54 np0005532585.localdomain sudo[116432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:54 np0005532585.localdomain python3.9[116434]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:54 np0005532585.localdomain sudo[116432]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:54 np0005532585.localdomain sudo[116463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:06:54 np0005532585.localdomain sudo[116463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:06:54 np0005532585.localdomain sudo[116463]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:54 np0005532585.localdomain sudo[116540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsarhhjxfgzqxcrppyhzrdygoehzdoff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888814.5715778-1539-73500667472891/AnsiballZ_command.py
Nov 23 09:06:54 np0005532585.localdomain sudo[116540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:55 np0005532585.localdomain python3.9[116542]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:55 np0005532585.localdomain sudo[116540]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:55 np0005532585.localdomain sudo[116633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azoarlkzrzmsrmwcbtgjekcszuckponx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888815.2137127-1539-81730007951911/AnsiballZ_command.py
Nov 23 09:06:55 np0005532585.localdomain sudo[116633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:55 np0005532585.localdomain python3.9[116635]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:55 np0005532585.localdomain sudo[116633]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:56 np0005532585.localdomain sudo[116726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfcxkunqrdlgvqxfkvyitpuvczpyidob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888815.8060465-1539-79269424115893/AnsiballZ_command.py
Nov 23 09:06:56 np0005532585.localdomain sudo[116726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:56 np0005532585.localdomain python3.9[116728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:56 np0005532585.localdomain sudo[116726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:56 np0005532585.localdomain sudo[116819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxmdcmfvxbvegwqskpjcvewgmfqnghsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888816.3747907-1539-183525489955478/AnsiballZ_command.py
Nov 23 09:06:56 np0005532585.localdomain sudo[116819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:56 np0005532585.localdomain python3.9[116821]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:56 np0005532585.localdomain sudo[116819]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:57 np0005532585.localdomain sudo[116912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqskxkhhjzrkmlaejfwpxteyuzafeued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888816.9066622-1539-256881782630182/AnsiballZ_command.py
Nov 23 09:06:57 np0005532585.localdomain sudo[116912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2512 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4072214415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAC8200000000001030307) 
Nov 23 09:06:57 np0005532585.localdomain python3.9[116914]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:57 np0005532585.localdomain sudo[116912]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:57 np0005532585.localdomain sudo[117005]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbzyhnmumbvasbmvbrjdhuziymqmodgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888817.5056658-1539-214931008900219/AnsiballZ_command.py
Nov 23 09:06:57 np0005532585.localdomain sudo[117005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:57 np0005532585.localdomain python3.9[117007]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:57 np0005532585.localdomain sudo[117005]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:58 np0005532585.localdomain sudo[117098]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znqrqcdakfxkkggteltxnblzcbfoewip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888818.051606-1539-70889389212410/AnsiballZ_command.py
Nov 23 09:06:58 np0005532585.localdomain sudo[117098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:58 np0005532585.localdomain python3.9[117100]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:58 np0005532585.localdomain sudo[117098]: pam_unix(sudo:session): session closed for user root
Nov 23 09:06:58 np0005532585.localdomain sudo[117191]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnislcurpbznsmxbmzlimkcclqvvkprz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888818.6716628-1539-78819749508978/AnsiballZ_command.py
Nov 23 09:06:58 np0005532585.localdomain sudo[117191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:06:59 np0005532585.localdomain python3.9[117193]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:06:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60681 DF PROTO=TCP SPT=46284 DPT=9101 SEQ=4194018284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAD1C90000000001030307) 
Nov 23 09:06:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8316 DF PROTO=TCP SPT=39240 DPT=9105 SEQ=2249610514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAD27B0000000001030307) 
Nov 23 09:07:00 np0005532585.localdomain sudo[117191]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:00 np0005532585.localdomain sudo[117284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucvhulpwhiverhyvdqyyosnepipwbhch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888820.337263-1539-217435792673279/AnsiballZ_command.py
Nov 23 09:07:00 np0005532585.localdomain sudo[117284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:00 np0005532585.localdomain python3.9[117286]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:00 np0005532585.localdomain sudo[117284]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:01 np0005532585.localdomain sudo[117377]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouvowjuhzidmhkcrhfxpuiouigbqonww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888820.9433236-1539-163384635175571/AnsiballZ_command.py
Nov 23 09:07:01 np0005532585.localdomain sudo[117377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:01 np0005532585.localdomain python3.9[117379]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:01 np0005532585.localdomain sudo[117377]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:01 np0005532585.localdomain sudo[117470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfhhkpkonvrythwzdjnbjzfewvnxgirr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888821.570139-1539-14115748271469/AnsiballZ_command.py
Nov 23 09:07:01 np0005532585.localdomain sudo[117470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:02 np0005532585.localdomain python3.9[117472]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:02 np0005532585.localdomain sudo[117470]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:02 np0005532585.localdomain sudo[117563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzepkvuvntgdywfgluoxezbpcxmmtjfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888822.242624-1539-152002193392381/AnsiballZ_command.py
Nov 23 09:07:02 np0005532585.localdomain sudo[117563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:02 np0005532585.localdomain python3.9[117565]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:02 np0005532585.localdomain sudo[117563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60683 DF PROTO=TCP SPT=46284 DPT=9101 SEQ=4194018284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BADDE10000000001030307) 
Nov 23 09:07:03 np0005532585.localdomain sudo[117656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylfcbmavsiemhrjzsteooqrwzztfqvpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888822.93446-1539-165248539653988/AnsiballZ_command.py
Nov 23 09:07:03 np0005532585.localdomain sudo[117656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:03 np0005532585.localdomain python3.9[117658]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:04 np0005532585.localdomain sudo[117656]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:04 np0005532585.localdomain sudo[117749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czknsabsfssnoigevnraqzddwjeejpni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888824.5705934-1539-9179317243629/AnsiballZ_command.py
Nov 23 09:07:04 np0005532585.localdomain sudo[117749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:05 np0005532585.localdomain python3.9[117751]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:05 np0005532585.localdomain sudo[117749]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:05 np0005532585.localdomain sudo[117842]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrdhdoxqwyrtlwshrolirzjlxzasavxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888825.1714427-1539-113987657291489/AnsiballZ_command.py
Nov 23 09:07:05 np0005532585.localdomain sudo[117842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:05 np0005532585.localdomain python3.9[117844]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:05 np0005532585.localdomain sudo[117842]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:05 np0005532585.localdomain sudo[117935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxafbqoqffwwxnxixysvwesqtwvtbbqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888825.7278736-1539-28210493162310/AnsiballZ_command.py
Nov 23 09:07:05 np0005532585.localdomain sudo[117935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14472 DF PROTO=TCP SPT=56816 DPT=9100 SEQ=2930898796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAEA200000000001030307) 
Nov 23 09:07:06 np0005532585.localdomain python3.9[117937]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:06 np0005532585.localdomain sudo[117935]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:06 np0005532585.localdomain sudo[118028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkogxkpvdhjqvkjpinqnhbbfuwonmnbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888826.3161118-1539-17676889060436/AnsiballZ_command.py
Nov 23 09:07:06 np0005532585.localdomain sudo[118028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:06 np0005532585.localdomain python3.9[118030]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:06 np0005532585.localdomain sudo[118028]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:07 np0005532585.localdomain sudo[118121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyhzbekczordfsicpjnxjrupxibmlyhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888826.8928025-1539-149655481720739/AnsiballZ_command.py
Nov 23 09:07:07 np0005532585.localdomain sudo[118121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:07 np0005532585.localdomain python3.9[118123]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:07 np0005532585.localdomain sudo[118121]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38609 DF PROTO=TCP SPT=59828 DPT=9100 SEQ=3921597297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAF6A10000000001030307) 
Nov 23 09:07:11 np0005532585.localdomain sshd[106219]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:07:11 np0005532585.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Nov 23 09:07:11 np0005532585.localdomain systemd[1]: session-37.scope: Consumed 46.941s CPU time.
Nov 23 09:07:11 np0005532585.localdomain systemd-logind[761]: Session 37 logged out. Waiting for processes to exit.
Nov 23 09:07:11 np0005532585.localdomain systemd-logind[761]: Removed session 37.
Nov 23 09:07:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19102 DF PROTO=TCP SPT=39486 DPT=9882 SEQ=1997658962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB04E00000000001030307) 
Nov 23 09:07:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19103 DF PROTO=TCP SPT=39486 DPT=9882 SEQ=1997658962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB0CE10000000001030307) 
Nov 23 09:07:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54892 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=2193620370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB1A210000000001030307) 
Nov 23 09:07:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63455 DF PROTO=TCP SPT=56026 DPT=9102 SEQ=2178988037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB2E200000000001030307) 
Nov 23 09:07:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19105 DF PROTO=TCP SPT=39486 DPT=9882 SEQ=1997658962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB3C200000000001030307) 
Nov 23 09:07:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63821 DF PROTO=TCP SPT=59042 DPT=9101 SEQ=1271651755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB46F90000000001030307) 
Nov 23 09:07:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27142 DF PROTO=TCP SPT=59458 DPT=9105 SEQ=1292689378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB47AB0000000001030307) 
Nov 23 09:07:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63823 DF PROTO=TCP SPT=59042 DPT=9101 SEQ=1271651755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB53200000000001030307) 
Nov 23 09:07:36 np0005532585.localdomain sshd[118140]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:07:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42176 DF PROTO=TCP SPT=36532 DPT=9100 SEQ=1382933724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB60200000000001030307) 
Nov 23 09:07:36 np0005532585.localdomain sshd[118140]: Accepted publickey for zuul from 192.168.122.31 port 53098 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:07:36 np0005532585.localdomain systemd-logind[761]: New session 38 of user zuul.
Nov 23 09:07:36 np0005532585.localdomain systemd[1]: Started Session 38 of User zuul.
Nov 23 09:07:36 np0005532585.localdomain sshd[118140]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:07:37 np0005532585.localdomain python3.9[118233]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 23 09:07:38 np0005532585.localdomain python3.9[118337]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:07:38 np0005532585.localdomain sudo[118427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-futiuascvfbknbizaozigneyqdclzwlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888858.5025265-94-34205829375918/AnsiballZ_command.py
Nov 23 09:07:38 np0005532585.localdomain sudo[118427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:39 np0005532585.localdomain python3.9[118429]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:39 np0005532585.localdomain sudo[118427]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54385 DF PROTO=TCP SPT=48914 DPT=9100 SEQ=988562431 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB6BE10000000001030307) 
Nov 23 09:07:39 np0005532585.localdomain sudo[118520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyiusuqxfdazhlbcdnhixcpvjjelvgwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888859.573975-130-151457793208052/AnsiballZ_stat.py
Nov 23 09:07:39 np0005532585.localdomain sudo[118520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:40 np0005532585.localdomain python3.9[118522]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:07:40 np0005532585.localdomain sudo[118520]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:40 np0005532585.localdomain sudo[118612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxozcjxwnnmmhbltjjzzrbxbpxuqawvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888860.330151-154-100527365058333/AnsiballZ_file.py
Nov 23 09:07:40 np0005532585.localdomain sudo[118612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:40 np0005532585.localdomain python3.9[118614]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:07:40 np0005532585.localdomain sudo[118612]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:41 np0005532585.localdomain sudo[118704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxrmxnnjaylkqjfwolcjyhnfavothmts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888861.1344411-178-164186714208497/AnsiballZ_stat.py
Nov 23 09:07:41 np0005532585.localdomain sudo[118704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:41 np0005532585.localdomain python3.9[118706]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:07:41 np0005532585.localdomain sudo[118704]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:42 np0005532585.localdomain sudo[118777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-useqafzskpaiakksgueweuqeabfvgkro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888861.1344411-178-164186714208497/AnsiballZ_copy.py
Nov 23 09:07:42 np0005532585.localdomain sudo[118777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:42 np0005532585.localdomain python3.9[118779]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763888861.1344411-178-164186714208497/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:07:42 np0005532585.localdomain sudo[118777]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:42 np0005532585.localdomain sudo[118869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osgzjnliokofswjxujvmmilqkkstynsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888862.5254114-223-219133246980628/AnsiballZ_setup.py
Nov 23 09:07:42 np0005532585.localdomain sudo[118869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15774 DF PROTO=TCP SPT=48214 DPT=9882 SEQ=851862344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB7A200000000001030307) 
Nov 23 09:07:43 np0005532585.localdomain python3.9[118871]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:07:43 np0005532585.localdomain sudo[118869]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:43 np0005532585.localdomain sudo[118965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olsblhxyaepafgaovgozdloxlxmpctuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888863.5765607-247-144918858777382/AnsiballZ_file.py
Nov 23 09:07:43 np0005532585.localdomain sudo[118965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:43 np0005532585.localdomain python3.9[118967]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:07:44 np0005532585.localdomain sudo[118965]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:44 np0005532585.localdomain sudo[119057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaqvzmooqnnaarusepiabhrrgpfuqnhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888864.2703428-274-18063792523084/AnsiballZ_file.py
Nov 23 09:07:44 np0005532585.localdomain sudo[119057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:44 np0005532585.localdomain python3.9[119059]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:07:44 np0005532585.localdomain sudo[119057]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15775 DF PROTO=TCP SPT=48214 DPT=9882 SEQ=851862344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB82200000000001030307) 
Nov 23 09:07:45 np0005532585.localdomain python3.9[119149]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:07:45 np0005532585.localdomain network[119166]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:07:45 np0005532585.localdomain network[119167]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:07:45 np0005532585.localdomain network[119168]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:07:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:07:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63457 DF PROTO=TCP SPT=56026 DPT=9102 SEQ=2178988037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB8E200000000001030307) 
Nov 23 09:07:49 np0005532585.localdomain python3.9[119365]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:07:50 np0005532585.localdomain python3.9[119455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:07:51 np0005532585.localdomain sudo[119549]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vavpjdlcbnddxxxtlkawcqgkzhtlzmjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763888871.2351153-376-11164517587479/AnsiballZ_command.py
Nov 23 09:07:51 np0005532585.localdomain sudo[119549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:07:51 np0005532585.localdomain python3.9[119551]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                            # here we only ensuring that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:07:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16845 DF PROTO=TCP SPT=45158 DPT=9102 SEQ=3372202955 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBA3200000000001030307) 
Nov 23 09:07:54 np0005532585.localdomain sudo[119559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:07:54 np0005532585.localdomain sudo[119559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:07:54 np0005532585.localdomain sudo[119559]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:54 np0005532585.localdomain sudo[119574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:07:54 np0005532585.localdomain sudo[119574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:07:55 np0005532585.localdomain sudo[119574]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:56 np0005532585.localdomain sudo[119620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:07:56 np0005532585.localdomain sudo[119620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:07:56 np0005532585.localdomain sudo[119620]: pam_unix(sudo:session): session closed for user root
Nov 23 09:07:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15777 DF PROTO=TCP SPT=48214 DPT=9882 SEQ=851862344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBB2210000000001030307) 
Nov 23 09:07:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2105 DF PROTO=TCP SPT=33312 DPT=9101 SEQ=340619041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBBC2A0000000001030307) 
Nov 23 09:07:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32588 DF PROTO=TCP SPT=51366 DPT=9105 SEQ=1368777817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBBCDB0000000001030307) 
Nov 23 09:08:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32590 DF PROTO=TCP SPT=51366 DPT=9105 SEQ=1368777817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBC8E00000000001030307) 
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 23 09:08:05 np0005532585.localdomain sshd[45832]: Received signal 15; terminating.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: sshd.service: Consumed 1.614s CPU time.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 23 09:08:05 np0005532585.localdomain sshd[119670]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:08:05 np0005532585.localdomain sshd[119670]: Server listening on 0.0.0.0 port 22.
Nov 23 09:08:05 np0005532585.localdomain sshd[119670]: Server listening on :: port 22.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 09:08:05 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 09:08:05 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38612 DF PROTO=TCP SPT=59828 DPT=9100 SEQ=3921597297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBD4200000000001030307) 
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: run-r4658003025a440c5865624621cf2eda4.service: Deactivated successfully.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: run-r3d6d1e89840d494eac6a4fe4b9820ff2.service: Deactivated successfully.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 23 09:08:06 np0005532585.localdomain sshd[119670]: Received signal 15; terminating.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 23 09:08:06 np0005532585.localdomain sshd[120095]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:08:06 np0005532585.localdomain sshd[120095]: Server listening on 0.0.0.0 port 22.
Nov 23 09:08:06 np0005532585.localdomain sshd[120095]: Server listening on :: port 22.
Nov 23 09:08:06 np0005532585.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 23 09:08:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59732 DF PROTO=TCP SPT=51994 DPT=9100 SEQ=4145928038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBE1200000000001030307) 
Nov 23 09:08:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42355 DF PROTO=TCP SPT=41954 DPT=9882 SEQ=1328253083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBEB570000000001030307) 
Nov 23 09:08:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42357 DF PROTO=TCP SPT=41954 DPT=9882 SEQ=1328253083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBF7600000000001030307) 
Nov 23 09:08:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16847 DF PROTO=TCP SPT=45158 DPT=9102 SEQ=3372202955 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC04200000000001030307) 
Nov 23 09:08:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35865 DF PROTO=TCP SPT=43132 DPT=9102 SEQ=3585187217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC18600000000001030307) 
Nov 23 09:08:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42359 DF PROTO=TCP SPT=41954 DPT=9882 SEQ=1328253083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC28200000000001030307) 
Nov 23 09:08:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8146 DF PROTO=TCP SPT=45462 DPT=9101 SEQ=3146966238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC315A0000000001030307) 
Nov 23 09:08:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25335 DF PROTO=TCP SPT=50892 DPT=9105 SEQ=2536592468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC320B0000000001030307) 
Nov 23 09:08:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8148 DF PROTO=TCP SPT=45462 DPT=9101 SEQ=3146966238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC3D600000000001030307) 
Nov 23 09:08:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54388 DF PROTO=TCP SPT=48914 DPT=9100 SEQ=988562431 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC4A200000000001030307) 
Nov 23 09:08:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52397 DF PROTO=TCP SPT=37434 DPT=9100 SEQ=1444354613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC56600000000001030307) 
Nov 23 09:08:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19836 DF PROTO=TCP SPT=38490 DPT=9882 SEQ=3546743551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC60880000000001030307) 
Nov 23 09:08:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19838 DF PROTO=TCP SPT=38490 DPT=9882 SEQ=3546743551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC6CA00000000001030307) 
Nov 23 09:08:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35867 DF PROTO=TCP SPT=43132 DPT=9102 SEQ=3585187217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC78200000000001030307) 
Nov 23 09:08:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45553 DF PROTO=TCP SPT=44470 DPT=9102 SEQ=2349228216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC8DA00000000001030307) 
Nov 23 09:08:56 np0005532585.localdomain sudo[120407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:08:56 np0005532585.localdomain sudo[120407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:08:56 np0005532585.localdomain sudo[120407]: pam_unix(sudo:session): session closed for user root
Nov 23 09:08:56 np0005532585.localdomain sudo[120423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:08:56 np0005532585.localdomain sudo[120423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:08:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19840 DF PROTO=TCP SPT=38490 DPT=9882 SEQ=3546743551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC9C210000000001030307) 
Nov 23 09:08:57 np0005532585.localdomain sudo[120423]: pam_unix(sudo:session): session closed for user root
Nov 23 09:08:57 np0005532585.localdomain sudo[120484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:08:57 np0005532585.localdomain sudo[120484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:08:57 np0005532585.localdomain sudo[120484]: pam_unix(sudo:session): session closed for user root
Nov 23 09:08:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64365 DF PROTO=TCP SPT=53466 DPT=9101 SEQ=3268082378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCA68A0000000001030307) 
Nov 23 09:08:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40909 DF PROTO=TCP SPT=33646 DPT=9105 SEQ=3732032219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCA73B0000000001030307) 
Nov 23 09:09:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64367 DF PROTO=TCP SPT=53466 DPT=9101 SEQ=3268082378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCB2A10000000001030307) 
Nov 23 09:09:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59735 DF PROTO=TCP SPT=51994 DPT=9100 SEQ=4145928038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCC0210000000001030307) 
Nov 23 09:09:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55183 DF PROTO=TCP SPT=40146 DPT=9100 SEQ=3954503623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCCB610000000001030307) 
Nov 23 09:09:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18869 DF PROTO=TCP SPT=43202 DPT=9882 SEQ=4203088774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCD5B70000000001030307) 
Nov 23 09:09:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18871 DF PROTO=TCP SPT=43202 DPT=9882 SEQ=4203088774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCE1A10000000001030307) 
Nov 23 09:09:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45555 DF PROTO=TCP SPT=44470 DPT=9102 SEQ=2349228216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCEE210000000001030307) 
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  Converting 2753 SID table entries...
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:09:18 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:09:19 np0005532585.localdomain sudo[119549]: pam_unix(sudo:session): session closed for user root
Nov 23 09:09:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9231 DF PROTO=TCP SPT=47990 DPT=9102 SEQ=725927483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD02E00000000001030307) 
Nov 23 09:09:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18873 DF PROTO=TCP SPT=43202 DPT=9882 SEQ=4203088774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD12200000000001030307) 
Nov 23 09:09:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25452 DF PROTO=TCP SPT=58466 DPT=9101 SEQ=3728500903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD1BCC0000000001030307) 
Nov 23 09:09:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54929 DF PROTO=TCP SPT=45208 DPT=9105 SEQ=1747848640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD1C6B0000000001030307) 
Nov 23 09:09:30 np0005532585.localdomain sshd[120661]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:09:31 np0005532585.localdomain sshd[120661]: Invalid user solana from 193.32.162.146 port 40766
Nov 23 09:09:31 np0005532585.localdomain sshd[120661]: Connection closed by invalid user solana 193.32.162.146 port 40766 [preauth]
Nov 23 09:09:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25454 DF PROTO=TCP SPT=58466 DPT=9101 SEQ=3728500903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD27E00000000001030307) 
Nov 23 09:09:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52400 DF PROTO=TCP SPT=37434 DPT=9100 SEQ=1444354613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD34210000000001030307) 
Nov 23 09:09:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40258 DF PROTO=TCP SPT=47692 DPT=9100 SEQ=2806583860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD40A00000000001030307) 
Nov 23 09:09:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22938 DF PROTO=TCP SPT=54772 DPT=9882 SEQ=48654059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD4EE00000000001030307) 
Nov 23 09:09:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22939 DF PROTO=TCP SPT=54772 DPT=9882 SEQ=48654059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD56E00000000001030307) 
Nov 23 09:09:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9233 DF PROTO=TCP SPT=47990 DPT=9102 SEQ=725927483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD64200000000001030307) 
Nov 23 09:09:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21549 DF PROTO=TCP SPT=47754 DPT=9102 SEQ=1955267961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD77E00000000001030307) 
Nov 23 09:09:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22941 DF PROTO=TCP SPT=54772 DPT=9882 SEQ=48654059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD86200000000001030307) 
Nov 23 09:09:57 np0005532585.localdomain sudo[120664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:09:57 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Nov 23 09:09:57 np0005532585.localdomain sudo[120664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:09:57 np0005532585.localdomain sudo[120664]: pam_unix(sudo:session): session closed for user root
Nov 23 09:09:58 np0005532585.localdomain sudo[120679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:09:58 np0005532585.localdomain sudo[120679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:09:58 np0005532585.localdomain sudo[120679]: pam_unix(sudo:session): session closed for user root
Nov 23 09:09:59 np0005532585.localdomain sudo[120726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:09:59 np0005532585.localdomain sudo[120726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:09:59 np0005532585.localdomain sudo[120726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:09:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10183 DF PROTO=TCP SPT=39124 DPT=9101 SEQ=4154972505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD90EA0000000001030307) 
Nov 23 09:09:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2605 DF PROTO=TCP SPT=41726 DPT=9105 SEQ=737562409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD919B0000000001030307) 
Nov 23 09:10:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2607 DF PROTO=TCP SPT=41726 DPT=9105 SEQ=737562409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD9DA00000000001030307) 
Nov 23 09:10:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55186 DF PROTO=TCP SPT=40146 DPT=9100 SEQ=3954503623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDAA200000000001030307) 
Nov 23 09:10:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36949 DF PROTO=TCP SPT=42594 DPT=9100 SEQ=3324275901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDB5E00000000001030307) 
Nov 23 09:10:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12221 DF PROTO=TCP SPT=48334 DPT=9882 SEQ=1978559223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDC4200000000001030307) 
Nov 23 09:10:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12222 DF PROTO=TCP SPT=48334 DPT=9882 SEQ=1978559223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDCC200000000001030307) 
Nov 23 09:10:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21551 DF PROTO=TCP SPT=47754 DPT=9102 SEQ=1955267961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDD8200000000001030307) 
Nov 23 09:10:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42051 DF PROTO=TCP SPT=51010 DPT=9102 SEQ=2763510964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDED200000000001030307) 
Nov 23 09:10:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12224 DF PROTO=TCP SPT=48334 DPT=9882 SEQ=1978559223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDFC200000000001030307) 
Nov 23 09:10:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53453 DF PROTO=TCP SPT=44628 DPT=9101 SEQ=1408718102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE06190000000001030307) 
Nov 23 09:10:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8043 DF PROTO=TCP SPT=60518 DPT=9105 SEQ=1221331075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE06CB0000000001030307) 
Nov 23 09:10:30 np0005532585.localdomain sudo[120816]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifsaoecvwkpakkkmomfovkquyotxxmad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889029.8584847-403-124718824111597/AnsiballZ_file.py
Nov 23 09:10:30 np0005532585.localdomain sudo[120816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:30 np0005532585.localdomain python3.9[120818]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:10:30 np0005532585.localdomain sudo[120816]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:30 np0005532585.localdomain sudo[120908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uowjxbncrzkoaudtlaieebjuztauaxgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889030.572388-427-229022161023713/AnsiballZ_stat.py
Nov 23 09:10:30 np0005532585.localdomain sudo[120908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:31 np0005532585.localdomain python3.9[120910]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:10:31 np0005532585.localdomain sudo[120908]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:31 np0005532585.localdomain sudo[120981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riybdefypxybmvngqfcetfyilxkvpamq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889030.572388-427-229022161023713/AnsiballZ_copy.py
Nov 23 09:10:31 np0005532585.localdomain sudo[120981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:31 np0005532585.localdomain python3.9[120983]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889030.572388-427-229022161023713/.source.fact _original_basename=.96v1mfal follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:10:31 np0005532585.localdomain sudo[120981]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53455 DF PROTO=TCP SPT=44628 DPT=9101 SEQ=1408718102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE12200000000001030307) 
Nov 23 09:10:32 np0005532585.localdomain python3.9[121073]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:10:33 np0005532585.localdomain sudo[121169]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekzkrnwirghohqroacfosuycmyornuut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889033.4999223-502-178902092428228/AnsiballZ_setup.py
Nov 23 09:10:33 np0005532585.localdomain sudo[121169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:34 np0005532585.localdomain python3.9[121171]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:10:34 np0005532585.localdomain sudo[121169]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:34 np0005532585.localdomain sudo[121223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfeqmqwuvcxhuqdaacxdeuhcpledeant ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889033.4999223-502-178902092428228/AnsiballZ_dnf.py
Nov 23 09:10:34 np0005532585.localdomain sudo[121223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:35 np0005532585.localdomain python3.9[121225]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:10:35 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40261 DF PROTO=TCP SPT=47692 DPT=9100 SEQ=2806583860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE1E200000000001030307) 
Nov 23 09:10:38 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:10:38 np0005532585.localdomain systemd-sysv-generator[121260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:10:38 np0005532585.localdomain systemd-rc-local-generator[121257]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:10:38 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:10:38 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 09:10:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44916 DF PROTO=TCP SPT=54794 DPT=9100 SEQ=1547639768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE2B200000000001030307) 
Nov 23 09:10:39 np0005532585.localdomain sudo[121223]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:40 np0005532585.localdomain sudo[121363]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yccevmxmqmgcqxngxwmllfrbutreufkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889040.1162982-538-186614291371039/AnsiballZ_command.py
Nov 23 09:10:40 np0005532585.localdomain sudo[121363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:40 np0005532585.localdomain python3.9[121365]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:10:41 np0005532585.localdomain sudo[121363]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:42 np0005532585.localdomain sudo[121602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asrxmhczplvdhxtktmgmechxrjcfkiro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889041.5789726-562-143534146663978/AnsiballZ_selinux.py
Nov 23 09:10:42 np0005532585.localdomain sudo[121602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:42 np0005532585.localdomain python3.9[121604]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 23 09:10:42 np0005532585.localdomain sudo[121602]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40894 DF PROTO=TCP SPT=43820 DPT=9882 SEQ=2384301406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE39610000000001030307) 
Nov 23 09:10:43 np0005532585.localdomain sudo[121694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qppsfhkmfqrhkxjtkjkmdkwhvybadfwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889043.0482814-595-255698038039122/AnsiballZ_command.py
Nov 23 09:10:43 np0005532585.localdomain sudo[121694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:43 np0005532585.localdomain python3.9[121696]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 23 09:10:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40895 DF PROTO=TCP SPT=43820 DPT=9882 SEQ=2384301406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE41600000000001030307) 
Nov 23 09:10:45 np0005532585.localdomain sudo[121694]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:45 np0005532585.localdomain sudo[121787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjwqnwrwiahgglxfpduewslojevndnqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889045.2800093-619-181647981365170/AnsiballZ_file.py
Nov 23 09:10:45 np0005532585.localdomain sudo[121787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:45 np0005532585.localdomain python3.9[121789]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:10:45 np0005532585.localdomain sudo[121787]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:46 np0005532585.localdomain sudo[121879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sinywjvgfmukyejmptjfhrczmdzefidh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889045.9519694-643-98072388595822/AnsiballZ_mount.py
Nov 23 09:10:46 np0005532585.localdomain sudo[121879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:46 np0005532585.localdomain python3.9[121881]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 23 09:10:46 np0005532585.localdomain sudo[121879]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:47 np0005532585.localdomain sudo[121971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxrfamhuibiaooibldllvvtqhzcprlpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889047.7201822-727-281056544496150/AnsiballZ_file.py
Nov 23 09:10:47 np0005532585.localdomain sudo[121971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:48 np0005532585.localdomain python3.9[121973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:10:48 np0005532585.localdomain sudo[121971]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42053 DF PROTO=TCP SPT=51010 DPT=9102 SEQ=2763510964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE4E200000000001030307) 
Nov 23 09:10:48 np0005532585.localdomain sudo[122063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzwgcyejtcudbkckbjvsznemoucylnrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889048.4341-751-174432665865727/AnsiballZ_stat.py
Nov 23 09:10:48 np0005532585.localdomain sudo[122063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:48 np0005532585.localdomain python3.9[122065]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:10:48 np0005532585.localdomain sudo[122063]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:49 np0005532585.localdomain sudo[122136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrqzmjdlbmkkkwwmmggegxvcxeaklxlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889048.4341-751-174432665865727/AnsiballZ_copy.py
Nov 23 09:10:49 np0005532585.localdomain sudo[122136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:53 np0005532585.localdomain python3.9[122138]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889048.4341-751-174432665865727/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:10:53 np0005532585.localdomain sudo[122136]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7635 DF PROTO=TCP SPT=52454 DPT=9102 SEQ=3828433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE62600000000001030307) 
Nov 23 09:10:54 np0005532585.localdomain sudo[122229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edjimukchedgmhbequeupdhkbumsgere ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889054.5026197-823-108042718120760/AnsiballZ_stat.py
Nov 23 09:10:54 np0005532585.localdomain sudo[122229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:54 np0005532585.localdomain python3.9[122231]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:10:54 np0005532585.localdomain sudo[122229]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:56 np0005532585.localdomain sudo[122323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixczusvguvxjdetvbnptqujkqzfhwaul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889055.8231447-862-144848685315207/AnsiballZ_getent.py
Nov 23 09:10:56 np0005532585.localdomain sudo[122323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:56 np0005532585.localdomain python3.9[122325]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 23 09:10:56 np0005532585.localdomain sudo[122323]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:57 np0005532585.localdomain sudo[122416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgdifccagjwlnytcyxcfhfjhxjvuwjcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889056.869575-892-255429298620746/AnsiballZ_getent.py
Nov 23 09:10:57 np0005532585.localdomain sudo[122416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:57 np0005532585.localdomain python3.9[122418]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 23 09:10:57 np0005532585.localdomain sudo[122416]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40897 DF PROTO=TCP SPT=43820 DPT=9882 SEQ=2384301406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE72200000000001030307) 
Nov 23 09:10:58 np0005532585.localdomain sudo[122509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tacambqpvrwbvbberppjaoyskjifrzyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889057.6490092-916-246993389948658/AnsiballZ_group.py
Nov 23 09:10:58 np0005532585.localdomain sudo[122509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:58 np0005532585.localdomain python3.9[122511]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 09:10:58 np0005532585.localdomain groupmod[122512]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Nov 23 09:10:58 np0005532585.localdomain groupmod[122512]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Nov 23 09:10:58 np0005532585.localdomain sudo[122509]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:58 np0005532585.localdomain sudo[122607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sruqolarkgiooikbdhqwfnjjvznwswjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889058.6824584-943-123854262900876/AnsiballZ_file.py
Nov 23 09:10:58 np0005532585.localdomain sudo[122607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:10:59 np0005532585.localdomain python3.9[122609]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 23 09:10:59 np0005532585.localdomain sudo[122607]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:59 np0005532585.localdomain sudo[122624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:10:59 np0005532585.localdomain sudo[122624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:10:59 np0005532585.localdomain sudo[122624]: pam_unix(sudo:session): session closed for user root
Nov 23 09:10:59 np0005532585.localdomain sudo[122639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:10:59 np0005532585.localdomain sudo[122639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:10:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4894 DF PROTO=TCP SPT=50386 DPT=9101 SEQ=1257906011 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE7B4A0000000001030307) 
Nov 23 09:10:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32158 DF PROTO=TCP SPT=47962 DPT=9105 SEQ=689963143 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE7BFB0000000001030307) 
Nov 23 09:11:00 np0005532585.localdomain sudo[122749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cukoghcjyfskfkcrpvabmfjbmwaarwgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889059.9323518-976-205501991375187/AnsiballZ_dnf.py
Nov 23 09:11:00 np0005532585.localdomain sudo[122749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:00 np0005532585.localdomain sudo[122639]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:00 np0005532585.localdomain python3.9[122757]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:11:00 np0005532585.localdomain sudo[122765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:11:00 np0005532585.localdomain sudo[122765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:11:00 np0005532585.localdomain sudo[122765]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:00 np0005532585.localdomain sudo[122780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:11:00 np0005532585.localdomain sudo[122780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:11:01 np0005532585.localdomain sudo[122780]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4896 DF PROTO=TCP SPT=50386 DPT=9101 SEQ=1257906011 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE87610000000001030307) 
Nov 23 09:11:03 np0005532585.localdomain sudo[122749]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:04 np0005532585.localdomain sudo[122905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puxdoafoabvqsytrytojielalxjgcasw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889063.9617317-1000-64533274666192/AnsiballZ_file.py
Nov 23 09:11:04 np0005532585.localdomain sudo[122905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:04 np0005532585.localdomain python3.9[122907]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:04 np0005532585.localdomain sudo[122905]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:05 np0005532585.localdomain sudo[122922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:11:05 np0005532585.localdomain sudo[122922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:11:05 np0005532585.localdomain sudo[122922]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36952 DF PROTO=TCP SPT=42594 DPT=9100 SEQ=3324275901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE94200000000001030307) 
Nov 23 09:11:08 np0005532585.localdomain sudo[123012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afjtjyksnpaqxtzrrspvfwucaipxmqvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889068.0918384-1024-236790695428880/AnsiballZ_stat.py
Nov 23 09:11:08 np0005532585.localdomain sudo[123012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:08 np0005532585.localdomain python3.9[123014]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:11:08 np0005532585.localdomain sudo[123012]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:08 np0005532585.localdomain sudo[123085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nskrpeejmjqcvhicfpevqgtxzjiaootr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889068.0918384-1024-236790695428880/AnsiballZ_copy.py
Nov 23 09:11:08 np0005532585.localdomain sudo[123085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:09 np0005532585.localdomain python3.9[123087]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889068.0918384-1024-236790695428880/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:09 np0005532585.localdomain sudo[123085]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14529 DF PROTO=TCP SPT=35280 DPT=9100 SEQ=1392506286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEA0200000000001030307) 
Nov 23 09:11:10 np0005532585.localdomain sudo[123177]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwlsbgfgzqwkwsgtrgzvtdmpdvjgsnsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889070.025078-1069-128148866207242/AnsiballZ_systemd.py
Nov 23 09:11:10 np0005532585.localdomain sudo[123177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:10 np0005532585.localdomain python3.9[123179]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:11:10 np0005532585.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 09:11:10 np0005532585.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 23 09:11:10 np0005532585.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 23 09:11:10 np0005532585.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 23 09:11:10 np0005532585.localdomain systemd-modules-load[123183]: Module 'msr' is built in
Nov 23 09:11:10 np0005532585.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 23 09:11:10 np0005532585.localdomain sudo[123177]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:11 np0005532585.localdomain sudo[123273]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naqzjinrgmiglnrzoifkjcwctatnjobe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889071.1764095-1093-17762772445675/AnsiballZ_stat.py
Nov 23 09:11:11 np0005532585.localdomain sudo[123273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:11 np0005532585.localdomain python3.9[123275]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:11:11 np0005532585.localdomain sudo[123273]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:11 np0005532585.localdomain sudo[123346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcorbvgmbnxmroakepfqwfmoswnwgjsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889071.1764095-1093-17762772445675/AnsiballZ_copy.py
Nov 23 09:11:11 np0005532585.localdomain sudo[123346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:12 np0005532585.localdomain python3.9[123348]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889071.1764095-1093-17762772445675/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:12 np0005532585.localdomain sudo[123346]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2367 DF PROTO=TCP SPT=53392 DPT=9882 SEQ=3621492434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEAE600000000001030307) 
Nov 23 09:11:13 np0005532585.localdomain sudo[123438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxkvaowulmsqybhbdgodblvmpksomjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889072.7810035-1147-155105484962428/AnsiballZ_dnf.py
Nov 23 09:11:13 np0005532585.localdomain sudo[123438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:13 np0005532585.localdomain python3.9[123440]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:11:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2368 DF PROTO=TCP SPT=53392 DPT=9882 SEQ=3621492434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEB6610000000001030307) 
Nov 23 09:11:16 np0005532585.localdomain sudo[123438]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:17 np0005532585.localdomain python3.9[123532]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:11:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7637 DF PROTO=TCP SPT=52454 DPT=9102 SEQ=3828433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEC2200000000001030307) 
Nov 23 09:11:18 np0005532585.localdomain python3.9[123624]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 23 09:11:19 np0005532585.localdomain python3.9[123714]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:11:20 np0005532585.localdomain sudo[123804]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmrnteypodmcqpnwpeoppcxfqavrywto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889079.960323-1270-46407088829592/AnsiballZ_systemd.py
Nov 23 09:11:20 np0005532585.localdomain sudo[123804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:20 np0005532585.localdomain python3.9[123806]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:11:21 np0005532585.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 23 09:11:21 np0005532585.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 23 09:11:21 np0005532585.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 23 09:11:21 np0005532585.localdomain systemd[1]: tuned.service: Consumed 1.684s CPU time, no IO.
Nov 23 09:11:21 np0005532585.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 09:11:22 np0005532585.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 09:11:22 np0005532585.localdomain sudo[123804]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34201 DF PROTO=TCP SPT=40340 DPT=9102 SEQ=836705543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BED7A10000000001030307) 
Nov 23 09:11:23 np0005532585.localdomain python3.9[123909]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 23 09:11:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2370 DF PROTO=TCP SPT=53392 DPT=9882 SEQ=3621492434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEE6200000000001030307) 
Nov 23 09:11:27 np0005532585.localdomain sudo[123999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyvwkurokdhtygqtwarbhejfvbctkmes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889087.1193888-1442-32820689028330/AnsiballZ_systemd.py
Nov 23 09:11:27 np0005532585.localdomain sudo[123999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:27 np0005532585.localdomain python3.9[124001]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:11:27 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:11:27 np0005532585.localdomain systemd-rc-local-generator[124027]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:11:27 np0005532585.localdomain systemd-sysv-generator[124031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:11:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:11:28 np0005532585.localdomain sudo[123999]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:28 np0005532585.localdomain sudo[124129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxrqofaaolwrkdmyvosjkgfwhxjtybrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889088.2318406-1442-61277034899142/AnsiballZ_systemd.py
Nov 23 09:11:28 np0005532585.localdomain sudo[124129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:28 np0005532585.localdomain python3.9[124131]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:11:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:11:29 np0005532585.localdomain systemd-rc-local-generator[124157]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:11:29 np0005532585.localdomain systemd-sysv-generator[124161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:11:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:11:29 np0005532585.localdomain sudo[124129]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37934 DF PROTO=TCP SPT=50862 DPT=9101 SEQ=128498723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEF07A0000000001030307) 
Nov 23 09:11:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20275 DF PROTO=TCP SPT=60654 DPT=9105 SEQ=437288969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEF12B0000000001030307) 
Nov 23 09:11:31 np0005532585.localdomain sudo[124259]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkssoxtiltwykoiqakiydrnfxrygevuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889091.6257331-1489-235621710908224/AnsiballZ_command.py
Nov 23 09:11:31 np0005532585.localdomain sudo[124259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:32 np0005532585.localdomain python3.9[124261]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:11:32 np0005532585.localdomain sudo[124259]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:32 np0005532585.localdomain sudo[124352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jazcgdkbtuvvvtsfxrbmuprllmnxcpnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889092.3287263-1513-215617841961679/AnsiballZ_command.py
Nov 23 09:11:32 np0005532585.localdomain sudo[124352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:32 np0005532585.localdomain python3.9[124354]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:11:32 np0005532585.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Nov 23 09:11:32 np0005532585.localdomain sudo[124352]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37936 DF PROTO=TCP SPT=50862 DPT=9101 SEQ=128498723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEFCA00000000001030307) 
Nov 23 09:11:33 np0005532585.localdomain sudo[124445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eergqjmzcerotmwcsoiuvgjcpapracvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889093.0792787-1537-67629875690015/AnsiballZ_command.py
Nov 23 09:11:33 np0005532585.localdomain sudo[124445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:33 np0005532585.localdomain python3.9[124447]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:11:34 np0005532585.localdomain sudo[124445]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:35 np0005532585.localdomain sudo[124544]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eilhujdshjbunhnbipnwbwddjshzryvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889094.8642375-1561-237983125664850/AnsiballZ_command.py
Nov 23 09:11:35 np0005532585.localdomain sudo[124544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:35 np0005532585.localdomain python3.9[124546]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:11:35 np0005532585.localdomain sudo[124544]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:35 np0005532585.localdomain sudo[124637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnlhibgcfysjqopizuwgywbceoggsbdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889095.6215422-1586-147360023800327/AnsiballZ_systemd.py
Nov 23 09:11:35 np0005532585.localdomain sudo[124637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:36 np0005532585.localdomain python3.9[124639]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: Stopped Apply Kernel Variables.
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: Stopping Apply Kernel Variables...
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: Starting Apply Kernel Variables...
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: Finished Apply Kernel Variables.
Nov 23 09:11:36 np0005532585.localdomain sudo[124637]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44919 DF PROTO=TCP SPT=54794 DPT=9100 SEQ=1547639768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF0A200000000001030307) 
Nov 23 09:11:36 np0005532585.localdomain sshd[118140]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:11:36 np0005532585.localdomain systemd-logind[761]: Session 38 logged out. Waiting for processes to exit.
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Nov 23 09:11:36 np0005532585.localdomain systemd[1]: session-38.scope: Consumed 1min 55.100s CPU time.
Nov 23 09:11:36 np0005532585.localdomain systemd-logind[761]: Removed session 38.
Nov 23 09:11:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15692 DF PROTO=TCP SPT=38776 DPT=9100 SEQ=4206993237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF15610000000001030307) 
Nov 23 09:11:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51629 DF PROTO=TCP SPT=50654 DPT=9882 SEQ=3001749266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF1FA90000000001030307) 
Nov 23 09:11:42 np0005532585.localdomain sshd[124659]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:11:42 np0005532585.localdomain sshd[124659]: Accepted publickey for zuul from 192.168.122.31 port 53780 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:11:42 np0005532585.localdomain systemd-logind[761]: New session 39 of user zuul.
Nov 23 09:11:42 np0005532585.localdomain systemd[1]: Started Session 39 of User zuul.
Nov 23 09:11:42 np0005532585.localdomain sshd[124659]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:11:43 np0005532585.localdomain python3.9[124752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:11:44 np0005532585.localdomain python3.9[124846]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:11:44 np0005532585.localdomain sshd[124851]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:11:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51631 DF PROTO=TCP SPT=50654 DPT=9882 SEQ=3001749266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF2BA00000000001030307) 
Nov 23 09:11:46 np0005532585.localdomain sudo[124942]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exourefmmnyzueuhahqkahsypnpgsihy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889105.7137232-111-104185397132903/AnsiballZ_command.py
Nov 23 09:11:46 np0005532585.localdomain sudo[124942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:46 np0005532585.localdomain python3.9[124944]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:11:46 np0005532585.localdomain sudo[124942]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:47 np0005532585.localdomain python3.9[125035]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:11:47 np0005532585.localdomain sshd[124851]: Invalid user  from 14.103.205.40 port 37494
Nov 23 09:11:48 np0005532585.localdomain sudo[125129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uphjukrpqsyqmtuajycydriqlwfcgizk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889107.8767688-171-254716600648149/AnsiballZ_setup.py
Nov 23 09:11:48 np0005532585.localdomain sudo[125129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34203 DF PROTO=TCP SPT=40340 DPT=9102 SEQ=836705543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF38200000000001030307) 
Nov 23 09:11:48 np0005532585.localdomain python3.9[125131]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:11:48 np0005532585.localdomain sudo[125129]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:49 np0005532585.localdomain sudo[125183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kglznphmbpwzxgrotvgtuoydlroiedch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889107.8767688-171-254716600648149/AnsiballZ_dnf.py
Nov 23 09:11:49 np0005532585.localdomain sudo[125183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:49 np0005532585.localdomain python3.9[125185]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:11:52 np0005532585.localdomain sudo[125183]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:52 np0005532585.localdomain sshd[124851]: Connection closed by invalid user  14.103.205.40 port 37494 [preauth]
Nov 23 09:11:53 np0005532585.localdomain sudo[125277]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqoyoacpedddhvcsvavssafocvcdqkat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889112.758789-207-249087032487631/AnsiballZ_setup.py
Nov 23 09:11:53 np0005532585.localdomain sudo[125277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:53 np0005532585.localdomain python3.9[125279]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:11:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8145 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=3339985482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF4CA10000000001030307) 
Nov 23 09:11:53 np0005532585.localdomain sudo[125277]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:54 np0005532585.localdomain sudo[125432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgrdjpohoseqaxdksvwgufrqekzibwif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889114.03681-240-38497339014385/AnsiballZ_file.py
Nov 23 09:11:54 np0005532585.localdomain sudo[125432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:54 np0005532585.localdomain python3.9[125434]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:11:54 np0005532585.localdomain sudo[125432]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:55 np0005532585.localdomain sudo[125524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxcmgcwmgxzfikzdgbndnjhslfyqnesh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889114.8625264-264-123620648335148/AnsiballZ_command.py
Nov 23 09:11:55 np0005532585.localdomain sudo[125524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:55 np0005532585.localdomain python3.9[125526]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:11:55 np0005532585.localdomain sudo[125524]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:56 np0005532585.localdomain sudo[125627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hierpssazgeebzgncenuivnaqalanvgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889115.6027849-288-182960956743300/AnsiballZ_stat.py
Nov 23 09:11:56 np0005532585.localdomain sudo[125627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:56 np0005532585.localdomain python3.9[125629]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:11:56 np0005532585.localdomain sudo[125627]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:56 np0005532585.localdomain sudo[125675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlothsvkrhzzvtwpwjrnrdnctmivzqhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889115.6027849-288-182960956743300/AnsiballZ_file.py
Nov 23 09:11:56 np0005532585.localdomain sudo[125675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:56 np0005532585.localdomain python3.9[125677]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:11:56 np0005532585.localdomain sudo[125675]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:57 np0005532585.localdomain sudo[125767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybyvsitmqdikxcimxhrdowuxzgfqyykq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889116.7825787-324-136294773177031/AnsiballZ_stat.py
Nov 23 09:11:57 np0005532585.localdomain sudo[125767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:57 np0005532585.localdomain python3.9[125769]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:11:57 np0005532585.localdomain sudo[125767]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51633 DF PROTO=TCP SPT=50654 DPT=9882 SEQ=3001749266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF5C210000000001030307) 
Nov 23 09:11:57 np0005532585.localdomain sudo[125840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksspjdnpkorauaiqgyxcvoekiopkhluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889116.7825787-324-136294773177031/AnsiballZ_copy.py
Nov 23 09:11:57 np0005532585.localdomain sudo[125840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:57 np0005532585.localdomain python3.9[125842]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889116.7825787-324-136294773177031/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:57 np0005532585.localdomain sudo[125840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:58 np0005532585.localdomain sudo[125932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfzkyuhwjohshnebtatczgkouxssnagy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889118.1476495-372-13820643478944/AnsiballZ_ini_file.py
Nov 23 09:11:58 np0005532585.localdomain sudo[125932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:58 np0005532585.localdomain python3.9[125934]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:58 np0005532585.localdomain sudo[125932]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:58 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 23 09:11:58 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 09:11:58 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:11:58 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:11:58 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:11:59 np0005532585.localdomain sudo[126025]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyozdlnwcjfybtaaxrmltojfslaywfxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889118.9806464-372-267444389769812/AnsiballZ_ini_file.py
Nov 23 09:11:59 np0005532585.localdomain sudo[126025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:59 np0005532585.localdomain python3.9[126027]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:59 np0005532585.localdomain sudo[126025]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:59 np0005532585.localdomain sudo[126117]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlxrxdcjhqnvaaxrrergzyrezuyhoouz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889119.5474784-372-20081150397010/AnsiballZ_ini_file.py
Nov 23 09:11:59 np0005532585.localdomain sudo[126117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:11:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39234 DF PROTO=TCP SPT=34904 DPT=9101 SEQ=293917994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF65AA0000000001030307) 
Nov 23 09:11:59 np0005532585.localdomain python3.9[126119]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:11:59 np0005532585.localdomain sudo[126117]: pam_unix(sudo:session): session closed for user root
Nov 23 09:11:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48078 DF PROTO=TCP SPT=49596 DPT=9105 SEQ=2664954968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF665B0000000001030307) 
Nov 23 09:12:00 np0005532585.localdomain sudo[126209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isonfktynueapmmctmdvjpqnbxsdosri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889120.0859628-372-278777178747272/AnsiballZ_ini_file.py
Nov 23 09:12:00 np0005532585.localdomain sudo[126209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:00 np0005532585.localdomain python3.9[126211]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:12:00 np0005532585.localdomain sudo[126209]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:12:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:12:01 np0005532585.localdomain python3.9[126301]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:12:02 np0005532585.localdomain sudo[126393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnihqiluurjkdwuwbpotjlwmccnpwbad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889121.8005552-492-170733488155970/AnsiballZ_dnf.py
Nov 23 09:12:02 np0005532585.localdomain sudo[126393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:02 np0005532585.localdomain python3.9[126395]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48080 DF PROTO=TCP SPT=49596 DPT=9105 SEQ=2664954968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF72610000000001030307) 
Nov 23 09:12:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:12:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:12:05 np0005532585.localdomain sudo[126393]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:05 np0005532585.localdomain sudo[126412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:12:05 np0005532585.localdomain sudo[126412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:12:05 np0005532585.localdomain sudo[126412]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:05 np0005532585.localdomain sudo[126427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:12:05 np0005532585.localdomain sudo[126427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:12:06 np0005532585.localdomain sudo[126517]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmclwymkgkemjmmibrgpmflxzqgwvnel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889125.7171614-516-169918054732463/AnsiballZ_dnf.py
Nov 23 09:12:06 np0005532585.localdomain sudo[126517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14532 DF PROTO=TCP SPT=35280 DPT=9100 SEQ=1392506286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF7E210000000001030307) 
Nov 23 09:12:06 np0005532585.localdomain python3.9[126521]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:06 np0005532585.localdomain sudo[126427]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64618 DF PROTO=TCP SPT=55576 DPT=9100 SEQ=2967965951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF8AA00000000001030307) 
Nov 23 09:12:09 np0005532585.localdomain sudo[126517]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:09 np0005532585.localdomain sudo[126568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:12:09 np0005532585.localdomain sudo[126568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:12:09 np0005532585.localdomain sudo[126568]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:10 np0005532585.localdomain sudo[126658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hesouzhyvixpgwexxauaolqwjrsghatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889129.97849-546-166136358931341/AnsiballZ_dnf.py
Nov 23 09:12:10 np0005532585.localdomain sudo[126658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:10 np0005532585.localdomain python3.9[126660]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42123 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF94D70000000001030307) 
Nov 23 09:12:13 np0005532585.localdomain sudo[126658]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:14 np0005532585.localdomain sudo[126758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neliynkfhhxkhkxmmuewnjausdgtsfnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889134.3225844-573-107283374451012/AnsiballZ_dnf.py
Nov 23 09:12:14 np0005532585.localdomain sudo[126758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:14 np0005532585.localdomain python3.9[126760]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42125 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFA0E10000000001030307) 
Nov 23 09:12:17 np0005532585.localdomain sudo[126758]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42126 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFB0A00000000001030307) 
Nov 23 09:12:19 np0005532585.localdomain sudo[126852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toqbojrikyxpqmlcvvvcyvztdiqwqnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889138.8740995-609-249739727541321/AnsiballZ_dnf.py
Nov 23 09:12:19 np0005532585.localdomain sudo[126852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:19 np0005532585.localdomain python3.9[126854]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:22 np0005532585.localdomain sudo[126852]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:23 np0005532585.localdomain sudo[126946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geyxkortgnjgfpfzbibgibvdufhmczxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889143.0070171-636-216245670560265/AnsiballZ_dnf.py
Nov 23 09:12:23 np0005532585.localdomain sudo[126946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42907 DF PROTO=TCP SPT=38660 DPT=9102 SEQ=3011424843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFC1E00000000001030307) 
Nov 23 09:12:23 np0005532585.localdomain python3.9[126948]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:26 np0005532585.localdomain sudo[126946]: pam_unix(sudo:session): session closed for user root
Nov 23 09:12:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42127 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFD0200000000001030307) 
Nov 23 09:12:27 np0005532585.localdomain sudo[127040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kigquztnjfddwzgihhalsslpaphbwnht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889146.9569774-663-169870414845980/AnsiballZ_dnf.py
Nov 23 09:12:27 np0005532585.localdomain sudo[127040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:12:27 np0005532585.localdomain python3.9[127042]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:12:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19981 DF PROTO=TCP SPT=56074 DPT=9101 SEQ=1735143310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFDADA0000000001030307) 
Nov 23 09:12:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11134 DF PROTO=TCP SPT=43096 DPT=9105 SEQ=4290280602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFDB8B0000000001030307) 
Nov 23 09:12:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19983 DF PROTO=TCP SPT=56074 DPT=9101 SEQ=1735143310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFE6E00000000001030307) 
Nov 23 09:12:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15695 DF PROTO=TCP SPT=38776 DPT=9100 SEQ=4206993237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFF4200000000001030307) 
Nov 23 09:12:38 np0005532585.localdomain sshd[127054]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:12:38 np0005532585.localdomain sshd[127054]: Invalid user sol from 193.32.162.146 port 49910
Nov 23 09:12:38 np0005532585.localdomain sshd[127054]: Connection closed by invalid user sol 193.32.162.146 port 49910 [preauth]
Nov 23 09:12:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57766 DF PROTO=TCP SPT=43322 DPT=9100 SEQ=2162066529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFFFE00000000001030307) 
Nov 23 09:12:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53595 DF PROTO=TCP SPT=40562 DPT=9882 SEQ=853935234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C00E200000000001030307) 
Nov 23 09:12:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53596 DF PROTO=TCP SPT=40562 DPT=9882 SEQ=853935234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C016200000000001030307) 
Nov 23 09:12:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42909 DF PROTO=TCP SPT=38660 DPT=9102 SEQ=3011424843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C022200000000001030307) 
Nov 23 09:12:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39488 DF PROTO=TCP SPT=48560 DPT=9102 SEQ=3937336017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C037210000000001030307) 
Nov 23 09:12:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53598 DF PROTO=TCP SPT=40562 DPT=9882 SEQ=853935234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C046200000000001030307) 
Nov 23 09:12:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55379 DF PROTO=TCP SPT=49582 DPT=9101 SEQ=171878102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0500A0000000001030307) 
Nov 23 09:12:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1152 DF PROTO=TCP SPT=41672 DPT=9105 SEQ=2488236121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C050BB0000000001030307) 
Nov 23 09:13:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55381 DF PROTO=TCP SPT=49582 DPT=9101 SEQ=171878102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C05C210000000001030307) 
Nov 23 09:13:05 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64621 DF PROTO=TCP SPT=55576 DPT=9100 SEQ=2967965951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C068200000000001030307) 
Nov 23 09:13:08 np0005532585.localdomain sudo[127040]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50431 DF PROTO=TCP SPT=53328 DPT=9100 SEQ=172578527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C074E10000000001030307) 
Nov 23 09:13:09 np0005532585.localdomain sudo[127199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:13:09 np0005532585.localdomain sudo[127199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:13:09 np0005532585.localdomain sudo[127199]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:09 np0005532585.localdomain sudo[127229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzrwgboajogyklgdcztzysddohfddqdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889188.698767-699-130385958307634/AnsiballZ_file.py
Nov 23 09:13:09 np0005532585.localdomain sudo[127229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:09 np0005532585.localdomain sudo[127232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:13:09 np0005532585.localdomain sudo[127232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:13:10 np0005532585.localdomain python3.9[127234]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:13:10 np0005532585.localdomain sudo[127229]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:10 np0005532585.localdomain sudo[127382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipgtndkgtitavdrxqvzjqnkrvfwvnbhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889190.1948092-723-266870941394687/AnsiballZ_stat.py
Nov 23 09:13:10 np0005532585.localdomain sudo[127382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:10 np0005532585.localdomain sudo[127232]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:10 np0005532585.localdomain sudo[127385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:13:10 np0005532585.localdomain python3.9[127384]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:13:10 np0005532585.localdomain sudo[127385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:13:10 np0005532585.localdomain sudo[127385]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:10 np0005532585.localdomain sudo[127382]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:10 np0005532585.localdomain sudo[127400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 09:13:10 np0005532585.localdomain sudo[127400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:13:11 np0005532585.localdomain sudo[127485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkeuofnxlmdzrwlskgfeajsvufzfejof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889190.1948092-723-266870941394687/AnsiballZ_copy.py
Nov 23 09:13:11 np0005532585.localdomain sudo[127485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:11 np0005532585.localdomain python3.9[127487]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763889190.1948092-723-266870941394687/.source.json _original_basename=.a1mc2uzn follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:13:11 np0005532585.localdomain sudo[127485]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 2025-11-23 09:13:11.368458539 +0000 UTC m=+0.057860286 container create 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True)
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: Started libpod-conmon-1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4.scope.
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 2025-11-23 09:13:11.436078307 +0000 UTC m=+0.125480074 container init 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12)
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 2025-11-23 09:13:11.343137714 +0000 UTC m=+0.032539471 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: tmp-crun.OYtmwB.mount: Deactivated successfully.
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 2025-11-23 09:13:11.448276276 +0000 UTC m=+0.137678043 container start 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, RELEASE=main)
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 2025-11-23 09:13:11.448559114 +0000 UTC m=+0.137960911 container attach 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, io.openshift.expose-services=, ceph=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Nov 23 09:13:11 np0005532585.localdomain compassionate_hypatia[127559]: 167 167
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: libpod-1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4.scope: Deactivated successfully.
Nov 23 09:13:11 np0005532585.localdomain podman[127542]: 2025-11-23 09:13:11.452546127 +0000 UTC m=+0.141947874 container died 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:13:11 np0005532585.localdomain podman[127564]: 2025-11-23 09:13:11.532392735 +0000 UTC m=+0.073761520 container remove 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, release=553, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: libpod-conmon-1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4.scope: Deactivated successfully.
Nov 23 09:13:11 np0005532585.localdomain podman[127584]: 
Nov 23 09:13:11 np0005532585.localdomain podman[127584]: 2025-11-23 09:13:11.746536587 +0000 UTC m=+0.084376948 container create c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: Started libpod-conmon-c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398.scope.
Nov 23 09:13:11 np0005532585.localdomain podman[127584]: 2025-11-23 09:13:11.713181652 +0000 UTC m=+0.051022043 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:13:11 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:13:11 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 09:13:11 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:13:11 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 09:13:11 np0005532585.localdomain podman[127584]: 2025-11-23 09:13:11.829320025 +0000 UTC m=+0.167160386 container init c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Nov 23 09:13:11 np0005532585.localdomain podman[127584]: 2025-11-23 09:13:11.83915268 +0000 UTC m=+0.176993041 container start c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., RELEASE=main)
Nov 23 09:13:11 np0005532585.localdomain podman[127584]: 2025-11-23 09:13:11.83948241 +0000 UTC m=+0.177322811 container attach c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:13:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11381 DF PROTO=TCP SPT=44754 DPT=9882 SEQ=3024815876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C07F370000000001030307) 
Nov 23 09:13:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-65e17686d33e0674e72e06c177bb6dbf46f78a6084367725a7d3bbdbf9108ce5-merged.mount: Deactivated successfully.
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]: [
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:     {
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "available": false,
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "ceph_device": false,
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "lsm_data": {},
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "lvs": [],
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "path": "/dev/sr0",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "rejected_reasons": [
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "Has a FileSystem",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "Insufficient space (<5GB)"
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         ],
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         "sys_api": {
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "actuators": null,
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "device_nodes": "sr0",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "human_readable_size": "482.00 KB",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "id_bus": "ata",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "model": "QEMU DVD-ROM",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "nr_requests": "2",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "partitions": {},
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "path": "/dev/sr0",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "removable": "1",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "rev": "2.5+",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "ro": "0",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "rotational": "1",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "sas_address": "",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "sas_device_handle": "",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "scheduler_mode": "mq-deadline",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "sectors": 0,
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "sectorsize": "2048",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "size": 493568.0,
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "support_discard": "0",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "type": "disk",
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:             "vendor": "QEMU"
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:         }
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]:     }
Nov 23 09:13:12 np0005532585.localdomain gallant_kowalevski[127612]: ]
Nov 23 09:13:12 np0005532585.localdomain systemd[1]: libpod-c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398.scope: Deactivated successfully.
Nov 23 09:13:12 np0005532585.localdomain podman[127584]: 2025-11-23 09:13:12.793516624 +0000 UTC m=+1.131357015 container died c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Nov 23 09:13:12 np0005532585.localdomain systemd[1]: tmp-crun.dP8qce.mount: Deactivated successfully.
Nov 23 09:13:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9-merged.mount: Deactivated successfully.
Nov 23 09:13:12 np0005532585.localdomain podman[129211]: 2025-11-23 09:13:12.875828387 +0000 UTC m=+0.074073768 container remove c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55)
Nov 23 09:13:12 np0005532585.localdomain systemd[1]: libpod-conmon-c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398.scope: Deactivated successfully.
Nov 23 09:13:12 np0005532585.localdomain sudo[127400]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:13 np0005532585.localdomain sudo[129276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuxirpdiiurhisjukparwujevsaldetk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889191.7954235-777-278624272502084/AnsiballZ_podman_image.py
Nov 23 09:13:13 np0005532585.localdomain sudo[129276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:13 np0005532585.localdomain python3.9[129278]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 09:13:13 np0005532585.localdomain sudo[129290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:13:13 np0005532585.localdomain sudo[129290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:13:13 np0005532585.localdomain sudo[129290]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11383 DF PROTO=TCP SPT=44754 DPT=9882 SEQ=3024815876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C08B200000000001030307) 
Nov 23 09:13:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39490 DF PROTO=TCP SPT=48560 DPT=9102 SEQ=3937336017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C098210000000001030307) 
Nov 23 09:13:19 np0005532585.localdomain podman[129296]: 2025-11-23 09:13:13.555563753 +0000 UTC m=+0.031111626 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 09:13:19 np0005532585.localdomain sudo[129276]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:20 np0005532585.localdomain sudo[129505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-porhohmabhmymdkoccmftfulzkedoazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889200.475837-810-236244778868082/AnsiballZ_podman_image.py
Nov 23 09:13:20 np0005532585.localdomain sudo[129505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:20 np0005532585.localdomain python3.9[129507]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 09:13:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10025 DF PROTO=TCP SPT=47812 DPT=9102 SEQ=1161413978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0AC610000000001030307) 
Nov 23 09:13:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11385 DF PROTO=TCP SPT=44754 DPT=9882 SEQ=3024815876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0BC200000000001030307) 
Nov 23 09:13:28 np0005532585.localdomain podman[129520]: 2025-11-23 09:13:21.083317802 +0000 UTC m=+0.033165160 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:13:28 np0005532585.localdomain sudo[129505]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27869 DF PROTO=TCP SPT=51206 DPT=9101 SEQ=1671673324 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0C53A0000000001030307) 
Nov 23 09:13:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29626 DF PROTO=TCP SPT=43566 DPT=9105 SEQ=1467313874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0C5EB0000000001030307) 
Nov 23 09:13:32 np0005532585.localdomain sudo[129718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdiddgqeldqorcjzefotxiivdnhepevs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889212.5231488-846-168399472048602/AnsiballZ_podman_image.py
Nov 23 09:13:32 np0005532585.localdomain sudo[129718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27871 DF PROTO=TCP SPT=51206 DPT=9101 SEQ=1671673324 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0D1610000000001030307) 
Nov 23 09:13:33 np0005532585.localdomain python3.9[129720]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 09:13:34 np0005532585.localdomain podman[129733]: 2025-11-23 09:13:33.164923116 +0000 UTC m=+0.037336220 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 23 09:13:34 np0005532585.localdomain sudo[129718]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:35 np0005532585.localdomain sudo[129893]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrbmtaqdtehrlfjcsvoaxqujkrwjrelr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889215.359881-873-141905162565782/AnsiballZ_podman_image.py
Nov 23 09:13:35 np0005532585.localdomain sudo[129893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:35 np0005532585.localdomain python3.9[129895]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 09:13:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57769 DF PROTO=TCP SPT=43322 DPT=9100 SEQ=2162066529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0DE200000000001030307) 
Nov 23 09:13:37 np0005532585.localdomain podman[129908]: 2025-11-23 09:13:35.958514722 +0000 UTC m=+0.043345236 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:13:37 np0005532585.localdomain sudo[129893]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:38 np0005532585.localdomain sudo[130071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngygsjkhfuuhinxzisqysiegzqqwybik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889217.7639134-900-144419987378675/AnsiballZ_podman_image.py
Nov 23 09:13:38 np0005532585.localdomain sudo[130071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:38 np0005532585.localdomain python3.9[130073]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 09:13:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65212 DF PROTO=TCP SPT=49614 DPT=9100 SEQ=143119108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0EA200000000001030307) 
Nov 23 09:13:41 np0005532585.localdomain podman[130086]: 2025-11-23 09:13:38.366794256 +0000 UTC m=+0.044463220 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 23 09:13:41 np0005532585.localdomain sudo[130071]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:42 np0005532585.localdomain sudo[130261]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybbnkbzmmzthjbzlcnlqcxirpwjjuyja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889222.0205898-900-109856048936224/AnsiballZ_podman_image.py
Nov 23 09:13:42 np0005532585.localdomain sudo[130261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:42 np0005532585.localdomain python3.9[130263]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 09:13:42 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48386 DF PROTO=TCP SPT=56700 DPT=9882 SEQ=1000996218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0F8610000000001030307) 
Nov 23 09:13:44 np0005532585.localdomain podman[130276]: 2025-11-23 09:13:42.617797371 +0000 UTC m=+0.046799713 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 23 09:13:44 np0005532585.localdomain sudo[130261]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48387 DF PROTO=TCP SPT=56700 DPT=9882 SEQ=1000996218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C100600000000001030307) 
Nov 23 09:13:45 np0005532585.localdomain sshd[124659]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:13:45 np0005532585.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Nov 23 09:13:45 np0005532585.localdomain systemd[1]: session-39.scope: Consumed 1min 27.874s CPU time.
Nov 23 09:13:45 np0005532585.localdomain systemd-logind[761]: Session 39 logged out. Waiting for processes to exit.
Nov 23 09:13:45 np0005532585.localdomain systemd-logind[761]: Removed session 39.
Nov 23 09:13:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10027 DF PROTO=TCP SPT=47812 DPT=9102 SEQ=1161413978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C10C200000000001030307) 
Nov 23 09:13:51 np0005532585.localdomain sshd[130568]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:13:51 np0005532585.localdomain sshd[130568]: Accepted publickey for zuul from 192.168.122.31 port 46590 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:13:51 np0005532585.localdomain systemd-logind[761]: New session 40 of user zuul.
Nov 23 09:13:51 np0005532585.localdomain systemd[1]: Started Session 40 of User zuul.
Nov 23 09:13:51 np0005532585.localdomain sshd[130568]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:13:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30671 DF PROTO=TCP SPT=52646 DPT=9102 SEQ=2360694826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C121600000000001030307) 
Nov 23 09:13:54 np0005532585.localdomain python3.9[130719]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:13:55 np0005532585.localdomain sudo[130813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-febpncltvdadqeklxhnfymracuantaps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889235.5254369-69-260962443273239/AnsiballZ_getent.py
Nov 23 09:13:55 np0005532585.localdomain sudo[130813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:56 np0005532585.localdomain python3.9[130815]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 23 09:13:56 np0005532585.localdomain sudo[130813]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:56 np0005532585.localdomain sudo[130906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiksvrjbsfycwfzofxajekiywtiaqrld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889236.668444-105-167040817380934/AnsiballZ_setup.py
Nov 23 09:13:56 np0005532585.localdomain sudo[130906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48389 DF PROTO=TCP SPT=56700 DPT=9882 SEQ=1000996218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C130200000000001030307) 
Nov 23 09:13:57 np0005532585.localdomain python3.9[130908]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:13:57 np0005532585.localdomain sudo[130906]: pam_unix(sudo:session): session closed for user root
Nov 23 09:13:57 np0005532585.localdomain sudo[130960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfbmszxmljmbnzmalznrvnwgnifyksqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889236.668444-105-167040817380934/AnsiballZ_dnf.py
Nov 23 09:13:58 np0005532585.localdomain sudo[130960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:13:58 np0005532585.localdomain python3.9[130962]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:13:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2348 DF PROTO=TCP SPT=56584 DPT=9101 SEQ=3911532318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C13A6A0000000001030307) 
Nov 23 09:13:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46040 DF PROTO=TCP SPT=42082 DPT=9105 SEQ=719711906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C13B1B0000000001030307) 
Nov 23 09:14:01 np0005532585.localdomain sudo[130960]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46042 DF PROTO=TCP SPT=42082 DPT=9105 SEQ=719711906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C147200000000001030307) 
Nov 23 09:14:03 np0005532585.localdomain sudo[131054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmzxgtyatkhogharqpwxbwcasmqugeoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889243.1995835-147-186029152697904/AnsiballZ_dnf.py
Nov 23 09:14:03 np0005532585.localdomain sudo[131054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:03 np0005532585.localdomain python3.9[131056]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:14:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50434 DF PROTO=TCP SPT=53328 DPT=9100 SEQ=172578527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C154210000000001030307) 
Nov 23 09:14:06 np0005532585.localdomain sudo[131054]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:07 np0005532585.localdomain sudo[131148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rghpbwtbnlieewxxffbxxhcirzymnhlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889247.026427-171-180995203577239/AnsiballZ_systemd.py
Nov 23 09:14:07 np0005532585.localdomain sudo[131148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:07 np0005532585.localdomain python3.9[131150]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:14:08 np0005532585.localdomain sudo[131148]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14709 DF PROTO=TCP SPT=53830 DPT=9100 SEQ=3683994720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C15F600000000001030307) 
Nov 23 09:14:09 np0005532585.localdomain python3.9[131243]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:14:11 np0005532585.localdomain sudo[131333]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bguzrafszbbgetfqyerfjivtgsygpsdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889251.1119368-225-157157388918807/AnsiballZ_sefcontext.py
Nov 23 09:14:11 np0005532585.localdomain sudo[131333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:11 np0005532585.localdomain python3.9[131335]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  Converting 2755 SID table entries...
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:14:12 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:14:12 np0005532585.localdomain sudo[131333]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53861 DF PROTO=TCP SPT=40180 DPT=9882 SEQ=2067870612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C16DA10000000001030307) 
Nov 23 09:14:13 np0005532585.localdomain sudo[131355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:14:13 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Nov 23 09:14:13 np0005532585.localdomain sudo[131355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:14:13 np0005532585.localdomain sudo[131355]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:13 np0005532585.localdomain sudo[131370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:14:13 np0005532585.localdomain sudo[131370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:14:14 np0005532585.localdomain sudo[131370]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:14 np0005532585.localdomain python3.9[131491]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:14:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53862 DF PROTO=TCP SPT=40180 DPT=9882 SEQ=2067870612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C175A00000000001030307) 
Nov 23 09:14:15 np0005532585.localdomain sudo[131587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrpxajapsdwoeiokysxajrutmodzsvba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889255.258358-279-71218902940416/AnsiballZ_dnf.py
Nov 23 09:14:15 np0005532585.localdomain sudo[131587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:15 np0005532585.localdomain python3.9[131589]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:14:15 np0005532585.localdomain sudo[131590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:14:15 np0005532585.localdomain sudo[131590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:14:15 np0005532585.localdomain sudo[131590]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30673 DF PROTO=TCP SPT=52646 DPT=9102 SEQ=2360694826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C182210000000001030307) 
Nov 23 09:14:18 np0005532585.localdomain sudo[131587]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:20 np0005532585.localdomain sudo[131696]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtnimalfnaqkkyfbssfyhiinzohgguol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889259.5008764-303-15296405898854/AnsiballZ_command.py
Nov 23 09:14:20 np0005532585.localdomain sudo[131696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:20 np0005532585.localdomain python3.9[131698]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:14:21 np0005532585.localdomain sudo[131696]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:21 np0005532585.localdomain sudo[131941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuhaqnpilvzkvsbiqkiwnadshazdnhmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889261.4852395-327-181351032064044/AnsiballZ_file.py
Nov 23 09:14:21 np0005532585.localdomain sudo[131941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:22 np0005532585.localdomain python3.9[131943]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 09:14:22 np0005532585.localdomain sudo[131941]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:22 np0005532585.localdomain python3.9[132033]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:14:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58547 DF PROTO=TCP SPT=36646 DPT=9102 SEQ=2258823922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C196A00000000001030307) 
Nov 23 09:14:23 np0005532585.localdomain sudo[132125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwmnbkhzwlgreimnntofuxchkysdwlky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889263.2310016-381-232498886079943/AnsiballZ_dnf.py
Nov 23 09:14:23 np0005532585.localdomain sudo[132125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:23 np0005532585.localdomain python3.9[132127]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:14:26 np0005532585.localdomain sudo[132125]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:27 np0005532585.localdomain sudo[132219]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzkniooqmiknzpaygdetyoshpldmczrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889267.0703511-405-184283296438697/AnsiballZ_dnf.py
Nov 23 09:14:27 np0005532585.localdomain sudo[132219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53864 DF PROTO=TCP SPT=40180 DPT=9882 SEQ=2067870612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1A6200000000001030307) 
Nov 23 09:14:27 np0005532585.localdomain python3.9[132221]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:14:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29136 DF PROTO=TCP SPT=56520 DPT=9101 SEQ=3907057434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1AF9A0000000001030307) 
Nov 23 09:14:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13123 DF PROTO=TCP SPT=36824 DPT=9105 SEQ=1887706419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1B04B0000000001030307) 
Nov 23 09:14:30 np0005532585.localdomain sudo[132219]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:31 np0005532585.localdomain sudo[132313]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xodjrxlvveegypaxrjvqtfeudkoqcbls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889270.891495-429-246844509064996/AnsiballZ_systemd.py
Nov 23 09:14:31 np0005532585.localdomain sudo[132313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:31 np0005532585.localdomain python3.9[132315]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 09:14:31 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:14:31 np0005532585.localdomain systemd-sysv-generator[132350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:14:31 np0005532585.localdomain systemd-rc-local-generator[132346]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:14:31 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:14:31 np0005532585.localdomain sudo[132313]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29138 DF PROTO=TCP SPT=56520 DPT=9101 SEQ=3907057434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1BBA00000000001030307) 
Nov 23 09:14:33 np0005532585.localdomain sudo[132445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nooknmskahjxeuaxreslbszqomzqnlgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889273.23047-459-153613568777665/AnsiballZ_stat.py
Nov 23 09:14:33 np0005532585.localdomain sudo[132445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:33 np0005532585.localdomain python3.9[132447]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:14:33 np0005532585.localdomain sudo[132445]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:34 np0005532585.localdomain sudo[132537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkzjzryyivuemvsrjjbvcvshyjsoqqat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889273.9876735-486-59011782268524/AnsiballZ_ini_file.py
Nov 23 09:14:34 np0005532585.localdomain sudo[132537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:34 np0005532585.localdomain python3.9[132539]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:34 np0005532585.localdomain sudo[132537]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:35 np0005532585.localdomain sudo[132631]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqwuhzhlpccqoierzelrmojdsebdcjfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889274.8313546-510-195843534378458/AnsiballZ_ini_file.py
Nov 23 09:14:35 np0005532585.localdomain sudo[132631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:35 np0005532585.localdomain python3.9[132633]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:35 np0005532585.localdomain sudo[132631]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:35 np0005532585.localdomain sudo[132723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktudersvtyovzdyhpjudvmnwvxgjctgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889275.5247073-534-106171227311610/AnsiballZ_ini_file.py
Nov 23 09:14:35 np0005532585.localdomain sudo[132723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:35 np0005532585.localdomain python3.9[132725]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:35 np0005532585.localdomain sudo[132723]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65215 DF PROTO=TCP SPT=49614 DPT=9100 SEQ=143119108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1C8210000000001030307) 
Nov 23 09:14:36 np0005532585.localdomain sudo[132815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmjonkqlhxbkzocwexnfdofflgrodkfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889276.4766986-564-224444163799672/AnsiballZ_stat.py
Nov 23 09:14:36 np0005532585.localdomain sudo[132815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:36 np0005532585.localdomain python3.9[132817]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:14:36 np0005532585.localdomain sudo[132815]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:37 np0005532585.localdomain sudo[132888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilrbxttyalqdwdzsshzphxcertiblyzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889276.4766986-564-224444163799672/AnsiballZ_copy.py
Nov 23 09:14:37 np0005532585.localdomain sudo[132888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:37 np0005532585.localdomain python3.9[132890]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889276.4766986-564-224444163799672/.source _original_basename=.uz1rk4or follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:37 np0005532585.localdomain sudo[132888]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:38 np0005532585.localdomain sudo[132980]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbnggotuqklmbrmumrnnytpagetjyyhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889278.0793493-609-18649053289309/AnsiballZ_file.py
Nov 23 09:14:38 np0005532585.localdomain sudo[132980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:38 np0005532585.localdomain python3.9[132982]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:38 np0005532585.localdomain sudo[132980]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:39 np0005532585.localdomain sudo[133072]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xljnqjpzfnehjfeirotgaliewnbbzshu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889278.799511-633-91606804447108/AnsiballZ_edpm_os_net_config_mappings.py
Nov 23 09:14:39 np0005532585.localdomain sudo[133072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7792 DF PROTO=TCP SPT=60806 DPT=9100 SEQ=202949221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1D4A00000000001030307) 
Nov 23 09:14:39 np0005532585.localdomain python3.9[133074]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 23 09:14:39 np0005532585.localdomain sudo[133072]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:39 np0005532585.localdomain sudo[133164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnjqdvlxnnkbsuzltpuvcriuwpjjwuet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889279.7102458-660-238362664270086/AnsiballZ_file.py
Nov 23 09:14:39 np0005532585.localdomain sudo[133164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:40 np0005532585.localdomain python3.9[133166]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:40 np0005532585.localdomain sudo[133164]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:40 np0005532585.localdomain sudo[133256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hydnjlcbhxelvwkqcredkesydfgufofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889280.6226838-690-50264484314571/AnsiballZ_stat.py
Nov 23 09:14:40 np0005532585.localdomain sudo[133256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:41 np0005532585.localdomain python3.9[133258]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:14:41 np0005532585.localdomain sudo[133256]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:41 np0005532585.localdomain sudo[133329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyfmcbbnwkmahkgatdzeaekqqezypnqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889280.6226838-690-50264484314571/AnsiballZ_copy.py
Nov 23 09:14:41 np0005532585.localdomain sudo[133329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:41 np0005532585.localdomain python3.9[133331]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889280.6226838-690-50264484314571/.source.yaml _original_basename=.2210qgsh follow=False checksum=06d744ebe702728c19f6d1a8f97158d086012058 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:41 np0005532585.localdomain sudo[133329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:41 np0005532585.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=208.81.1.244 DST=38.102.83.198 LEN=308 TOS=0x08 PREC=0x20 TTL=54 ID=65320 DF PROTO=TCP SPT=443 DPT=52082 SEQ=3804090244 ACK=3369547015 WINDOW=131 RES=0x00 ACK URGP=0 OPT (0101080AA82F48A0D9E26288) 
Nov 23 09:14:42 np0005532585.localdomain sudo[133421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azenlpvqmcuayuzrxlnwnuwxiwbcbpbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889281.9143753-735-217471820555143/AnsiballZ_slurp.py
Nov 23 09:14:42 np0005532585.localdomain sudo[133421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:42 np0005532585.localdomain python3.9[133423]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 23 09:14:42 np0005532585.localdomain sudo[133421]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:43 np0005532585.localdomain sudo[133526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hipzfmdprfgoadxlexgfhlyjreyeessa ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889282.8782218-762-238711728402442/async_wrapper.py j676011251751 300 /home/zuul/.ansible/tmp/ansible-tmp-1763889282.8782218-762-238711728402442/AnsiballZ_edpm_os_net_config.py _
Nov 23 09:14:43 np0005532585.localdomain sudo[133526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:43 np0005532585.localdomain ansible-async_wrapper.py[133528]: Invoked with j676011251751 300 /home/zuul/.ansible/tmp/ansible-tmp-1763889282.8782218-762-238711728402442/AnsiballZ_edpm_os_net_config.py _
Nov 23 09:14:43 np0005532585.localdomain ansible-async_wrapper.py[133531]: Starting module and watcher
Nov 23 09:14:43 np0005532585.localdomain ansible-async_wrapper.py[133531]: Start watching 133532 (300)
Nov 23 09:14:43 np0005532585.localdomain ansible-async_wrapper.py[133532]: Start module (133532)
Nov 23 09:14:43 np0005532585.localdomain ansible-async_wrapper.py[133528]: Return async_wrapper task started.
Nov 23 09:14:43 np0005532585.localdomain sudo[133526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:43 np0005532585.localdomain python3.9[133533]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Nov 23 09:14:44 np0005532585.localdomain ansible-async_wrapper.py[133532]: Module complete (133532)
Nov 23 09:14:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25870 DF PROTO=TCP SPT=59884 DPT=9882 SEQ=81733405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1EAE00000000001030307) 
Nov 23 09:14:47 np0005532585.localdomain sudo[133623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rplococbbwtinbglrsokuqxolpxzsust ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889286.8463733-762-117668491907031/AnsiballZ_async_status.py
Nov 23 09:14:47 np0005532585.localdomain sudo[133623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:47 np0005532585.localdomain python3.9[133625]: ansible-ansible.legacy.async_status Invoked with jid=j676011251751.133528 mode=status _async_dir=/root/.ansible_async
Nov 23 09:14:47 np0005532585.localdomain sudo[133623]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:47 np0005532585.localdomain sudo[133682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgarqstpylzamlircvfntvpcivjsnkzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889286.8463733-762-117668491907031/AnsiballZ_async_status.py
Nov 23 09:14:47 np0005532585.localdomain sudo[133682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:47 np0005532585.localdomain python3.9[133684]: ansible-ansible.legacy.async_status Invoked with jid=j676011251751.133528 mode=cleanup _async_dir=/root/.ansible_async
Nov 23 09:14:47 np0005532585.localdomain sudo[133682]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:48 np0005532585.localdomain sudo[133774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tufayilxpakpzwcvahicvytyggjimclm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889288.1842752-828-485485887441/AnsiballZ_stat.py
Nov 23 09:14:48 np0005532585.localdomain sudo[133774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:48 np0005532585.localdomain python3.9[133776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:14:48 np0005532585.localdomain sudo[133774]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:48 np0005532585.localdomain ansible-async_wrapper.py[133531]: Done in kid B.
Nov 23 09:14:48 np0005532585.localdomain sudo[133847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svnfuuotwkdazihxzoceizbedabidbcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889288.1842752-828-485485887441/AnsiballZ_copy.py
Nov 23 09:14:48 np0005532585.localdomain sudo[133847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25871 DF PROTO=TCP SPT=59884 DPT=9882 SEQ=81733405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1FAA10000000001030307) 
Nov 23 09:14:49 np0005532585.localdomain python3.9[133849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889288.1842752-828-485485887441/.source.returncode _original_basename=.dsus64tb follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:49 np0005532585.localdomain sudo[133847]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:49 np0005532585.localdomain sudo[133939]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxdrbelgkqpayqtirasnqbrzlqslmher ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889289.448784-876-242957066281776/AnsiballZ_stat.py
Nov 23 09:14:49 np0005532585.localdomain sudo[133939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:49 np0005532585.localdomain python3.9[133941]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:14:49 np0005532585.localdomain sudo[133939]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:50 np0005532585.localdomain sudo[134012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdoawsxdxykrykcthjcnszcsmfjariwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889289.448784-876-242957066281776/AnsiballZ_copy.py
Nov 23 09:14:50 np0005532585.localdomain sudo[134012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:50 np0005532585.localdomain python3.9[134014]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889289.448784-876-242957066281776/.source.cfg _original_basename=.t81fwlb4 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:14:50 np0005532585.localdomain sudo[134012]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:51 np0005532585.localdomain sudo[134104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocgdiwcwzeryjuymqvcwhvrvsbgvmmmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889290.748671-921-158170517282783/AnsiballZ_systemd.py
Nov 23 09:14:51 np0005532585.localdomain sudo[134104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:14:51 np0005532585.localdomain python3.9[134106]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:14:51 np0005532585.localdomain systemd[1]: Reloading Network Manager...
Nov 23 09:14:51 np0005532585.localdomain NetworkManager[5975]: <info>  [1763889291.4559] audit: op="reload" arg="0" pid=134110 uid=0 result="success"
Nov 23 09:14:51 np0005532585.localdomain NetworkManager[5975]: <info>  [1763889291.4571] config: signal: SIGHUP (no changes from disk)
Nov 23 09:14:51 np0005532585.localdomain systemd[1]: Reloaded Network Manager.
Nov 23 09:14:51 np0005532585.localdomain sudo[134104]: pam_unix(sudo:session): session closed for user root
Nov 23 09:14:51 np0005532585.localdomain sshd[130568]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:14:51 np0005532585.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Nov 23 09:14:51 np0005532585.localdomain systemd[1]: session-40.scope: Consumed 34.908s CPU time.
Nov 23 09:14:51 np0005532585.localdomain systemd-logind[761]: Session 40 logged out. Waiting for processes to exit.
Nov 23 09:14:51 np0005532585.localdomain systemd-logind[761]: Removed session 40.
Nov 23 09:14:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52279 DF PROTO=TCP SPT=57730 DPT=9102 SEQ=2741214403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C20BE10000000001030307) 
Nov 23 09:14:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25872 DF PROTO=TCP SPT=59884 DPT=9882 SEQ=81733405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C21A200000000001030307) 
Nov 23 09:14:57 np0005532585.localdomain sshd[134125]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:14:57 np0005532585.localdomain sshd[134125]: Accepted publickey for zuul from 192.168.122.31 port 44906 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:14:57 np0005532585.localdomain systemd-logind[761]: New session 41 of user zuul.
Nov 23 09:14:57 np0005532585.localdomain systemd[1]: Started Session 41 of User zuul.
Nov 23 09:14:57 np0005532585.localdomain sshd[134125]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:14:58 np0005532585.localdomain python3.9[134218]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:14:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12195 DF PROTO=TCP SPT=58778 DPT=9101 SEQ=2273225292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C224CA0000000001030307) 
Nov 23 09:14:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11887 DF PROTO=TCP SPT=58770 DPT=9105 SEQ=265828421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2257B0000000001030307) 
Nov 23 09:15:00 np0005532585.localdomain python3.9[134312]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:15:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12197 DF PROTO=TCP SPT=58778 DPT=9101 SEQ=2273225292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C230E00000000001030307) 
Nov 23 09:15:03 np0005532585.localdomain python3.9[134465]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:15:04 np0005532585.localdomain sshd[134125]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:15:04 np0005532585.localdomain systemd-logind[761]: Session 41 logged out. Waiting for processes to exit.
Nov 23 09:15:04 np0005532585.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Nov 23 09:15:04 np0005532585.localdomain systemd[1]: session-41.scope: Consumed 2.160s CPU time.
Nov 23 09:15:04 np0005532585.localdomain systemd-logind[761]: Removed session 41.
Nov 23 09:15:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14712 DF PROTO=TCP SPT=53830 DPT=9100 SEQ=3683994720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C23E200000000001030307) 
Nov 23 09:15:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37579 DF PROTO=TCP SPT=59010 DPT=9100 SEQ=3929823100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C249A00000000001030307) 
Nov 23 09:15:10 np0005532585.localdomain sshd[134481]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:15:10 np0005532585.localdomain sshd[134481]: Accepted publickey for zuul from 192.168.122.31 port 57462 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:15:10 np0005532585.localdomain systemd-logind[761]: New session 42 of user zuul.
Nov 23 09:15:10 np0005532585.localdomain systemd[1]: Started Session 42 of User zuul.
Nov 23 09:15:10 np0005532585.localdomain sshd[134481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:15:11 np0005532585.localdomain python3.9[134574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:15:11 np0005532585.localdomain sshd[134593]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:15:12 np0005532585.localdomain python3.9[134669]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:15:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56783 DF PROTO=TCP SPT=49390 DPT=9882 SEQ=2287916637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C257E00000000001030307) 
Nov 23 09:15:13 np0005532585.localdomain sudo[134763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqlnlyuwemtyqyputnbfenebcfafplju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889313.6981158-81-120384712951400/AnsiballZ_setup.py
Nov 23 09:15:13 np0005532585.localdomain sudo[134763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:14 np0005532585.localdomain python3.9[134765]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:15:14 np0005532585.localdomain sudo[134763]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56784 DF PROTO=TCP SPT=49390 DPT=9882 SEQ=2287916637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C25FE10000000001030307) 
Nov 23 09:15:14 np0005532585.localdomain sudo[134818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsnijvevjogsowqsbjjxodkjpikawkvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889313.6981158-81-120384712951400/AnsiballZ_dnf.py
Nov 23 09:15:14 np0005532585.localdomain sudo[134818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:15 np0005532585.localdomain python3.9[134820]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:15:15 np0005532585.localdomain sudo[134823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:15:15 np0005532585.localdomain sudo[134823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:15:15 np0005532585.localdomain sudo[134823]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:16 np0005532585.localdomain sshd[134593]: Invalid user unknown from 128.185.199.14 port 53412
Nov 23 09:15:16 np0005532585.localdomain sudo[134838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 09:15:16 np0005532585.localdomain sudo[134838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:15:16 np0005532585.localdomain sudo[134838]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:16 np0005532585.localdomain sudo[134875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:15:16 np0005532585.localdomain sudo[134875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:15:16 np0005532585.localdomain sudo[134875]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:16 np0005532585.localdomain sshd[134593]: Connection closed by invalid user unknown 128.185.199.14 port 53412 [preauth]
Nov 23 09:15:16 np0005532585.localdomain sudo[134890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:15:16 np0005532585.localdomain sudo[134890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:15:17 np0005532585.localdomain sudo[134890]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:17 np0005532585.localdomain sudo[134936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:15:17 np0005532585.localdomain sudo[134936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:15:17 np0005532585.localdomain sudo[134936]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52281 DF PROTO=TCP SPT=57730 DPT=9102 SEQ=2741214403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C26C200000000001030307) 
Nov 23 09:15:18 np0005532585.localdomain sudo[134818]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:19 np0005532585.localdomain sudo[135040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zorhrugrttlnzecrutxmsdffuqlavllz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889319.082985-117-263588988389965/AnsiballZ_setup.py
Nov 23 09:15:19 np0005532585.localdomain sudo[135040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:19 np0005532585.localdomain python3.9[135042]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:15:20 np0005532585.localdomain sudo[135040]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:21 np0005532585.localdomain sudo[135195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avdnechfbcvbfdysufmcvkiputescgmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889320.5862527-150-104251194518925/AnsiballZ_file.py
Nov 23 09:15:21 np0005532585.localdomain sudo[135195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:21 np0005532585.localdomain python3.9[135197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:15:21 np0005532585.localdomain sudo[135195]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:21 np0005532585.localdomain sudo[135287]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwypcxxnqzflhzvqcclxvblmanjzkfhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889321.460004-174-107543033109666/AnsiballZ_command.py
Nov 23 09:15:21 np0005532585.localdomain sudo[135287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:22 np0005532585.localdomain python3.9[135289]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:15:22 np0005532585.localdomain sudo[135287]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:22 np0005532585.localdomain sudo[135391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmrxbelsxphhqsifeygefgrsciyqbqkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889322.4160237-198-140657644976077/AnsiballZ_stat.py
Nov 23 09:15:22 np0005532585.localdomain sudo[135391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:23 np0005532585.localdomain python3.9[135393]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:15:23 np0005532585.localdomain sudo[135391]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:23 np0005532585.localdomain sudo[135439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynhobfczyqyocvyesbhjyzjvekpmdfqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889322.4160237-198-140657644976077/AnsiballZ_file.py
Nov 23 09:15:23 np0005532585.localdomain sudo[135439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58996 DF PROTO=TCP SPT=42950 DPT=9102 SEQ=1523901771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C281210000000001030307) 
Nov 23 09:15:23 np0005532585.localdomain python3.9[135441]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:15:23 np0005532585.localdomain sudo[135439]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:24 np0005532585.localdomain sudo[135531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzhkqjzviqlcoazcwekgdzdhetxyqsjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889323.7651565-234-203962745738425/AnsiballZ_stat.py
Nov 23 09:15:24 np0005532585.localdomain sudo[135531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:24 np0005532585.localdomain python3.9[135533]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:15:24 np0005532585.localdomain sudo[135531]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:24 np0005532585.localdomain sudo[135579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whjcqlnbjysyhdxywkcabjlkzntvrfjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889323.7651565-234-203962745738425/AnsiballZ_file.py
Nov 23 09:15:24 np0005532585.localdomain sudo[135579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:24 np0005532585.localdomain python3.9[135581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:15:24 np0005532585.localdomain sudo[135579]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:25 np0005532585.localdomain sudo[135671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkizjbpwbcifqjznrlfbxutohvpbheim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889324.9875093-273-208817037131305/AnsiballZ_ini_file.py
Nov 23 09:15:25 np0005532585.localdomain sudo[135671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:25 np0005532585.localdomain python3.9[135673]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:15:25 np0005532585.localdomain sudo[135671]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:26 np0005532585.localdomain sudo[135763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzdrurejplvkkeqnsqwxtsccwagozqjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889325.7641318-273-196222136962156/AnsiballZ_ini_file.py
Nov 23 09:15:26 np0005532585.localdomain sudo[135763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:26 np0005532585.localdomain python3.9[135765]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:15:26 np0005532585.localdomain sudo[135763]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:26 np0005532585.localdomain sudo[135855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcgzqozwehtujaaqmtqwytfkrntxbpwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889326.6797097-273-258364461445378/AnsiballZ_ini_file.py
Nov 23 09:15:26 np0005532585.localdomain sudo[135855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:27 np0005532585.localdomain python3.9[135857]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:15:27 np0005532585.localdomain sudo[135855]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56786 DF PROTO=TCP SPT=49390 DPT=9882 SEQ=2287916637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C290200000000001030307) 
Nov 23 09:15:27 np0005532585.localdomain sudo[135947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jafyilndoqhcjnkqhisnxfhzxslferyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889327.2683134-273-74167103880427/AnsiballZ_ini_file.py
Nov 23 09:15:27 np0005532585.localdomain sudo[135947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:27 np0005532585.localdomain python3.9[135949]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:15:27 np0005532585.localdomain sudo[135947]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:29 np0005532585.localdomain sudo[136039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnkjewtjyuhcmjyurriesdrhtxaiglzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889329.190122-366-195143040358801/AnsiballZ_dnf.py
Nov 23 09:15:29 np0005532585.localdomain sudo[136039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:29 np0005532585.localdomain python3.9[136041]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:15:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22859 DF PROTO=TCP SPT=37668 DPT=9101 SEQ=1362344812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C299FA0000000001030307) 
Nov 23 09:15:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23427 DF PROTO=TCP SPT=57372 DPT=9105 SEQ=3036593496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C29AAB0000000001030307) 
Nov 23 09:15:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22861 DF PROTO=TCP SPT=37668 DPT=9101 SEQ=1362344812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2A6200000000001030307) 
Nov 23 09:15:32 np0005532585.localdomain sudo[136039]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:34 np0005532585.localdomain sudo[136133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypcowafacpqdsktdvieojvvyajpuymis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889333.9603167-399-82638034081137/AnsiballZ_setup.py
Nov 23 09:15:34 np0005532585.localdomain sudo[136133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:34 np0005532585.localdomain python3.9[136135]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:15:34 np0005532585.localdomain sudo[136133]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:35 np0005532585.localdomain sudo[136227]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwiktopmdlzxazzmttwybewqccxssmrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889334.8285046-423-263607428859939/AnsiballZ_stat.py
Nov 23 09:15:35 np0005532585.localdomain sudo[136227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:35 np0005532585.localdomain python3.9[136229]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:15:35 np0005532585.localdomain sudo[136227]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:35 np0005532585.localdomain sudo[136319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sregfvtkvfyshcxcbqopztmhggalaypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889335.5860198-450-107661486480881/AnsiballZ_stat.py
Nov 23 09:15:35 np0005532585.localdomain sudo[136319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:35 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7795 DF PROTO=TCP SPT=60806 DPT=9100 SEQ=202949221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2B2200000000001030307) 
Nov 23 09:15:35 np0005532585.localdomain python3.9[136321]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:15:36 np0005532585.localdomain sudo[136319]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:36 np0005532585.localdomain auditd[727]: Audit daemon rotating log files
Nov 23 09:15:36 np0005532585.localdomain sudo[136411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybntfxnzqmgkyigoibohmbkmxhsbeipe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889336.4769876-480-141393664891681/AnsiballZ_command.py
Nov 23 09:15:36 np0005532585.localdomain sudo[136411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:36 np0005532585.localdomain python3.9[136413]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:15:36 np0005532585.localdomain sudo[136411]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65316 DF PROTO=TCP SPT=36684 DPT=9100 SEQ=1024540960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2BEE00000000001030307) 
Nov 23 09:15:39 np0005532585.localdomain sudo[136504]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbkarybliobpbgwhflqjaadjufrextai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889338.59536-510-184279882565159/AnsiballZ_service_facts.py
Nov 23 09:15:39 np0005532585.localdomain sudo[136504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:39 np0005532585.localdomain python3.9[136506]: ansible-service_facts Invoked
Nov 23 09:15:39 np0005532585.localdomain network[136523]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:15:39 np0005532585.localdomain network[136524]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:15:39 np0005532585.localdomain network[136525]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:15:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:15:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33168 DF PROTO=TCP SPT=45594 DPT=9882 SEQ=3886340126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2C9270000000001030307) 
Nov 23 09:15:42 np0005532585.localdomain sudo[136504]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33170 DF PROTO=TCP SPT=45594 DPT=9882 SEQ=3886340126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2D5200000000001030307) 
Nov 23 09:15:45 np0005532585.localdomain sshd[136649]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:15:45 np0005532585.localdomain sshd[136649]: Invalid user solv from 193.32.162.146 port 59064
Nov 23 09:15:45 np0005532585.localdomain sshd[136649]: Connection closed by invalid user solv 193.32.162.146 port 59064 [preauth]
Nov 23 09:15:45 np0005532585.localdomain sudo[136739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anvssdmnqcqhkrevtazsvdmzpqlcpfca ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1763889345.6860616-555-278939445343834/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1763889345.6860616-555-278939445343834/args
Nov 23 09:15:45 np0005532585.localdomain sudo[136739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:46 np0005532585.localdomain sudo[136739]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:46 np0005532585.localdomain sudo[136846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqdokjftajnogigzgdvsoxtbnpacaxta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889346.4312527-588-46017506960451/AnsiballZ_dnf.py
Nov 23 09:15:46 np0005532585.localdomain sudo[136846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:46 np0005532585.localdomain python3.9[136848]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:15:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58998 DF PROTO=TCP SPT=42950 DPT=9102 SEQ=1523901771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2E2200000000001030307) 
Nov 23 09:15:49 np0005532585.localdomain sudo[136846]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:51 np0005532585.localdomain sudo[136940]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwlxikcmnagnxctljubruibmcbcihfpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889350.862784-628-40641092977042/AnsiballZ_package_facts.py
Nov 23 09:15:51 np0005532585.localdomain sudo[136940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:51 np0005532585.localdomain python3.9[136942]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 23 09:15:52 np0005532585.localdomain sudo[136940]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43471 DF PROTO=TCP SPT=51720 DPT=9102 SEQ=2839129972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2F6200000000001030307) 
Nov 23 09:15:53 np0005532585.localdomain sudo[137032]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdlihxpcjrknivtadkkdayvyugggmtgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889353.2755616-657-115628340651080/AnsiballZ_stat.py
Nov 23 09:15:53 np0005532585.localdomain sudo[137032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:54 np0005532585.localdomain python3.9[137034]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:15:54 np0005532585.localdomain sudo[137032]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:55 np0005532585.localdomain sudo[137107]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvcndedgcwgtqgbashaiaswmqbxnhmvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889353.2755616-657-115628340651080/AnsiballZ_copy.py
Nov 23 09:15:55 np0005532585.localdomain sudo[137107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:55 np0005532585.localdomain python3.9[137109]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889353.2755616-657-115628340651080/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:15:55 np0005532585.localdomain sudo[137107]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:56 np0005532585.localdomain sudo[137201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udjeestxedubdjvklpzpwbupqibenjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889355.877056-703-12160223696076/AnsiballZ_stat.py
Nov 23 09:15:56 np0005532585.localdomain sudo[137201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:56 np0005532585.localdomain python3.9[137203]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:15:56 np0005532585.localdomain sudo[137201]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:56 np0005532585.localdomain sudo[137276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itrwdwbvuzkdfolzpyjeakvsnluxffkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889355.877056-703-12160223696076/AnsiballZ_copy.py
Nov 23 09:15:56 np0005532585.localdomain sudo[137276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:56 np0005532585.localdomain python3.9[137278]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889355.877056-703-12160223696076/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:15:56 np0005532585.localdomain sudo[137276]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33172 DF PROTO=TCP SPT=45594 DPT=9882 SEQ=3886340126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C306210000000001030307) 
Nov 23 09:15:58 np0005532585.localdomain sudo[137370]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikpnonnxiffuptzbxtpczzlyaenhgufa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889358.1815276-765-94175018070736/AnsiballZ_lineinfile.py
Nov 23 09:15:58 np0005532585.localdomain sudo[137370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:15:58 np0005532585.localdomain python3.9[137372]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:15:58 np0005532585.localdomain sudo[137370]: pam_unix(sudo:session): session closed for user root
Nov 23 09:15:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35671 DF PROTO=TCP SPT=44574 DPT=9101 SEQ=3990137902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C30F2A0000000001030307) 
Nov 23 09:15:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42861 DF PROTO=TCP SPT=58772 DPT=9105 SEQ=1394186128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C30FDE0000000001030307) 
Nov 23 09:16:00 np0005532585.localdomain sudo[137464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-addogpkhkeqbbkwhljoaapwwuivfwfhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889360.3675048-811-263669045535425/AnsiballZ_setup.py
Nov 23 09:16:00 np0005532585.localdomain sudo[137464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:00 np0005532585.localdomain python3.9[137466]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:16:01 np0005532585.localdomain sudo[137464]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:01 np0005532585.localdomain sudo[137518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amzysxvvzpiumqdoyairecewyffhvbtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889360.3675048-811-263669045535425/AnsiballZ_systemd.py
Nov 23 09:16:01 np0005532585.localdomain sudo[137518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:02 np0005532585.localdomain python3.9[137520]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:16:02 np0005532585.localdomain sudo[137518]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42863 DF PROTO=TCP SPT=58772 DPT=9105 SEQ=1394186128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C31BE00000000001030307) 
Nov 23 09:16:04 np0005532585.localdomain sudo[137612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nphvuzfqyjvxynphvzxlluzsaaqxlcke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889364.3345723-860-104060106974764/AnsiballZ_setup.py
Nov 23 09:16:04 np0005532585.localdomain sudo[137612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:04 np0005532585.localdomain python3.9[137614]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:16:05 np0005532585.localdomain sudo[137612]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:05 np0005532585.localdomain sudo[137666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beqaudlpebrekzibbnmtgqgdgtdufjmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889364.3345723-860-104060106974764/AnsiballZ_systemd.py
Nov 23 09:16:05 np0005532585.localdomain sudo[137666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:06 np0005532585.localdomain python3.9[137668]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:16:06 np0005532585.localdomain systemd[1]: Stopping NTP client/server...
Nov 23 09:16:06 np0005532585.localdomain chronyd[25967]: chronyd exiting
Nov 23 09:16:06 np0005532585.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 23 09:16:06 np0005532585.localdomain systemd[1]: Stopped NTP client/server.
Nov 23 09:16:06 np0005532585.localdomain systemd[1]: Starting NTP client/server...
Nov 23 09:16:06 np0005532585.localdomain chronyd[137676]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 09:16:06 np0005532585.localdomain chronyd[137676]: Frequency -30.625 +/- 0.724 ppm read from /var/lib/chrony/drift
Nov 23 09:16:06 np0005532585.localdomain chronyd[137676]: Loaded seccomp filter (level 2)
Nov 23 09:16:06 np0005532585.localdomain systemd[1]: Started NTP client/server.
Nov 23 09:16:06 np0005532585.localdomain sudo[137666]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37582 DF PROTO=TCP SPT=59010 DPT=9100 SEQ=3929823100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3283D0000000001030307) 
Nov 23 09:16:08 np0005532585.localdomain sshd[134481]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:16:08 np0005532585.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Nov 23 09:16:08 np0005532585.localdomain systemd[1]: session-42.scope: Consumed 27.628s CPU time.
Nov 23 09:16:08 np0005532585.localdomain systemd-logind[761]: Session 42 logged out. Waiting for processes to exit.
Nov 23 09:16:08 np0005532585.localdomain systemd-logind[761]: Removed session 42.
Nov 23 09:16:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40600 DF PROTO=TCP SPT=41034 DPT=9100 SEQ=3304842840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C334200000000001030307) 
Nov 23 09:16:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26166 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C33E590000000001030307) 
Nov 23 09:16:13 np0005532585.localdomain sshd[137692]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:16:14 np0005532585.localdomain sshd[137692]: Accepted publickey for zuul from 192.168.122.31 port 33712 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:16:14 np0005532585.localdomain systemd-logind[761]: New session 43 of user zuul.
Nov 23 09:16:14 np0005532585.localdomain systemd[1]: Started Session 43 of User zuul.
Nov 23 09:16:14 np0005532585.localdomain sshd[137692]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:16:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26168 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C34A610000000001030307) 
Nov 23 09:16:15 np0005532585.localdomain python3.9[137785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:16:16 np0005532585.localdomain sudo[137879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyhxgklvpmwwyytzamhgpokplzdomdya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889375.7623293-60-121010422737930/AnsiballZ_file.py
Nov 23 09:16:16 np0005532585.localdomain sudo[137879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:16 np0005532585.localdomain python3.9[137881]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:16 np0005532585.localdomain sudo[137879]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:17 np0005532585.localdomain sudo[137984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooicdrzdjlkpjzoryyjunwozfqouxvfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889376.8308532-84-507046703469/AnsiballZ_stat.py
Nov 23 09:16:17 np0005532585.localdomain sudo[137984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:17 np0005532585.localdomain python3.9[137986]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:17 np0005532585.localdomain sudo[137984]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43473 DF PROTO=TCP SPT=51720 DPT=9102 SEQ=2839129972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C356200000000001030307) 
Nov 23 09:16:18 np0005532585.localdomain sudo[138032]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovaviyqjregbabuhncfljjtezadwalnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889376.8308532-84-507046703469/AnsiballZ_file.py
Nov 23 09:16:18 np0005532585.localdomain sudo[138032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:18 np0005532585.localdomain sudo[138035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:16:18 np0005532585.localdomain sudo[138035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:16:18 np0005532585.localdomain sudo[138035]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:18 np0005532585.localdomain python3.9[138034]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.0gwpw0so recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:18 np0005532585.localdomain sudo[138050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:16:18 np0005532585.localdomain sudo[138050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:16:18 np0005532585.localdomain sudo[138032]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:18 np0005532585.localdomain sudo[138050]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:19 np0005532585.localdomain sudo[138111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:16:19 np0005532585.localdomain sudo[138111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:16:19 np0005532585.localdomain sudo[138111]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:19 np0005532585.localdomain sudo[138201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtwzovahivwodcgkvlcstsmbbeqvbntn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889379.5927913-144-101025634084363/AnsiballZ_stat.py
Nov 23 09:16:19 np0005532585.localdomain sudo[138201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:20 np0005532585.localdomain python3.9[138203]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:20 np0005532585.localdomain sudo[138201]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:20 np0005532585.localdomain sudo[138276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tatkugifvwmwdottvvwyibzzpsyalxcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889379.5927913-144-101025634084363/AnsiballZ_copy.py
Nov 23 09:16:20 np0005532585.localdomain sudo[138276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:20 np0005532585.localdomain python3.9[138278]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889379.5927913-144-101025634084363/.source _original_basename=.xsw4d97h follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:20 np0005532585.localdomain sudo[138276]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:21 np0005532585.localdomain sudo[138368]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnglrpflkkjbecncagqyqeecqebwshtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889380.9634268-192-112509191409772/AnsiballZ_file.py
Nov 23 09:16:21 np0005532585.localdomain sudo[138368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:21 np0005532585.localdomain python3.9[138370]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:16:21 np0005532585.localdomain sudo[138368]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:22 np0005532585.localdomain sudo[138460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyxkgxrdrukgpkanfqbdtchzhcyggmai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889382.4214268-216-117155126347584/AnsiballZ_stat.py
Nov 23 09:16:22 np0005532585.localdomain sudo[138460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:22 np0005532585.localdomain python3.9[138462]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:22 np0005532585.localdomain sudo[138460]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43698 DF PROTO=TCP SPT=43266 DPT=9102 SEQ=3297613898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C36B600000000001030307) 
Nov 23 09:16:23 np0005532585.localdomain sudo[138533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izdnmzfbooqtwkkfilvzmxndgacnkglp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889382.4214268-216-117155126347584/AnsiballZ_copy.py
Nov 23 09:16:23 np0005532585.localdomain sudo[138533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:23 np0005532585.localdomain python3.9[138535]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889382.4214268-216-117155126347584/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:16:23 np0005532585.localdomain sudo[138533]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:24 np0005532585.localdomain sudo[138625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drrhergqrktoxjmlhvakbieaavqeopvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889383.8112419-216-45691580572790/AnsiballZ_stat.py
Nov 23 09:16:24 np0005532585.localdomain sudo[138625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:24 np0005532585.localdomain python3.9[138627]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:24 np0005532585.localdomain sudo[138625]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:24 np0005532585.localdomain sudo[138698]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbvlxgybzfkwarfxdvlssdxmejscrmrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889383.8112419-216-45691580572790/AnsiballZ_copy.py
Nov 23 09:16:24 np0005532585.localdomain sudo[138698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:24 np0005532585.localdomain python3.9[138700]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889383.8112419-216-45691580572790/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:16:24 np0005532585.localdomain sudo[138698]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:25 np0005532585.localdomain sudo[138790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxawagylwxxswdjiswrczeylypchstxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889384.9769938-303-159554036140329/AnsiballZ_file.py
Nov 23 09:16:25 np0005532585.localdomain sudo[138790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:25 np0005532585.localdomain python3.9[138792]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:25 np0005532585.localdomain sudo[138790]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:25 np0005532585.localdomain sudo[138882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irhwzbywjsvylrvarwcrijislnmpovbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889385.57294-327-182839567029953/AnsiballZ_stat.py
Nov 23 09:16:25 np0005532585.localdomain sudo[138882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:26 np0005532585.localdomain python3.9[138884]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:26 np0005532585.localdomain sudo[138882]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:26 np0005532585.localdomain sudo[138955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzjfmghrdhsijaxbzsiyttorslqbyfbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889385.57294-327-182839567029953/AnsiballZ_copy.py
Nov 23 09:16:26 np0005532585.localdomain sudo[138955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:26 np0005532585.localdomain python3.9[138957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889385.57294-327-182839567029953/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:26 np0005532585.localdomain sudo[138955]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:27 np0005532585.localdomain sudo[139047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxhlwyqwusakinhgrbeceejxieuibkos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889386.8292208-372-183767049529610/AnsiballZ_stat.py
Nov 23 09:16:27 np0005532585.localdomain sudo[139047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26170 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C37A210000000001030307) 
Nov 23 09:16:27 np0005532585.localdomain python3.9[139049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:27 np0005532585.localdomain sudo[139047]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:27 np0005532585.localdomain sudo[139120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbrfghirecputtgyistaasgxbkmaejqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889386.8292208-372-183767049529610/AnsiballZ_copy.py
Nov 23 09:16:27 np0005532585.localdomain sudo[139120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:27 np0005532585.localdomain python3.9[139122]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889386.8292208-372-183767049529610/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:27 np0005532585.localdomain sudo[139120]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:28 np0005532585.localdomain sudo[139212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jezmmwmhrvunfmxxixzfirbpidbjomrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889388.0585268-417-11745189441831/AnsiballZ_systemd.py
Nov 23 09:16:28 np0005532585.localdomain sudo[139212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:29 np0005532585.localdomain python3.9[139214]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:16:29 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:16:29 np0005532585.localdomain systemd-sysv-generator[139244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:16:29 np0005532585.localdomain systemd-rc-local-generator[139236]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:16:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:16:29 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:16:29 np0005532585.localdomain systemd-sysv-generator[139279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:16:29 np0005532585.localdomain systemd-rc-local-generator[139274]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:16:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:16:29 np0005532585.localdomain systemd[1]: Starting EDPM Container Shutdown...
Nov 23 09:16:29 np0005532585.localdomain systemd[1]: Finished EDPM Container Shutdown.
Nov 23 09:16:29 np0005532585.localdomain sudo[139212]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19957 DF PROTO=TCP SPT=49976 DPT=9101 SEQ=942024125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3845A0000000001030307) 
Nov 23 09:16:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32280 DF PROTO=TCP SPT=52160 DPT=9105 SEQ=3065586781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3850B0000000001030307) 
Nov 23 09:16:31 np0005532585.localdomain sudo[139383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvynurvajszvmzfvrkwkqmvlfdmorzur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889390.9307582-441-198425369348546/AnsiballZ_stat.py
Nov 23 09:16:31 np0005532585.localdomain sudo[139383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:31 np0005532585.localdomain python3.9[139385]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:31 np0005532585.localdomain sudo[139383]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:31 np0005532585.localdomain sudo[139456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctwamkctqofyrtaivqmycuhkxgsyhmyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889390.9307582-441-198425369348546/AnsiballZ_copy.py
Nov 23 09:16:31 np0005532585.localdomain sudo[139456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:32 np0005532585.localdomain python3.9[139458]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889390.9307582-441-198425369348546/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:32 np0005532585.localdomain sudo[139456]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19959 DF PROTO=TCP SPT=49976 DPT=9101 SEQ=942024125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C390610000000001030307) 
Nov 23 09:16:33 np0005532585.localdomain sudo[139548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbcrurbuayimttxnwzxshhlplvwqjbsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889393.0224657-486-73911689883859/AnsiballZ_stat.py
Nov 23 09:16:33 np0005532585.localdomain sudo[139548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:33 np0005532585.localdomain python3.9[139550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:33 np0005532585.localdomain sudo[139548]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:33 np0005532585.localdomain sudo[139621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-howimnfgnuujynluyfbymctixfuapqgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889393.0224657-486-73911689883859/AnsiballZ_copy.py
Nov 23 09:16:33 np0005532585.localdomain sudo[139621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:34 np0005532585.localdomain python3.9[139623]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889393.0224657-486-73911689883859/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:34 np0005532585.localdomain sudo[139621]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:34 np0005532585.localdomain sudo[139713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yprtmxynvabvqawsoeaogrloxqyocfft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889394.2257242-531-187858640981244/AnsiballZ_systemd.py
Nov 23 09:16:34 np0005532585.localdomain sudo[139713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:34 np0005532585.localdomain python3.9[139715]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:16:34 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:16:34 np0005532585.localdomain systemd-sysv-generator[139742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:16:34 np0005532585.localdomain systemd-rc-local-generator[139739]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:16:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:16:35 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:16:35 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:16:35 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:16:35 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:16:35 np0005532585.localdomain sudo[139713]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65319 DF PROTO=TCP SPT=36684 DPT=9100 SEQ=1024540960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C39E210000000001030307) 
Nov 23 09:16:36 np0005532585.localdomain python3.9[139847]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:16:36 np0005532585.localdomain network[139864]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:16:36 np0005532585.localdomain network[139865]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:16:36 np0005532585.localdomain network[139866]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:16:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:16:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56760 DF PROTO=TCP SPT=46368 DPT=9100 SEQ=2594710531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3A9610000000001030307) 
Nov 23 09:16:41 np0005532585.localdomain sudo[140065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbsgynjtzgksruawknbqwuwjatndvdfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889401.3739276-609-48713027567405/AnsiballZ_stat.py
Nov 23 09:16:41 np0005532585.localdomain sudo[140065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=523 DF PROTO=TCP SPT=43022 DPT=9882 SEQ=1100943462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3B3860000000001030307) 
Nov 23 09:16:41 np0005532585.localdomain python3.9[140067]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:41 np0005532585.localdomain sudo[140065]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:42 np0005532585.localdomain sudo[140140]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldyzflfdluudryosikkbxfcxnihvdqhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889401.3739276-609-48713027567405/AnsiballZ_copy.py
Nov 23 09:16:42 np0005532585.localdomain sudo[140140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:42 np0005532585.localdomain python3.9[140142]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889401.3739276-609-48713027567405/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:43 np0005532585.localdomain sudo[140140]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:43 np0005532585.localdomain sudo[140233]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzycxkjqpfdjyntwvplufkugowtfbabn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889403.6575472-654-57620510385218/AnsiballZ_systemd.py
Nov 23 09:16:43 np0005532585.localdomain sudo[140233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:44 np0005532585.localdomain python3.9[140235]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:16:44 np0005532585.localdomain systemd[1]: Reloading OpenSSH server daemon...
Nov 23 09:16:44 np0005532585.localdomain sshd[120095]: Received SIGHUP; restarting.
Nov 23 09:16:44 np0005532585.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Nov 23 09:16:44 np0005532585.localdomain sshd[120095]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:16:44 np0005532585.localdomain sshd[120095]: Server listening on 0.0.0.0 port 22.
Nov 23 09:16:44 np0005532585.localdomain sshd[120095]: Server listening on :: port 22.
Nov 23 09:16:44 np0005532585.localdomain sudo[140233]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:44 np0005532585.localdomain sudo[140329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vppuzrgfwckfsyisqnryauacxsouqmqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889404.506295-678-154321004729934/AnsiballZ_file.py
Nov 23 09:16:44 np0005532585.localdomain sudo[140329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=525 DF PROTO=TCP SPT=43022 DPT=9882 SEQ=1100943462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3BFA00000000001030307) 
Nov 23 09:16:44 np0005532585.localdomain python3.9[140331]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:45 np0005532585.localdomain sudo[140329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:45 np0005532585.localdomain sudo[140421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsnivxgxgvkutkxwyyrbzjocaxtgtbvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889405.1718009-702-73984414733202/AnsiballZ_stat.py
Nov 23 09:16:45 np0005532585.localdomain sudo[140421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:45 np0005532585.localdomain python3.9[140423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:45 np0005532585.localdomain sudo[140421]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:46 np0005532585.localdomain sudo[140494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxwbqiyuaeqsljpsclpptocyvnjjfvvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889405.1718009-702-73984414733202/AnsiballZ_copy.py
Nov 23 09:16:46 np0005532585.localdomain sudo[140494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:46 np0005532585.localdomain python3.9[140496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889405.1718009-702-73984414733202/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:46 np0005532585.localdomain sudo[140494]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:47 np0005532585.localdomain sudo[140586]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hncjwrzcbalmvhwnkjljtnziljzfrcvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889407.0859656-756-212637714884214/AnsiballZ_timezone.py
Nov 23 09:16:47 np0005532585.localdomain sudo[140586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:47 np0005532585.localdomain python3.9[140588]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 23 09:16:47 np0005532585.localdomain systemd[1]: Starting Time & Date Service...
Nov 23 09:16:47 np0005532585.localdomain systemd[1]: Started Time & Date Service.
Nov 23 09:16:47 np0005532585.localdomain sudo[140586]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43700 DF PROTO=TCP SPT=43266 DPT=9102 SEQ=3297613898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3CC200000000001030307) 
Nov 23 09:16:48 np0005532585.localdomain sudo[140682]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbtkjhwjmrdlgdsnhwwetjxxqkbcuvtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889408.1436706-783-17230752803604/AnsiballZ_file.py
Nov 23 09:16:48 np0005532585.localdomain sudo[140682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:48 np0005532585.localdomain python3.9[140684]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:48 np0005532585.localdomain sudo[140682]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:49 np0005532585.localdomain sudo[140774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiulpwqsanzmubpujwumddoyabfveasd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889409.586898-807-91240555353654/AnsiballZ_stat.py
Nov 23 09:16:49 np0005532585.localdomain sudo[140774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:50 np0005532585.localdomain python3.9[140776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:50 np0005532585.localdomain sudo[140774]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:50 np0005532585.localdomain sudo[140847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htnzlofvyotxoqpzratuvaqxkabdbpgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889409.586898-807-91240555353654/AnsiballZ_copy.py
Nov 23 09:16:50 np0005532585.localdomain sudo[140847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:50 np0005532585.localdomain python3.9[140849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889409.586898-807-91240555353654/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:50 np0005532585.localdomain sudo[140847]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:51 np0005532585.localdomain sudo[140939]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keuwzruvcusjaalsghvvlgmnxsdgiynt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889410.9179804-852-35600893056117/AnsiballZ_stat.py
Nov 23 09:16:51 np0005532585.localdomain sudo[140939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:51 np0005532585.localdomain python3.9[140941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:51 np0005532585.localdomain sudo[140939]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:51 np0005532585.localdomain sudo[141012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwvldxnghnudskbwmmtubfweyypgqmug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889410.9179804-852-35600893056117/AnsiballZ_copy.py
Nov 23 09:16:51 np0005532585.localdomain sudo[141012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:51 np0005532585.localdomain python3.9[141014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889410.9179804-852-35600893056117/.source.yaml _original_basename=.z1a70904 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:51 np0005532585.localdomain sudo[141012]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:52 np0005532585.localdomain sudo[141104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgfwovxindxrrndlmluuogmpjrzqivsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889412.091772-898-81193597335166/AnsiballZ_stat.py
Nov 23 09:16:52 np0005532585.localdomain sudo[141104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:52 np0005532585.localdomain python3.9[141106]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:52 np0005532585.localdomain sudo[141104]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:52 np0005532585.localdomain sudo[141179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqrzbynddylwsknpiasomyfuotzpmsqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889412.091772-898-81193597335166/AnsiballZ_copy.py
Nov 23 09:16:52 np0005532585.localdomain sudo[141179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:53 np0005532585.localdomain python3.9[141181]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889412.091772-898-81193597335166/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:53 np0005532585.localdomain sudo[141179]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19009 DF PROTO=TCP SPT=42260 DPT=9102 SEQ=3700835675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3E0A00000000001030307) 
Nov 23 09:16:53 np0005532585.localdomain sudo[141271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdtyubvltqqvzepzdqqmpxjcvjfzzkqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889413.2967846-942-272783423882343/AnsiballZ_command.py
Nov 23 09:16:53 np0005532585.localdomain sudo[141271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:53 np0005532585.localdomain python3.9[141273]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:16:53 np0005532585.localdomain sudo[141271]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:54 np0005532585.localdomain sudo[141364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmwcoolymrmhkudpagbvaausntfhroqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889414.0698397-966-264471028067101/AnsiballZ_command.py
Nov 23 09:16:54 np0005532585.localdomain sudo[141364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:54 np0005532585.localdomain python3.9[141366]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:16:54 np0005532585.localdomain sudo[141364]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:54 np0005532585.localdomain sshd[141395]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:16:55 np0005532585.localdomain sudo[141458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxicoovdvgqrdnwljuyoffkfecbfozvy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763889414.7264435-990-28483166507605/AnsiballZ_edpm_nftables_from_files.py
Nov 23 09:16:55 np0005532585.localdomain sudo[141458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:55 np0005532585.localdomain python3[141460]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 09:16:55 np0005532585.localdomain sudo[141458]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:55 np0005532585.localdomain sudo[141551]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uexuahyqnwrqzmrjzjjvinexjfadpcry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889415.4855816-1014-248390595810784/AnsiballZ_stat.py
Nov 23 09:16:55 np0005532585.localdomain sudo[141551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:56 np0005532585.localdomain python3.9[141553]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:56 np0005532585.localdomain sudo[141551]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:56 np0005532585.localdomain sudo[141624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmtwgtkhbqdbpyskbutohuzqoamneltc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889415.4855816-1014-248390595810784/AnsiballZ_copy.py
Nov 23 09:16:56 np0005532585.localdomain sudo[141624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:56 np0005532585.localdomain sshd[141395]: Connection closed by authenticating user root 185.156.73.233 port 62712 [preauth]
Nov 23 09:16:56 np0005532585.localdomain python3.9[141626]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889415.4855816-1014-248390595810784/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:56 np0005532585.localdomain sudo[141624]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:57 np0005532585.localdomain sudo[141716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckhmpfkthjoctttrswfwlhfqslvwtwzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889416.7638268-1059-194305208002513/AnsiballZ_stat.py
Nov 23 09:16:57 np0005532585.localdomain sudo[141716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:57 np0005532585.localdomain python3.9[141718]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:57 np0005532585.localdomain sudo[141716]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=527 DF PROTO=TCP SPT=43022 DPT=9882 SEQ=1100943462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3F0200000000001030307) 
Nov 23 09:16:57 np0005532585.localdomain sudo[141789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwhnnfbxvqhvybotzpiqiyjtviltshpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889416.7638268-1059-194305208002513/AnsiballZ_copy.py
Nov 23 09:16:57 np0005532585.localdomain sudo[141789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:57 np0005532585.localdomain python3.9[141791]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889416.7638268-1059-194305208002513/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:57 np0005532585.localdomain sudo[141789]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:58 np0005532585.localdomain sudo[141881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlsovxeemnsvzfzqnpfgnicwxmbwhvns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889417.917644-1104-248489580866708/AnsiballZ_stat.py
Nov 23 09:16:58 np0005532585.localdomain sudo[141881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:58 np0005532585.localdomain python3.9[141883]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:58 np0005532585.localdomain sudo[141881]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:58 np0005532585.localdomain sudo[141954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivvtfwohydivfwkfmqlyixiikstdcojm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889417.917644-1104-248489580866708/AnsiballZ_copy.py
Nov 23 09:16:58 np0005532585.localdomain sudo[141954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:58 np0005532585.localdomain python3.9[141956]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889417.917644-1104-248489580866708/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:16:58 np0005532585.localdomain sudo[141954]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:59 np0005532585.localdomain sudo[142046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjeqzedrihxccvftaxiufcqbxquilala ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889419.1506188-1149-139431629150104/AnsiballZ_stat.py
Nov 23 09:16:59 np0005532585.localdomain sudo[142046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:16:59 np0005532585.localdomain python3.9[142048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:16:59 np0005532585.localdomain sudo[142046]: pam_unix(sudo:session): session closed for user root
Nov 23 09:16:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1594 DF PROTO=TCP SPT=58290 DPT=9101 SEQ=2574048636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3F98A0000000001030307) 
Nov 23 09:16:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42867 DF PROTO=TCP SPT=47352 DPT=9105 SEQ=375247215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3FA3B0000000001030307) 
Nov 23 09:17:01 np0005532585.localdomain sudo[142119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxuqtjjfchzldliatxslgouketohqxjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889419.1506188-1149-139431629150104/AnsiballZ_copy.py
Nov 23 09:17:01 np0005532585.localdomain sudo[142119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:01 np0005532585.localdomain python3.9[142121]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889419.1506188-1149-139431629150104/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:01 np0005532585.localdomain sudo[142119]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:01 np0005532585.localdomain sudo[142211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwnjzvmtjmdxsxpmrzxzsqpsmhijbzmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889421.4095821-1194-33983946343571/AnsiballZ_stat.py
Nov 23 09:17:01 np0005532585.localdomain sudo[142211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:01 np0005532585.localdomain python3.9[142213]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:17:01 np0005532585.localdomain sudo[142211]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:02 np0005532585.localdomain sudo[142284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcohgckzgkihqizucmenvyrrdveojcev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889421.4095821-1194-33983946343571/AnsiballZ_copy.py
Nov 23 09:17:02 np0005532585.localdomain sudo[142284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:02 np0005532585.localdomain python3.9[142286]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889421.4095821-1194-33983946343571/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:02 np0005532585.localdomain sudo[142284]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:03 np0005532585.localdomain sudo[142376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prhkurqkdjuhxfllguvcgmyddwjexdqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889423.4364293-1239-65661862719062/AnsiballZ_file.py
Nov 23 09:17:03 np0005532585.localdomain sudo[142376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:03 np0005532585.localdomain python3.9[142378]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:03 np0005532585.localdomain sudo[142376]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:04 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56762 DF PROTO=TCP SPT=46368 DPT=9100 SEQ=2594710531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C40A200000000001030307) 
Nov 23 09:17:04 np0005532585.localdomain sudo[142468]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfpcqpfvtueqsmwlzhegaiwitoedvwqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889424.1186345-1263-29916157930371/AnsiballZ_command.py
Nov 23 09:17:04 np0005532585.localdomain sudo[142468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:04 np0005532585.localdomain python3.9[142470]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:17:04 np0005532585.localdomain sudo[142468]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:05 np0005532585.localdomain sudo[142563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdyqaykahcfnwvtnxtbvkyeoakbhmfzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889424.8281386-1287-108594706812163/AnsiballZ_blockinfile.py
Nov 23 09:17:05 np0005532585.localdomain sudo[142563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:05 np0005532585.localdomain python3.9[142565]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:05 np0005532585.localdomain sudo[142563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40603 DF PROTO=TCP SPT=41034 DPT=9100 SEQ=3304842840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C412210000000001030307) 
Nov 23 09:17:06 np0005532585.localdomain sudo[142656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ossscwjoqaqakvcbyqbfszcexgklfcqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889425.8887975-1314-115280620223805/AnsiballZ_file.py
Nov 23 09:17:06 np0005532585.localdomain sudo[142656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:06 np0005532585.localdomain python3.9[142658]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:06 np0005532585.localdomain sudo[142656]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:06 np0005532585.localdomain sudo[142748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qziteuseccqqtikskhkcvymmildtglxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889426.458127-1314-144726802414581/AnsiballZ_file.py
Nov 23 09:17:06 np0005532585.localdomain sudo[142748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:06 np0005532585.localdomain python3.9[142750]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:06 np0005532585.localdomain sudo[142748]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:07 np0005532585.localdomain sudo[142840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzdfudocxeyfmzoralncnkwmjiqnzjlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889427.2439296-1359-280204900390370/AnsiballZ_mount.py
Nov 23 09:17:07 np0005532585.localdomain sudo[142840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:07 np0005532585.localdomain python3.9[142842]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 23 09:17:07 np0005532585.localdomain sudo[142840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:08 np0005532585.localdomain sudo[142933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfiahazufnuponwirxbxqckapblgiawa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889428.0772452-1359-234512346494865/AnsiballZ_mount.py
Nov 23 09:17:08 np0005532585.localdomain sudo[142933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:08 np0005532585.localdomain python3.9[142935]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 23 09:17:08 np0005532585.localdomain sudo[142933]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:09 np0005532585.localdomain sshd[137692]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:17:09 np0005532585.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Nov 23 09:17:09 np0005532585.localdomain systemd[1]: session-43.scope: Consumed 27.784s CPU time.
Nov 23 09:17:09 np0005532585.localdomain systemd-logind[761]: Session 43 logged out. Waiting for processes to exit.
Nov 23 09:17:09 np0005532585.localdomain systemd-logind[761]: Removed session 43.
Nov 23 09:17:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29806 DF PROTO=TCP SPT=50736 DPT=9882 SEQ=2661700928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C428B90000000001030307) 
Nov 23 09:17:14 np0005532585.localdomain sshd[142952]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:17:15 np0005532585.localdomain sshd[142952]: Accepted publickey for zuul from 192.168.122.31 port 47980 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:17:15 np0005532585.localdomain systemd-logind[761]: New session 44 of user zuul.
Nov 23 09:17:15 np0005532585.localdomain systemd[1]: Started Session 44 of User zuul.
Nov 23 09:17:15 np0005532585.localdomain sshd[142952]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:17:15 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26172 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C438200000000001030307) 
Nov 23 09:17:15 np0005532585.localdomain sudo[143045]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhswftyriennpphfqukhurpotgilrktt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889435.1263144-22-34418071134680/AnsiballZ_tempfile.py
Nov 23 09:17:15 np0005532585.localdomain sudo[143045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:16 np0005532585.localdomain python3.9[143047]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 23 09:17:16 np0005532585.localdomain sudo[143045]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41390 DF PROTO=TCP SPT=48326 DPT=9102 SEQ=2754492045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C439FB0000000001030307) 
Nov 23 09:17:17 np0005532585.localdomain sudo[143137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chelbkgmrycvtwoeyxhryzlfbkfpmjpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889437.1112268-94-47068536464773/AnsiballZ_stat.py
Nov 23 09:17:17 np0005532585.localdomain sudo[143137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:17 np0005532585.localdomain python3.9[143139]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:17:17 np0005532585.localdomain sudo[143137]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:17 np0005532585.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 09:17:19 np0005532585.localdomain sudo[143234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqxtmpmotbptiasyhxmhimttrfcdyzzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889438.9589922-142-200380624886817/AnsiballZ_slurp.py
Nov 23 09:17:19 np0005532585.localdomain sudo[143234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:19 np0005532585.localdomain python3.9[143236]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 23 09:17:19 np0005532585.localdomain sudo[143234]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:19 np0005532585.localdomain sudo[143237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:17:19 np0005532585.localdomain sudo[143237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:17:19 np0005532585.localdomain sudo[143237]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:19 np0005532585.localdomain sudo[143252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:17:19 np0005532585.localdomain sudo[143252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:17:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43701 DF PROTO=TCP SPT=43266 DPT=9102 SEQ=3297613898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C44A210000000001030307) 
Nov 23 09:17:20 np0005532585.localdomain systemd[1]: tmp-crun.GkxPKC.mount: Deactivated successfully.
Nov 23 09:17:20 np0005532585.localdomain podman[143352]: 2025-11-23 09:17:20.523915535 +0000 UTC m=+0.089593207 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:17:20 np0005532585.localdomain podman[143352]: 2025-11-23 09:17:20.641509048 +0000 UTC m=+0.207186760 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12)
Nov 23 09:17:20 np0005532585.localdomain sudo[143252]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:20 np0005532585.localdomain sudo[143492]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iompcatpxydnnqpghtzdtjflckooserh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889440.649029-190-211340832161534/AnsiballZ_stat.py
Nov 23 09:17:20 np0005532585.localdomain sudo[143492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:20 np0005532585.localdomain sudo[143494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:17:20 np0005532585.localdomain sudo[143494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:17:20 np0005532585.localdomain sudo[143494]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:21 np0005532585.localdomain sudo[143510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:17:21 np0005532585.localdomain sudo[143510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:17:21 np0005532585.localdomain python3.9[143507]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.6hjawuys follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:17:21 np0005532585.localdomain sudo[143492]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:21 np0005532585.localdomain sudo[143510]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:21 np0005532585.localdomain sudo[143629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnmwtsgzwottymmfdohvygkwenzsqgfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889440.649029-190-211340832161534/AnsiballZ_copy.py
Nov 23 09:17:21 np0005532585.localdomain sudo[143629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:21 np0005532585.localdomain python3.9[143631]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.6hjawuys mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889440.649029-190-211340832161534/.source.6hjawuys _original_basename=.qnu7sme_ follow=False checksum=86d7095ff15f9038e30789829322247c323137f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:21 np0005532585.localdomain sudo[143629]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:22 np0005532585.localdomain sudo[143646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:17:22 np0005532585.localdomain sudo[143646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:17:22 np0005532585.localdomain sudo[143646]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:23 np0005532585.localdomain sudo[143736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehelbhvtmxlcmllbsclxsxckspqmnwoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889443.2840335-280-117875420988423/AnsiballZ_setup.py
Nov 23 09:17:23 np0005532585.localdomain sudo[143736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:24 np0005532585.localdomain python3.9[143738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:17:24 np0005532585.localdomain sudo[143736]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:25 np0005532585.localdomain sudo[143828]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlcenfwgmycknekcabxnwszncuvzjzxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889445.4010565-329-249321107003780/AnsiballZ_blockinfile.py
Nov 23 09:17:25 np0005532585.localdomain sudo[143828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:26 np0005532585.localdomain python3.9[143830]: ansible-ansible.builtin.blockinfile Invoked with block=np0005532581.localdomain,192.168.122.103,np0005532581* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRibSMIP5+E9lJWuaKDEuCaJoGhGPTqff+o8SP2Twk+NhPOa5FC7WQhHPLXVhKAtlCX60ckYE53Q/H/RVRZ55JdWQLSdY/1tQCD6c0Ry6N+UD+mxo9iN9cHk6vd6J5kJu+v/gBEmFY1A9pjzsD1CTR8gZJHZFqbUTzXrKkoUjK3Kqa8UtvzyhgYQtYIaUwaf1z7CMNQ3A4EaGVKyRsVwb11jlaT9fjB43E3tp9p5EG6PPJEGux/Xea6iHnhSwZHpkD/ylneDOkBbGvYKhL33bpXMcbuHy32jAFr+2Q07sKvgy/b5/f/nTgNCyxEIpoXUbEhX+Vlh+gycU7KJw6FRyR3dQFjooV97NQ/oov2VP9DnTObziZA8lhaJ20ChTfDVUyvFCFi3dKgBUPCeNWCGI69eNHu3dQcwCNJ3kANqhHdkYpBd00PVBritJfxfzH1DCLo0I9CSi1buWYhein9VHZWtzePv/+ucWERRIo+J04QPkV+6P6vgOTRl5U75RctJU=
                                                            np0005532581.localdomain,192.168.122.103,np0005532581* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG7auGqCubvIeT+Z8+DFgAyuqWDpDfRlZtndf8hFQOt7
                                                            np0005532581.localdomain,192.168.122.103,np0005532581* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKGIbd1xFE29cgvdOZ+Uh6ipkdk4QfLnBLiJP+rzeHVtOUTgjR98CvJhrHQdGAxaTty6xRV53oj5EhBdMCJFc5I=
                                                            np0005532584.localdomain,192.168.122.106,np0005532584* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3OrbPXlomvlluk5pGQwXwJu+cR1IMLHg5EnGcI5epB1SB6q/EzlEo5+bOYmmvILsoesUzBIBq21mRhn1Wi2yjlys0pArFDqiLkUBvTW9ro6MKci9Smc12m7AkLus6UO6h3pzqcOdRZQ3KOQDL/83yYJVBCJyqlISXWzzHJpGRVnZHeT4CgKZ1nG5UEvOrtPXRAVWkz3v5TghJrYXvWaPQPmWcEy1rfhCjkCfQY++JB/Dlgammmd1+ZldadeXQi1b2X02a6GFyW0pUMFLjAP7Wr+KcRa5FIPmGwsPuc1NhveAH6zyLrabrh7jPR5O0tBjz9KcNYXbQmJetGt9ZWzFsl0qzXrvI38q5RlGptbqg0iSez61VBAUtnfs33hnYc3dvzJKXReR76PoU3yu/tLrhdK6szqIVsMdw2LGEro7l3KKMKXHSpi8n77fH8ICiU3F5Oif+nvS/e7xr4LccSEnFEHA9PdNxOWxJYLcxTQCt3BkNFrWw4oB1LiDsn98HlS8=
                                                            np0005532584.localdomain,192.168.122.106,np0005532584* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIACxkoVt3BLqmT5JuJibOj2srWJ99rHYxhxT/gCbLdIM
                                                            np0005532584.localdomain,192.168.122.106,np0005532584* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJi5N6oeJPjl3EunvvHi6baJIH9ibE30q8MR/UiZkuoStWh4NAj+cNFWO47723JbHkDzCF1p+3RJ1FLROkiZ4W0=
                                                            np0005532583.localdomain,192.168.122.105,np0005532583* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkB1Cq8AQaEBYTlv5Hzs024jg//D6wieNnvsI5WcYj7wckm9vKTJQfUD6yZBMmyPw6+vVzsM16bj2hagkDR5wkO7uSIaMqWrcoQ1h9HkJQLK8QB0iuzUvQzdr22kUgkLII8thNHK4VxF4VhAKNmzqCofZ4ZSaLUMwauFCFUjx1VJISEZdgYRZ4+++wAN5bdK+WrwSOAHJYJWQX2pRRsPiunSdY1BOUKB3sp7IBcQ3MDJgnKlkR7tiGSYB2W8JsLvIsIb0I2EaqmPUTIzKUuxSJnWEls/WyDT9MNkjhobVeAyFZ5TEik4OvobUhVGJ8CsU7O101KQNQ3IywPM+V0UpjA1yK49z5Qs0LjApmqORsTcjOojYaKGr9n64dVjXdFOMwajB9UmMEFtlIngm6kx7mJQGXqYxVAscW34JY832iKOEzQWrUSdo6mVJ7TXhYYcbdFp+G/128SfhNrbHwKinHeE9Nqu48BR7bmRZXO7ef+UMY1dG3AIvFt4JwFvLihZc=
                                                            np0005532583.localdomain,192.168.122.105,np0005532583* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIH4H0HJaVZZzbQbH92x/ePbqiic7VLTV0Kle7XvCiMNK
                                                            np0005532583.localdomain,192.168.122.105,np0005532583* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEb0/1S3v0DC07ZQnLEp9URjtv9BKwGlPRsb47Ua8w+WgbOM0JmtKaPebzMcBow+04/+k7+HcCDBj6p5Yd4q3M4=
                                                            np0005532585.localdomain,192.168.122.107,np0005532585* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCU6ocW8HWtJJyWPSFUqcN5z70XYnNrE5KeWh/VJ4bDkpVePpxxcdD8r8cKL121q0MKPRgia3jLqnKz+o4MH3AqTAWCZamBc1+ePq9OvZDenK69byea8TM176uYzfePjNlud4LSZ6lfkgneO5jeNE6/RcHgBc8Me+2mlzpavioA814r6Ci6hFaEIOS1Zd2b/yKzI4QRl6xg/aJKvlIe9w3G3BvKOG5pixPx2ng4wYc0OMtJb9ItJgZLY92GGuvVRwn9e0D4lab84+x/Nn3XatQdqU69ev7da/bQCUeBivyEZo03olh56YxCKvNfG3ZYwwhMTn9Hg/EdnwrGHYHj0ZgfSR1+Dzvnk0WW/MRs0276Ojj5O0hhnlaAh5n97W6fgHldGKvdEafYeD602C1Zkd+ISqF13W56MWhtUhiUsdUHShnpM/EBOITg6mTDFP1i/qMS0PjRaCzBpdqpJIoKzQpsi4Z3QTHTZ7uK/lqOEaE/wqXHuYlMKcTuOuX33gIp28k=
                                                            np0005532585.localdomain,192.168.122.107,np0005532585* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILpJc3/w4q1RFXE8+NzyjCJ0R7ySeHFy75KPVpy/YiB/
                                                            np0005532585.localdomain,192.168.122.107,np0005532585* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLtz4IM2aQZoQ7CuTS4jfYDH5LZPyutyvm+ZyFuW7jdHvK3umSrNYFwsqiHwWHvM9peuWot0GAUC8rCc1UO+ZWk=
                                                            np0005532586.localdomain,192.168.122.108,np0005532586* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD6U4JggC29IKqxQ7GjhK23AehQb1S2zLryOxLwLEs9rP0qOZpJ9wR1VsBNLXDCmoRVTsH2+3V00hmkvlanKUuzgmLO61hdur+5NQD0xHnY7lOLpOoyR7hJiMuHj/nRgBLWY2OB8Gim121dgfuc2zRF92igDYe65Uf0et83vWlgRmc7KlziaJ91iVcBUmhGYf3Ij7QxfhQH5TTnGoQizdiBpuP+yVuU2AepbvQ8ZFvzioCwzWAVu/xfdRFp9QyLT4JP1jM6dadTjD5RUAjRL6qR1tLXVq/rvqtXSL8ruBSYm3NCOys9RtdrNolZ7frd+zmvF+VzMNLtlRxiuy1ReR+ZO3felB+4TwfEfLZ+DqE1s3+ksCQH/sVCrxzFsRz5lamWG3p78ZBWTiQ/7WdJS1dQOHz+pKNSSW/NYMIqitxsCsEWPJLq/EWoHVxvjREucCb5YvWHPKOv5RLlbm5lSHFLuFVV8O3AAzD/3JsjTbKGOjJhmtxPCgEy7RPqtIUX90s=
                                                            np0005532586.localdomain,192.168.122.108,np0005532586* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKzaUMbW2RXGluOr1nHypPwK+dIm5zaIFHsNA8PEtRqK
                                                            np0005532586.localdomain,192.168.122.108,np0005532586* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLLaE/jo8XH2dLl/mTc9NRhBP3x+ig/gy7tepiJNCqlj0Dgb5vfu6IYaFNrkyisiqhenCsUZQo/guhdX9Nisv9I=
                                                            np0005532582.localdomain,192.168.122.104,np0005532582* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0v47OVdr7YS/5xSUmMc7u26O7OwPomkdDR6s8rrcencbx7seRSeU00QGeRQcJJ023bD3xk26W8iiJTRUDkYSy//cSfHODdDy+CNEfDUTkGzIjiApoLi2b+S4J6wcAldMsj02MZmx67vUHyM5Qwok+22XqopryL8BiGPJbnoUcZy773f5OKPPMNuj3Fyb7jd5mrC7awK4NniZHyHPYBQeBa234HL42fRjcOqCcxuauy5cbz9PeBv5/kg+nYc8cY5qCyLqNhzMVRUa/PcepMBcfThk17LtPGzCYS7IR2cGdUDP6Pe0QD34Hu6+mpwKwYx73v5uHcmy9CeZ8fK83/F84Lr6jxsiwoU2e+hUfzVRq8gnkjk6kuL86eSM2POSGgBYYgCb+Ma6lOkF1MA+rLAh0gAsUhBgVlz6HtaMoDvLOi/NrQeoQyNE1Pv4vPAndmGGc8A7JCtmCMk9VvMy0Ht4IOvtDJFfx1lg7NuMIKqePYTEk56p8wTUNM+BmdJEhFPU=
                                                            np0005532582.localdomain,192.168.122.104,np0005532582* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEaLDeiqlvIGmYCK/pVle4dWQoWUl9JopG1HgV4OQwpm
                                                            np0005532582.localdomain,192.168.122.104,np0005532582* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPG4t0LXPuGTxEFWkant9P4DDIM9mUsBdh3iJHN1QOZUHW9RJuWVAPGkYlb6jz2BktGBRNU2FJD+HyIE3L+OanQ=
                                                             create=True mode=0644 path=/tmp/ansible.6hjawuys state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:26 np0005532585.localdomain sudo[143828]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:27 np0005532585.localdomain sudo[143920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjbwavjiopfyeyqltmyywfmsdeavgkbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889446.7239437-377-135244924298189/AnsiballZ_command.py
Nov 23 09:17:27 np0005532585.localdomain sudo[143920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:27 np0005532585.localdomain python3.9[143922]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.6hjawuys' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:17:27 np0005532585.localdomain sudo[143920]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:28 np0005532585.localdomain sudo[144014]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlhdukctduvptqhfpfrdhcgzbwunmquh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889448.1207185-425-262542691502043/AnsiballZ_file.py
Nov 23 09:17:28 np0005532585.localdomain sudo[144014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:28 np0005532585.localdomain python3.9[144016]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.6hjawuys state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:28 np0005532585.localdomain sudo[144014]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52718 DF PROTO=TCP SPT=55630 DPT=9101 SEQ=2501216965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C46EBA0000000001030307) 
Nov 23 09:17:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21550 DF PROTO=TCP SPT=45448 DPT=9105 SEQ=1821250571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C46F6B0000000001030307) 
Nov 23 09:17:30 np0005532585.localdomain sshd[142952]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:17:30 np0005532585.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Nov 23 09:17:30 np0005532585.localdomain systemd[1]: session-44.scope: Consumed 4.104s CPU time.
Nov 23 09:17:30 np0005532585.localdomain systemd-logind[761]: Session 44 logged out. Waiting for processes to exit.
Nov 23 09:17:30 np0005532585.localdomain systemd-logind[761]: Removed session 44.
Nov 23 09:17:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28501 DF PROTO=TCP SPT=56188 DPT=9100 SEQ=2021828236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C477E10000000001030307) 
Nov 23 09:17:36 np0005532585.localdomain sshd[144031]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:17:36 np0005532585.localdomain sshd[144031]: Accepted publickey for zuul from 192.168.122.31 port 42666 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:17:36 np0005532585.localdomain systemd-logind[761]: New session 45 of user zuul.
Nov 23 09:17:36 np0005532585.localdomain systemd[1]: Started Session 45 of User zuul.
Nov 23 09:17:36 np0005532585.localdomain sshd[144031]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:17:37 np0005532585.localdomain python3.9[144124]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:17:38 np0005532585.localdomain sudo[144218]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hoijhqgdsazvsuedccoltciaglgxvlmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889457.6853817-57-56992181959315/AnsiballZ_systemd.py
Nov 23 09:17:38 np0005532585.localdomain sudo[144218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:38 np0005532585.localdomain python3.9[144220]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 09:17:38 np0005532585.localdomain sudo[144218]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:40 np0005532585.localdomain sudo[144312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbozdirnbgsseqopfkqlhcfsholbzfmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889459.775485-81-199319177502267/AnsiballZ_systemd.py
Nov 23 09:17:40 np0005532585.localdomain sudo[144312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:40 np0005532585.localdomain python3.9[144314]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:17:40 np0005532585.localdomain sudo[144312]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:41 np0005532585.localdomain sudo[144405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyqegjccesxmjnthvlsvnqxgsucwzmby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889460.6598265-108-231694048418750/AnsiballZ_command.py
Nov 23 09:17:41 np0005532585.localdomain sudo[144405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:41 np0005532585.localdomain python3.9[144407]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:17:41 np0005532585.localdomain sudo[144405]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42259 DF PROTO=TCP SPT=39490 DPT=9882 SEQ=3692313124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C49DE70000000001030307) 
Nov 23 09:17:41 np0005532585.localdomain sudo[144498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzzwygnmsedwlnffikypjujadmcelpli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889461.5127282-132-154537950120413/AnsiballZ_stat.py
Nov 23 09:17:41 np0005532585.localdomain sudo[144498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:42 np0005532585.localdomain python3.9[144500]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:17:42 np0005532585.localdomain sudo[144498]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:42 np0005532585.localdomain sudo[144592]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lihohrmsjwnxhfwnoitbvsyownxizont ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889462.3803284-156-76089975718562/AnsiballZ_command.py
Nov 23 09:17:42 np0005532585.localdomain sudo[144592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:42 np0005532585.localdomain python3.9[144594]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:17:42 np0005532585.localdomain sudo[144592]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:43 np0005532585.localdomain sudo[144687]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhigoxyczueeoqxgpbatekydfqjdmnox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889463.1048274-180-278464593630657/AnsiballZ_file.py
Nov 23 09:17:43 np0005532585.localdomain sudo[144687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:43 np0005532585.localdomain python3.9[144689]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:17:43 np0005532585.localdomain sudo[144687]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:44 np0005532585.localdomain sshd[144031]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:17:44 np0005532585.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Nov 23 09:17:44 np0005532585.localdomain systemd[1]: session-45.scope: Consumed 3.898s CPU time.
Nov 23 09:17:44 np0005532585.localdomain systemd-logind[761]: Session 45 logged out. Waiting for processes to exit.
Nov 23 09:17:44 np0005532585.localdomain systemd-logind[761]: Removed session 45.
Nov 23 09:17:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1404 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4AF2B0000000001030307) 
Nov 23 09:17:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1405 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4B3200000000001030307) 
Nov 23 09:17:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1406 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4BB200000000001030307) 
Nov 23 09:17:50 np0005532585.localdomain sshd[144704]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:17:50 np0005532585.localdomain sshd[144704]: Accepted publickey for zuul from 192.168.122.31 port 37414 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:17:50 np0005532585.localdomain systemd-logind[761]: New session 46 of user zuul.
Nov 23 09:17:50 np0005532585.localdomain systemd[1]: Started Session 46 of User zuul.
Nov 23 09:17:50 np0005532585.localdomain sshd[144704]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:17:51 np0005532585.localdomain python3.9[144797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:17:52 np0005532585.localdomain sudo[144891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bupfsqqkqzpvbaeknzjcrrjovhtaggjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889472.1452408-63-266891915687395/AnsiballZ_setup.py
Nov 23 09:17:52 np0005532585.localdomain sudo[144891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:52 np0005532585.localdomain python3.9[144893]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:17:52 np0005532585.localdomain sudo[144891]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:53 np0005532585.localdomain sudo[144945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-envvvdrrttlpuhkuxmwunkikiobsgsat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889472.1452408-63-266891915687395/AnsiballZ_dnf.py
Nov 23 09:17:53 np0005532585.localdomain sudo[144945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:17:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1407 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4CAE00000000001030307) 
Nov 23 09:17:53 np0005532585.localdomain python3.9[144947]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 09:17:56 np0005532585.localdomain sudo[144945]: pam_unix(sudo:session): session closed for user root
Nov 23 09:17:58 np0005532585.localdomain python3.9[145039]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:17:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33465 DF PROTO=TCP SPT=49740 DPT=9101 SEQ=3566375952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E3EA0000000001030307) 
Nov 23 09:17:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39735 DF PROTO=TCP SPT=34974 DPT=9105 SEQ=3316219862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E49B0000000001030307) 
Nov 23 09:18:00 np0005532585.localdomain sudo[145130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsfqzncemdqaiwrofnnoamzterllfiyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889479.902421-126-105974360732116/AnsiballZ_file.py
Nov 23 09:18:00 np0005532585.localdomain sudo[145130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:00 np0005532585.localdomain python3.9[145132]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:00 np0005532585.localdomain sudo[145130]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:00 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33466 DF PROTO=TCP SPT=49740 DPT=9101 SEQ=3566375952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E7E00000000001030307) 
Nov 23 09:18:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39736 DF PROTO=TCP SPT=34974 DPT=9105 SEQ=3316219862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E8A10000000001030307) 
Nov 23 09:18:01 np0005532585.localdomain sudo[145222]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndfpqnswzouldestgbyshenmftoytfkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889480.7694013-150-189704997559981/AnsiballZ_file.py
Nov 23 09:18:01 np0005532585.localdomain sudo[145222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:01 np0005532585.localdomain python3.9[145224]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:01 np0005532585.localdomain sudo[145222]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:01 np0005532585.localdomain sudo[145314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxxfzodcyjvqsmtbzjgxppbylnkadusf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889481.4612484-174-219575287164088/AnsiballZ_lineinfile.py
Nov 23 09:18:01 np0005532585.localdomain sudo[145314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1408 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4EC200000000001030307) 
Nov 23 09:18:02 np0005532585.localdomain python3.9[145316]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:02 np0005532585.localdomain sudo[145314]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26262 DF PROTO=TCP SPT=45630 DPT=9100 SEQ=992795737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4ED100000000001030307) 
Nov 23 09:18:02 np0005532585.localdomain python3.9[145406]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:18:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39737 DF PROTO=TCP SPT=34974 DPT=9105 SEQ=3316219862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4F0A00000000001030307) 
Nov 23 09:18:03 np0005532585.localdomain python3.9[145496]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:18:04 np0005532585.localdomain python3.9[145588]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:18:04 np0005532585.localdomain sshd[144704]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:18:04 np0005532585.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Nov 23 09:18:04 np0005532585.localdomain systemd[1]: session-46.scope: Consumed 8.857s CPU time.
Nov 23 09:18:04 np0005532585.localdomain systemd-logind[761]: Session 46 logged out. Waiting for processes to exit.
Nov 23 09:18:04 np0005532585.localdomain systemd-logind[761]: Removed session 46.
Nov 23 09:18:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33468 DF PROTO=TCP SPT=49740 DPT=9101 SEQ=3566375952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4FFA00000000001030307) 
Nov 23 09:18:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26265 DF PROTO=TCP SPT=45630 DPT=9100 SEQ=992795737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C508E00000000001030307) 
Nov 23 09:18:10 np0005532585.localdomain sshd[145603]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:18:10 np0005532585.localdomain sshd[145603]: Accepted publickey for zuul from 192.168.122.31 port 37644 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:18:10 np0005532585.localdomain systemd-logind[761]: New session 47 of user zuul.
Nov 23 09:18:10 np0005532585.localdomain systemd[1]: Started Session 47 of User zuul.
Nov 23 09:18:10 np0005532585.localdomain sshd[145603]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:18:12 np0005532585.localdomain python3.9[145696]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:18:12 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23051 DF PROTO=TCP SPT=37928 DPT=9882 SEQ=2886131127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C517200000000001030307) 
Nov 23 09:18:14 np0005532585.localdomain sudo[145790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfxmxbtgftfdztvzlnbbvqxffoidwmai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889494.528886-164-19950887711251/AnsiballZ_file.py
Nov 23 09:18:14 np0005532585.localdomain sudo[145790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23052 DF PROTO=TCP SPT=37928 DPT=9882 SEQ=2886131127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C51F200000000001030307) 
Nov 23 09:18:14 np0005532585.localdomain python3.9[145792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:14 np0005532585.localdomain sudo[145790]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:15 np0005532585.localdomain sudo[145882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouypxsxiqpsodciywbsulkzfdadevdgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889495.1274242-187-190268196656080/AnsiballZ_stat.py
Nov 23 09:18:15 np0005532585.localdomain sudo[145882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:15 np0005532585.localdomain chronyd[137676]: Selected source 208.81.1.244 (pool.ntp.org)
Nov 23 09:18:15 np0005532585.localdomain python3.9[145884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:15 np0005532585.localdomain sudo[145882]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:16 np0005532585.localdomain sudo[145955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouqsecbptxjjuapuwqoulrlpbucjbbtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889495.1274242-187-190268196656080/AnsiballZ_copy.py
Nov 23 09:18:16 np0005532585.localdomain sudo[145955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:16 np0005532585.localdomain python3.9[145957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889495.1274242-187-190268196656080/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:16 np0005532585.localdomain sudo[145955]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:16 np0005532585.localdomain sudo[146047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zadgnfwozazobwthsxdqddcjgnkbhsdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889496.544415-231-61363518351545/AnsiballZ_file.py
Nov 23 09:18:16 np0005532585.localdomain sudo[146047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:17 np0005532585.localdomain python3.9[146049]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:17 np0005532585.localdomain sudo[146047]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:17 np0005532585.localdomain sudo[146139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgzvfqxqsmdueibnspbkjrmwumemckwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889497.1730866-257-41015386399268/AnsiballZ_stat.py
Nov 23 09:18:17 np0005532585.localdomain sudo[146139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:17 np0005532585.localdomain python3.9[146141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:17 np0005532585.localdomain sudo[146139]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:17 np0005532585.localdomain sudo[146212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiioayzjzppykiwzgqzmfjwlqvnlosgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889497.1730866-257-41015386399268/AnsiballZ_copy.py
Nov 23 09:18:17 np0005532585.localdomain sudo[146212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:18 np0005532585.localdomain python3.9[146214]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889497.1730866-257-41015386399268/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:18 np0005532585.localdomain sudo[146212]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1409 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C52C210000000001030307) 
Nov 23 09:18:18 np0005532585.localdomain sudo[146304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnzzxixaakhmvilfjjlvqzlkasdcvsxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889498.3843477-302-219544805778639/AnsiballZ_file.py
Nov 23 09:18:18 np0005532585.localdomain sudo[146304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:18 np0005532585.localdomain python3.9[146306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:18 np0005532585.localdomain sudo[146304]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:19 np0005532585.localdomain sudo[146396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pztailyuiazskctcssgrmshbskewnnst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889498.9804406-328-200131937663724/AnsiballZ_stat.py
Nov 23 09:18:19 np0005532585.localdomain sudo[146396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:19 np0005532585.localdomain python3.9[146398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:19 np0005532585.localdomain sudo[146396]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:19 np0005532585.localdomain sudo[146469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yieozxtsoralodfvmarvxhqponkrovqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889498.9804406-328-200131937663724/AnsiballZ_copy.py
Nov 23 09:18:19 np0005532585.localdomain sudo[146469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:19 np0005532585.localdomain python3.9[146471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889498.9804406-328-200131937663724/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:19 np0005532585.localdomain sudo[146469]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:20 np0005532585.localdomain sudo[146561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jorsbqumxkvballlkifdnjsexnvyqpns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889500.1039433-374-183845244105911/AnsiballZ_file.py
Nov 23 09:18:20 np0005532585.localdomain sudo[146561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:20 np0005532585.localdomain python3.9[146563]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:20 np0005532585.localdomain sudo[146561]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:20 np0005532585.localdomain sudo[146653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgdpraqkhwwnmxpidmpxdqxfycsaeqnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889500.7220058-399-176413160202099/AnsiballZ_stat.py
Nov 23 09:18:20 np0005532585.localdomain sudo[146653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:21 np0005532585.localdomain python3.9[146655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:21 np0005532585.localdomain sudo[146653]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:21 np0005532585.localdomain sudo[146726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbpbodvirtoncbelcznmekzdohylprqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889500.7220058-399-176413160202099/AnsiballZ_copy.py
Nov 23 09:18:21 np0005532585.localdomain sudo[146726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:21 np0005532585.localdomain python3.9[146728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889500.7220058-399-176413160202099/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:21 np0005532585.localdomain sudo[146726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:22 np0005532585.localdomain sudo[146818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbukjxllcdguwmibyabzkzcocxsqtshl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889501.9361079-443-135665010092472/AnsiballZ_file.py
Nov 23 09:18:22 np0005532585.localdomain sudo[146818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:22 np0005532585.localdomain python3.9[146820]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:22 np0005532585.localdomain sudo[146818]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:22 np0005532585.localdomain sudo[146821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:18:22 np0005532585.localdomain sudo[146821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:18:22 np0005532585.localdomain sudo[146821]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:22 np0005532585.localdomain sudo[146850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:18:22 np0005532585.localdomain sudo[146850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:18:22 np0005532585.localdomain sudo[146940]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cewricewifobntzgegaowukexyvzhxkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889502.5241666-468-24465139394778/AnsiballZ_stat.py
Nov 23 09:18:22 np0005532585.localdomain sudo[146940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:22 np0005532585.localdomain python3.9[146942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:22 np0005532585.localdomain sudo[146940]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:23 np0005532585.localdomain sudo[146850]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:23 np0005532585.localdomain sudo[147046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-runufkbhvgzgjrabsqvqxzivjfkkwkvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889502.5241666-468-24465139394778/AnsiballZ_copy.py
Nov 23 09:18:23 np0005532585.localdomain sudo[147046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34860 DF PROTO=TCP SPT=43608 DPT=9102 SEQ=2511018635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C540210000000001030307) 
Nov 23 09:18:23 np0005532585.localdomain python3.9[147048]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889502.5241666-468-24465139394778/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:23 np0005532585.localdomain sudo[147046]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:23 np0005532585.localdomain sudo[147108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:18:23 np0005532585.localdomain sudo[147108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:18:23 np0005532585.localdomain sudo[147108]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:23 np0005532585.localdomain sudo[147153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuqgicjryexajwgeamktehtwvxypwecx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889503.67967-510-7436422215488/AnsiballZ_file.py
Nov 23 09:18:23 np0005532585.localdomain sudo[147153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:24 np0005532585.localdomain python3.9[147155]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:24 np0005532585.localdomain sudo[147153]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:24 np0005532585.localdomain sudo[147245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naqvhpluhvsrwtckcbplppxlssswtcgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889504.355742-535-270140173196383/AnsiballZ_stat.py
Nov 23 09:18:24 np0005532585.localdomain sudo[147245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:24 np0005532585.localdomain python3.9[147247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:24 np0005532585.localdomain sudo[147245]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:25 np0005532585.localdomain sudo[147318]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kobdtzrjtccibvywgrdjblgyutgoyesu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889504.355742-535-270140173196383/AnsiballZ_copy.py
Nov 23 09:18:25 np0005532585.localdomain sudo[147318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:25 np0005532585.localdomain python3.9[147320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889504.355742-535-270140173196383/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:25 np0005532585.localdomain sudo[147318]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:25 np0005532585.localdomain sudo[147411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcefyqfimyvbjbhqroitawsiyoxxhtte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889505.555235-580-128366456596823/AnsiballZ_file.py
Nov 23 09:18:25 np0005532585.localdomain sudo[147411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:26 np0005532585.localdomain python3.9[147413]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:26 np0005532585.localdomain sudo[147411]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:26 np0005532585.localdomain sudo[147503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpuswzgphjpvjnvehxrymrjayteltkpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889506.2064142-603-181960204528158/AnsiballZ_stat.py
Nov 23 09:18:26 np0005532585.localdomain sudo[147503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:26 np0005532585.localdomain python3.9[147505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:26 np0005532585.localdomain sudo[147503]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:27 np0005532585.localdomain sudo[147576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiqlglkjjrczmjnhznnybayqwerwdokw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889506.2064142-603-181960204528158/AnsiballZ_copy.py
Nov 23 09:18:27 np0005532585.localdomain sudo[147576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:27 np0005532585.localdomain python3.9[147578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889506.2064142-603-181960204528158/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:27 np0005532585.localdomain sudo[147576]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23054 DF PROTO=TCP SPT=37928 DPT=9882 SEQ=2886131127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C550200000000001030307) 
Nov 23 09:18:27 np0005532585.localdomain sudo[147668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sltliqjaepjiquwappemowndxavikbrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889507.4556634-652-35156604168590/AnsiballZ_file.py
Nov 23 09:18:27 np0005532585.localdomain sudo[147668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:27 np0005532585.localdomain python3.9[147670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:27 np0005532585.localdomain sudo[147668]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:28 np0005532585.localdomain sudo[147760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twejwuzztqrvxsrxsfboeqxgjzynhmbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889508.0596857-675-236799784048460/AnsiballZ_stat.py
Nov 23 09:18:28 np0005532585.localdomain sudo[147760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:28 np0005532585.localdomain python3.9[147762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:28 np0005532585.localdomain sudo[147760]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:28 np0005532585.localdomain sudo[147833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nssplnouqvhfprtphjkkpwrhjpgowvap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889508.0596857-675-236799784048460/AnsiballZ_copy.py
Nov 23 09:18:28 np0005532585.localdomain sudo[147833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:29 np0005532585.localdomain python3.9[147835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889508.0596857-675-236799784048460/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:29 np0005532585.localdomain sudo[147833]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11262 DF PROTO=TCP SPT=52842 DPT=9101 SEQ=1941546333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5591B0000000001030307) 
Nov 23 09:18:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1189 DF PROTO=TCP SPT=49214 DPT=9105 SEQ=3630460857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C559CB0000000001030307) 
Nov 23 09:18:30 np0005532585.localdomain sshd[145603]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:18:30 np0005532585.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Nov 23 09:18:30 np0005532585.localdomain systemd[1]: session-47.scope: Consumed 11.303s CPU time.
Nov 23 09:18:30 np0005532585.localdomain systemd-logind[761]: Session 47 logged out. Waiting for processes to exit.
Nov 23 09:18:30 np0005532585.localdomain systemd-logind[761]: Removed session 47.
Nov 23 09:18:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11264 DF PROTO=TCP SPT=52842 DPT=9101 SEQ=1941546333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C565210000000001030307) 
Nov 23 09:18:36 np0005532585.localdomain sshd[147851]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:18:36 np0005532585.localdomain sshd[147851]: Accepted publickey for zuul from 192.168.122.31 port 37304 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:18:36 np0005532585.localdomain systemd-logind[761]: New session 48 of user zuul.
Nov 23 09:18:36 np0005532585.localdomain systemd[1]: Started Session 48 of User zuul.
Nov 23 09:18:36 np0005532585.localdomain sshd[147851]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:18:36 np0005532585.localdomain sudo[147944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysdklvwtwybivchnxawdiglnboecatga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889516.295563-27-40734111182252/AnsiballZ_file.py
Nov 23 09:18:36 np0005532585.localdomain sudo[147944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11265 DF PROTO=TCP SPT=52842 DPT=9101 SEQ=1941546333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C574E00000000001030307) 
Nov 23 09:18:36 np0005532585.localdomain python3.9[147946]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:36 np0005532585.localdomain sudo[147944]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:38 np0005532585.localdomain sudo[148036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hplymkyzbrewfdyyceyiredzyfqfjwmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889518.2519784-63-29923199660148/AnsiballZ_stat.py
Nov 23 09:18:38 np0005532585.localdomain sudo[148036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:38 np0005532585.localdomain python3.9[148038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:38 np0005532585.localdomain sudo[148036]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:39 np0005532585.localdomain sudo[148109]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqcjbaliufojkdtoccpjwftnlvtuokoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889518.2519784-63-29923199660148/AnsiballZ_copy.py
Nov 23 09:18:39 np0005532585.localdomain sudo[148109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33013 DF PROTO=TCP SPT=35954 DPT=9100 SEQ=1598340131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C57E200000000001030307) 
Nov 23 09:18:39 np0005532585.localdomain python3.9[148111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889518.2519784-63-29923199660148/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=5f137984986c8cf5df5aec7749430e0dc129d0db backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:39 np0005532585.localdomain sudo[148109]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:40 np0005532585.localdomain sudo[148201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqqlczmvlxtxnekzxrtljcxboewtckqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889519.6122842-63-277090203241126/AnsiballZ_stat.py
Nov 23 09:18:40 np0005532585.localdomain sudo[148201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:40 np0005532585.localdomain python3.9[148203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:18:40 np0005532585.localdomain sudo[148201]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:40 np0005532585.localdomain sudo[148274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubjnfrxroapztikqxhquwjonjcxcyinm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889519.6122842-63-277090203241126/AnsiballZ_copy.py
Nov 23 09:18:40 np0005532585.localdomain sudo[148274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:41 np0005532585.localdomain python3.9[148276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889519.6122842-63-277090203241126/.source.conf _original_basename=ceph.conf follow=False checksum=d6d906a745260c838693e085b1f329bd1daad564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:18:41 np0005532585.localdomain sudo[148274]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:41 np0005532585.localdomain sshd[147851]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:18:41 np0005532585.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Nov 23 09:18:41 np0005532585.localdomain systemd[1]: session-48.scope: Consumed 2.204s CPU time.
Nov 23 09:18:41 np0005532585.localdomain systemd-logind[761]: Session 48 logged out. Waiting for processes to exit.
Nov 23 09:18:41 np0005532585.localdomain systemd-logind[761]: Removed session 48.
Nov 23 09:18:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19783 DF PROTO=TCP SPT=60312 DPT=9882 SEQ=1253814909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C588460000000001030307) 
Nov 23 09:18:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19785 DF PROTO=TCP SPT=60312 DPT=9882 SEQ=1253814909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C594600000000001030307) 
Nov 23 09:18:47 np0005532585.localdomain sshd[148292]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:18:47 np0005532585.localdomain sshd[148292]: Accepted publickey for zuul from 192.168.122.31 port 48012 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:18:47 np0005532585.localdomain systemd-logind[761]: New session 49 of user zuul.
Nov 23 09:18:47 np0005532585.localdomain systemd[1]: Started Session 49 of User zuul.
Nov 23 09:18:47 np0005532585.localdomain sshd[148292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:18:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34862 DF PROTO=TCP SPT=43608 DPT=9102 SEQ=2511018635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5A0210000000001030307) 
Nov 23 09:18:48 np0005532585.localdomain python3.9[148385]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:18:49 np0005532585.localdomain sudo[148479]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slcfraxataywpwdbiagoaunfarhonpbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889528.747558-63-264635133400381/AnsiballZ_file.py
Nov 23 09:18:49 np0005532585.localdomain sudo[148479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:49 np0005532585.localdomain python3.9[148481]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:49 np0005532585.localdomain sudo[148479]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:49 np0005532585.localdomain sudo[148571]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptpgcdedpdughpawivuqymrcrpsdwmao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889529.4711955-63-136509323070273/AnsiballZ_file.py
Nov 23 09:18:49 np0005532585.localdomain sudo[148571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:49 np0005532585.localdomain python3.9[148573]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:18:49 np0005532585.localdomain sudo[148571]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:51 np0005532585.localdomain python3.9[148663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:18:51 np0005532585.localdomain sudo[148753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyxderzhzadiaentrwwfjacvftncleum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889531.5344527-132-15224195107347/AnsiballZ_seboolean.py
Nov 23 09:18:51 np0005532585.localdomain sudo[148753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:52 np0005532585.localdomain python3.9[148755]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 09:18:52 np0005532585.localdomain sudo[148753]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28867 DF PROTO=TCP SPT=33842 DPT=9102 SEQ=2421140341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5B5600000000001030307) 
Nov 23 09:18:53 np0005532585.localdomain sudo[148845]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyfexdzcdvtpqmwewoehjqvvygkmffws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889532.666181-162-262335183776683/AnsiballZ_setup.py
Nov 23 09:18:53 np0005532585.localdomain sudo[148845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:53 np0005532585.localdomain python3.9[148847]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:18:54 np0005532585.localdomain sudo[148845]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:54 np0005532585.localdomain sudo[148899]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odwqcmsokjllpierejovekqhtwqtauid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889532.666181-162-262335183776683/AnsiballZ_dnf.py
Nov 23 09:18:54 np0005532585.localdomain sudo[148899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:54 np0005532585.localdomain python3.9[148901]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:18:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19787 DF PROTO=TCP SPT=60312 DPT=9882 SEQ=1253814909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5C4200000000001030307) 
Nov 23 09:18:57 np0005532585.localdomain sshd[148904]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:18:57 np0005532585.localdomain sshd[148904]: Invalid user solv from 193.32.162.146 port 39960
Nov 23 09:18:57 np0005532585.localdomain sudo[148899]: pam_unix(sudo:session): session closed for user root
Nov 23 09:18:57 np0005532585.localdomain sshd[148904]: Connection closed by invalid user solv 193.32.162.146 port 39960 [preauth]
Nov 23 09:18:58 np0005532585.localdomain sudo[148995]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlypssbgqzyadmvyhwumfcvdzsikzqiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889538.3974645-198-27179273924362/AnsiballZ_systemd.py
Nov 23 09:18:58 np0005532585.localdomain sudo[148995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:18:59 np0005532585.localdomain python3.9[148997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:18:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57340 DF PROTO=TCP SPT=32782 DPT=9101 SEQ=2393264113 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5CE490000000001030307) 
Nov 23 09:18:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5954 DF PROTO=TCP SPT=42362 DPT=9105 SEQ=3315083620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5CEFB0000000001030307) 
Nov 23 09:19:00 np0005532585.localdomain sudo[148995]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:01 np0005532585.localdomain sudo[149090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndaeimiskxgmeumoizvfquxtdgioxkum ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763889540.5961044-222-131670815248607/AnsiballZ_edpm_nftables_snippet.py
Nov 23 09:19:01 np0005532585.localdomain sudo[149090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:01 np0005532585.localdomain python3[149092]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 23 09:19:01 np0005532585.localdomain sudo[149090]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:01 np0005532585.localdomain sudo[149182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esmruiutlqypslpxjobwrimzvukwrldb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889541.6111863-249-153422053240503/AnsiballZ_file.py
Nov 23 09:19:01 np0005532585.localdomain sudo[149182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:02 np0005532585.localdomain python3.9[149184]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:02 np0005532585.localdomain sudo[149182]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:02 np0005532585.localdomain sudo[149274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqayxikpxjhimmkvhschsovftyownicq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889542.3221884-273-129115585186862/AnsiballZ_stat.py
Nov 23 09:19:02 np0005532585.localdomain sudo[149274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57342 DF PROTO=TCP SPT=32782 DPT=9101 SEQ=2393264113 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5DA600000000001030307) 
Nov 23 09:19:02 np0005532585.localdomain python3.9[149276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:02 np0005532585.localdomain sudo[149274]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:03 np0005532585.localdomain sudo[149322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoyfqflsyuzsngsxtltqlaqdvogkrqss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889542.3221884-273-129115585186862/AnsiballZ_file.py
Nov 23 09:19:03 np0005532585.localdomain sudo[149322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:03 np0005532585.localdomain python3.9[149324]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:03 np0005532585.localdomain sudo[149322]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:03 np0005532585.localdomain sudo[149414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yusjosgrvqhowbdyqdsdelxddcyydfwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889543.62417-309-199055503278310/AnsiballZ_stat.py
Nov 23 09:19:03 np0005532585.localdomain sudo[149414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:04 np0005532585.localdomain python3.9[149416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:04 np0005532585.localdomain sudo[149414]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:04 np0005532585.localdomain sudo[149462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhxmtfdwzfajcqqlhmqclzuorskfjfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889543.62417-309-199055503278310/AnsiballZ_file.py
Nov 23 09:19:04 np0005532585.localdomain sudo[149462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:04 np0005532585.localdomain python3.9[149464]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.27u5gurr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:04 np0005532585.localdomain sudo[149462]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:05 np0005532585.localdomain sudo[149554]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uenbcyzqbmegfsukczoayypqpqwiphir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889544.721469-345-242585619100124/AnsiballZ_stat.py
Nov 23 09:19:05 np0005532585.localdomain sudo[149554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:06 np0005532585.localdomain python3.9[149556]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:06 np0005532585.localdomain sudo[149554]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:06 np0005532585.localdomain sudo[149602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivqbmpgiyoklvkrzzfqvjriomlqsnfsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889544.721469-345-242585619100124/AnsiballZ_file.py
Nov 23 09:19:06 np0005532585.localdomain sudo[149602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26268 DF PROTO=TCP SPT=45630 DPT=9100 SEQ=992795737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5E8200000000001030307) 
Nov 23 09:19:06 np0005532585.localdomain python3.9[149604]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:06 np0005532585.localdomain sudo[149602]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:07 np0005532585.localdomain sudo[149694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbedpmqovebgijrwfqmvrlsovayzjamy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889546.802897-384-163890701212391/AnsiballZ_command.py
Nov 23 09:19:07 np0005532585.localdomain sudo[149694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:07 np0005532585.localdomain python3.9[149696]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:19:07 np0005532585.localdomain sudo[149694]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:08 np0005532585.localdomain sudo[149787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpyawajcburxehwxjuflwvhqjaxtdvfi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763889547.6300793-408-2330219134981/AnsiballZ_edpm_nftables_from_files.py
Nov 23 09:19:08 np0005532585.localdomain sudo[149787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:09 np0005532585.localdomain python3[149789]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 09:19:09 np0005532585.localdomain sudo[149787]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31382 DF PROTO=TCP SPT=45264 DPT=9100 SEQ=3512850857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5F3200000000001030307) 
Nov 23 09:19:09 np0005532585.localdomain sudo[149879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgbqwkqwfoedvwfkdhoualezkzfnisae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889549.3007653-432-252287506334687/AnsiballZ_stat.py
Nov 23 09:19:09 np0005532585.localdomain sudo[149879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:09 np0005532585.localdomain python3.9[149881]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:09 np0005532585.localdomain sudo[149879]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:10 np0005532585.localdomain sudo[149954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oswagntkeriymljvrhbudiefcfsycguf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889549.3007653-432-252287506334687/AnsiballZ_copy.py
Nov 23 09:19:10 np0005532585.localdomain sudo[149954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:10 np0005532585.localdomain python3.9[149956]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889549.3007653-432-252287506334687/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:10 np0005532585.localdomain sudo[149954]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:10 np0005532585.localdomain sudo[150046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddynkzllpjxaayhdkwvwifdgwnuasscz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889550.679844-477-167595841453505/AnsiballZ_stat.py
Nov 23 09:19:10 np0005532585.localdomain sudo[150046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:11 np0005532585.localdomain python3.9[150048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:11 np0005532585.localdomain sudo[150046]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:11 np0005532585.localdomain sudo[150121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhkbchyirfmqtcuwgnbghyhcphoqowte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889550.679844-477-167595841453505/AnsiballZ_copy.py
Nov 23 09:19:11 np0005532585.localdomain sudo[150121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:11 np0005532585.localdomain python3.9[150123]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889550.679844-477-167595841453505/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:11 np0005532585.localdomain sudo[150121]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61088 DF PROTO=TCP SPT=58554 DPT=9882 SEQ=73534146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5FD760000000001030307) 
Nov 23 09:19:12 np0005532585.localdomain sudo[150213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sinyrfxbigztyifwynlwztkfuafgclxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889552.3813303-522-128284585097782/AnsiballZ_stat.py
Nov 23 09:19:12 np0005532585.localdomain sudo[150213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:12 np0005532585.localdomain python3.9[150215]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:12 np0005532585.localdomain sudo[150213]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:13 np0005532585.localdomain sudo[150288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsywbvhuibiqvsdnhmfxdzhmfwylscop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889552.3813303-522-128284585097782/AnsiballZ_copy.py
Nov 23 09:19:13 np0005532585.localdomain sudo[150288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:13 np0005532585.localdomain python3.9[150290]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889552.3813303-522-128284585097782/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:13 np0005532585.localdomain sudo[150288]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:13 np0005532585.localdomain sudo[150380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbijgjbcnhsroxdsgtfpklehoqhdbhmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889553.5688262-567-256022277069330/AnsiballZ_stat.py
Nov 23 09:19:13 np0005532585.localdomain sudo[150380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:14 np0005532585.localdomain python3.9[150382]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:14 np0005532585.localdomain sudo[150380]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:14 np0005532585.localdomain sudo[150455]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkewacfzqjixgpmixasvgxnbvmtkinzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889553.5688262-567-256022277069330/AnsiballZ_copy.py
Nov 23 09:19:14 np0005532585.localdomain sudo[150455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:14 np0005532585.localdomain python3.9[150457]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889553.5688262-567-256022277069330/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:14 np0005532585.localdomain sudo[150455]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61090 DF PROTO=TCP SPT=58554 DPT=9882 SEQ=73534146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C609600000000001030307) 
Nov 23 09:19:15 np0005532585.localdomain sudo[150547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwiudayajegkeutuegfkierbcsiqdfuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889554.892879-612-237908566681366/AnsiballZ_stat.py
Nov 23 09:19:15 np0005532585.localdomain sudo[150547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:15 np0005532585.localdomain python3.9[150549]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:15 np0005532585.localdomain sudo[150547]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:15 np0005532585.localdomain sudo[150622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzkwsttcwkmdtwcvlqzzibaiidwrudfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889554.892879-612-237908566681366/AnsiballZ_copy.py
Nov 23 09:19:15 np0005532585.localdomain sudo[150622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:16 np0005532585.localdomain python3.9[150624]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889554.892879-612-237908566681366/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:16 np0005532585.localdomain sudo[150622]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28869 DF PROTO=TCP SPT=33842 DPT=9102 SEQ=2421140341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C616200000000001030307) 
Nov 23 09:19:19 np0005532585.localdomain sudo[150714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzmxrbjjcnazxuooxudbagdrfpgahuiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889559.0994594-657-99207858227011/AnsiballZ_file.py
Nov 23 09:19:19 np0005532585.localdomain sudo[150714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:19 np0005532585.localdomain python3.9[150716]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:19 np0005532585.localdomain sudo[150714]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:20 np0005532585.localdomain sudo[150806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmbznnanllvmchzeftkcxdgirjkgxnfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889559.7644308-681-10118113026736/AnsiballZ_command.py
Nov 23 09:19:20 np0005532585.localdomain sudo[150806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:20 np0005532585.localdomain python3.9[150808]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:19:20 np0005532585.localdomain sudo[150806]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:21 np0005532585.localdomain sudo[150901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbekbdyjqeehdtfvbsrpxvnvqrokapuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889560.454339-705-55365373865836/AnsiballZ_blockinfile.py
Nov 23 09:19:21 np0005532585.localdomain sudo[150901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:21 np0005532585.localdomain python3.9[150903]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:21 np0005532585.localdomain sudo[150901]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:22 np0005532585.localdomain sudo[150993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlizpxorhdvsrnsowgimhkvnclvdqmtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889561.9931443-732-43957587074560/AnsiballZ_command.py
Nov 23 09:19:22 np0005532585.localdomain sudo[150993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:22 np0005532585.localdomain python3.9[150995]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:19:22 np0005532585.localdomain sudo[150993]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:22 np0005532585.localdomain sudo[151086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orgvjpesioalggvjhpvopsodacvfaxek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889562.7028446-756-166783153289645/AnsiballZ_stat.py
Nov 23 09:19:22 np0005532585.localdomain sudo[151086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:23 np0005532585.localdomain python3.9[151088]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:19:23 np0005532585.localdomain sudo[151086]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40792 DF PROTO=TCP SPT=52980 DPT=9102 SEQ=3629331533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C62AA00000000001030307) 
Nov 23 09:19:23 np0005532585.localdomain sudo[151180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htthblxovpczhvaybvimvmgognhqjlme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889563.3471928-780-214031881697082/AnsiballZ_command.py
Nov 23 09:19:23 np0005532585.localdomain sudo[151180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:23 np0005532585.localdomain python3.9[151182]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:19:23 np0005532585.localdomain sudo[151180]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:24 np0005532585.localdomain sudo[151200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:19:24 np0005532585.localdomain sudo[151200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:19:24 np0005532585.localdomain sudo[151200]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:24 np0005532585.localdomain sudo[151233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:19:24 np0005532585.localdomain sudo[151233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:19:24 np0005532585.localdomain sudo[151305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkjpspxnxpnsauxrhsvxtcebkynohkjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889564.0486052-804-226963798349307/AnsiballZ_file.py
Nov 23 09:19:24 np0005532585.localdomain sudo[151305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:24 np0005532585.localdomain python3.9[151307]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:24 np0005532585.localdomain sudo[151305]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:24 np0005532585.localdomain sudo[151233]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:25 np0005532585.localdomain sudo[151415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:19:25 np0005532585.localdomain sudo[151415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:19:25 np0005532585.localdomain sudo[151415]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:25 np0005532585.localdomain python3.9[151441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:19:26 np0005532585.localdomain sudo[151534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztwvszrsllndtvvhjlhiphntdmwivjhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889566.546444-924-122348638241107/AnsiballZ_command.py
Nov 23 09:19:26 np0005532585.localdomain sudo[151534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:27 np0005532585.localdomain python3.9[151536]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005532585.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:1d:b8:fa:41" external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:19:27 np0005532585.localdomain ovs-vsctl[151537]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005532585.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:1d:b8:fa:41 external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 23 09:19:27 np0005532585.localdomain sudo[151534]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61092 DF PROTO=TCP SPT=58554 DPT=9882 SEQ=73534146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C63A210000000001030307) 
Nov 23 09:19:27 np0005532585.localdomain sudo[151627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkyojzriuzwqpacdbdyqxttoorsnjdlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889567.3087332-951-80446199037189/AnsiballZ_command.py
Nov 23 09:19:27 np0005532585.localdomain sudo[151627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:27 np0005532585.localdomain python3.9[151629]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:19:27 np0005532585.localdomain sudo[151627]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:28 np0005532585.localdomain python3.9[151722]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:19:29 np0005532585.localdomain sudo[151814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyawrkugenndotvwmfyzdgousjqpuygf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889568.7937784-1005-78622810470679/AnsiballZ_file.py
Nov 23 09:19:29 np0005532585.localdomain sudo[151814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:29 np0005532585.localdomain python3.9[151816]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:19:29 np0005532585.localdomain sudo[151814]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34980 DF PROTO=TCP SPT=35690 DPT=9101 SEQ=2587280388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6437A0000000001030307) 
Nov 23 09:19:29 np0005532585.localdomain sudo[151906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxvqismuebueosxkzxfctftryatdxild ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889569.521878-1029-92417809544633/AnsiballZ_stat.py
Nov 23 09:19:29 np0005532585.localdomain sudo[151906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57869 DF PROTO=TCP SPT=40678 DPT=9105 SEQ=182249479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6442B0000000001030307) 
Nov 23 09:19:30 np0005532585.localdomain python3.9[151908]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:30 np0005532585.localdomain sudo[151906]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:30 np0005532585.localdomain sudo[151954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fevotxlwonipnztrbqozzpwfyxbjrhwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889569.521878-1029-92417809544633/AnsiballZ_file.py
Nov 23 09:19:30 np0005532585.localdomain sudo[151954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:30 np0005532585.localdomain python3.9[151956]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:19:30 np0005532585.localdomain sudo[151954]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:30 np0005532585.localdomain sudo[152046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjosjkhkwmyxtpumthgkgefjxiywabcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889570.6230032-1029-7694330003481/AnsiballZ_stat.py
Nov 23 09:19:30 np0005532585.localdomain sudo[152046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:31 np0005532585.localdomain python3.9[152048]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:31 np0005532585.localdomain sudo[152046]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:31 np0005532585.localdomain sudo[152094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjaqhgnqcfthirdllryupnpfhkbgohft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889570.6230032-1029-7694330003481/AnsiballZ_file.py
Nov 23 09:19:31 np0005532585.localdomain sudo[152094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:31 np0005532585.localdomain python3.9[152096]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:19:31 np0005532585.localdomain sudo[152094]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:32 np0005532585.localdomain sudo[152186]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdchrhhkwvnkbhzczryinqtreefqndau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889571.7317867-1098-264999542683662/AnsiballZ_file.py
Nov 23 09:19:32 np0005532585.localdomain sudo[152186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:32 np0005532585.localdomain python3.9[152188]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:32 np0005532585.localdomain sudo[152186]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34982 DF PROTO=TCP SPT=35690 DPT=9101 SEQ=2587280388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C64FA10000000001030307) 
Nov 23 09:19:33 np0005532585.localdomain sudo[152278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfjdbuwxesunftndbgiqxlkaatpbromg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889573.016813-1122-2962050818923/AnsiballZ_stat.py
Nov 23 09:19:33 np0005532585.localdomain sudo[152278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:33 np0005532585.localdomain python3.9[152280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:33 np0005532585.localdomain sudo[152278]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:33 np0005532585.localdomain sudo[152326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpdcreiauomguwxuacrnroklteqwemxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889573.016813-1122-2962050818923/AnsiballZ_file.py
Nov 23 09:19:33 np0005532585.localdomain sudo[152326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:33 np0005532585.localdomain python3.9[152328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:33 np0005532585.localdomain sudo[152326]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:34 np0005532585.localdomain sudo[152418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bftaojigpelyxgadechsghgnwjndbbzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889574.1587968-1158-29446886743931/AnsiballZ_stat.py
Nov 23 09:19:34 np0005532585.localdomain sudo[152418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:34 np0005532585.localdomain python3.9[152420]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:34 np0005532585.localdomain sudo[152418]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:35 np0005532585.localdomain sudo[152466]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itbblcxymsacswptkxbvjllwdzmyvgfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889574.1587968-1158-29446886743931/AnsiballZ_file.py
Nov 23 09:19:35 np0005532585.localdomain sudo[152466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:35 np0005532585.localdomain python3.9[152468]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:35 np0005532585.localdomain sudo[152466]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33016 DF PROTO=TCP SPT=35954 DPT=9100 SEQ=1598340131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C65C200000000001030307) 
Nov 23 09:19:36 np0005532585.localdomain sudo[152558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkdwshwdflwnhcuehsixaneiuvqlydqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889575.889213-1194-16737808049005/AnsiballZ_systemd.py
Nov 23 09:19:36 np0005532585.localdomain sudo[152558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:36 np0005532585.localdomain python3.9[152560]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:19:36 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:19:36 np0005532585.localdomain systemd-sysv-generator[152587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:19:36 np0005532585.localdomain systemd-rc-local-generator[152584]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:19:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:19:36 np0005532585.localdomain sudo[152558]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:37 np0005532585.localdomain sudo[152688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dysommsbwafmkhoazjeilholwbpopafc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889577.0065901-1218-97808158601475/AnsiballZ_stat.py
Nov 23 09:19:37 np0005532585.localdomain sudo[152688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:37 np0005532585.localdomain python3.9[152690]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:37 np0005532585.localdomain sudo[152688]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:37 np0005532585.localdomain sudo[152736]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwwabbcdsgkcbtlgqhakpgqbxaqafbik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889577.0065901-1218-97808158601475/AnsiballZ_file.py
Nov 23 09:19:37 np0005532585.localdomain sudo[152736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:37 np0005532585.localdomain python3.9[152738]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:37 np0005532585.localdomain sudo[152736]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:38 np0005532585.localdomain sudo[152828]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfpvsyeozutwyqsczxphyuiyijmianur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889578.0983913-1254-116155586212368/AnsiballZ_stat.py
Nov 23 09:19:38 np0005532585.localdomain sudo[152828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:38 np0005532585.localdomain python3.9[152830]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:38 np0005532585.localdomain sudo[152828]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:38 np0005532585.localdomain sudo[152876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqvzyjqiohxqlnkkzfnjfugooxvdhhrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889578.0983913-1254-116155586212368/AnsiballZ_file.py
Nov 23 09:19:38 np0005532585.localdomain sudo[152876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:38 np0005532585.localdomain python3.9[152878]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:38 np0005532585.localdomain sudo[152876]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53347 DF PROTO=TCP SPT=38346 DPT=9100 SEQ=2464325010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C668600000000001030307) 
Nov 23 09:19:39 np0005532585.localdomain sudo[152968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jalqtcdjxulcbmzqsvafqblcmaxzdyge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889579.1981974-1290-184614160275004/AnsiballZ_systemd.py
Nov 23 09:19:39 np0005532585.localdomain sudo[152968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:39 np0005532585.localdomain python3.9[152970]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:19:39 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:19:39 np0005532585.localdomain systemd-rc-local-generator[152996]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:19:39 np0005532585.localdomain systemd-sysv-generator[152999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:19:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:19:40 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:19:40 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:19:40 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:19:40 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:19:40 np0005532585.localdomain sudo[152968]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:40 np0005532585.localdomain sudo[153102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbyrvnqdiklpbmtpsbxpyucdvnvhtiqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889580.5723462-1320-194890423654621/AnsiballZ_file.py
Nov 23 09:19:40 np0005532585.localdomain sudo[153102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:41 np0005532585.localdomain python3.9[153104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:19:41 np0005532585.localdomain sudo[153102]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:41 np0005532585.localdomain sudo[153194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgtsrxhmbctaiqtzbvzjtqvmtgnenubt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889581.2415519-1344-260646082026101/AnsiballZ_stat.py
Nov 23 09:19:41 np0005532585.localdomain sudo[153194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:41 np0005532585.localdomain python3.9[153196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:41 np0005532585.localdomain sudo[153194]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2261 DF PROTO=TCP SPT=44926 DPT=9882 SEQ=3902380751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C672A60000000001030307) 
Nov 23 09:19:42 np0005532585.localdomain sudo[153267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxekanprqshsumlqhkulszdporqjznya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889581.2415519-1344-260646082026101/AnsiballZ_copy.py
Nov 23 09:19:42 np0005532585.localdomain sudo[153267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:42 np0005532585.localdomain python3.9[153269]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889581.2415519-1344-260646082026101/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:19:42 np0005532585.localdomain sudo[153267]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:43 np0005532585.localdomain sudo[153359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itobmiknvtzjcrtbnimegzrwzjzwuigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889582.8418837-1395-224833137703522/AnsiballZ_file.py
Nov 23 09:19:43 np0005532585.localdomain sudo[153359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:43 np0005532585.localdomain python3.9[153361]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:19:43 np0005532585.localdomain sudo[153359]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:43 np0005532585.localdomain sudo[153451]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqspvltlqxvyteejweuonhcfjavdqdgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889583.5034494-1419-171512185753192/AnsiballZ_stat.py
Nov 23 09:19:43 np0005532585.localdomain sudo[153451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:43 np0005532585.localdomain python3.9[153453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:19:43 np0005532585.localdomain sudo[153451]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:44 np0005532585.localdomain sudo[153526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyqtenbxilawkfuvknfyuygvijjuopqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889583.5034494-1419-171512185753192/AnsiballZ_copy.py
Nov 23 09:19:44 np0005532585.localdomain sudo[153526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:44 np0005532585.localdomain python3.9[153528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889583.5034494-1419-171512185753192/.source.json _original_basename=.4ot7ghoo follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:44 np0005532585.localdomain sudo[153526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2263 DF PROTO=TCP SPT=44926 DPT=9882 SEQ=3902380751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C67EA10000000001030307) 
Nov 23 09:19:45 np0005532585.localdomain sudo[153618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elbdskhzbkjifwomwiqpzncwqovgypjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889584.817638-1464-169451447502307/AnsiballZ_file.py
Nov 23 09:19:45 np0005532585.localdomain sudo[153618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:45 np0005532585.localdomain python3.9[153620]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:45 np0005532585.localdomain sudo[153618]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:46 np0005532585.localdomain sudo[153710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbvzjfgkfytvizmhmaeyfejptyffctij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889585.665493-1488-258998997551398/AnsiballZ_stat.py
Nov 23 09:19:46 np0005532585.localdomain sudo[153710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:46 np0005532585.localdomain sudo[153710]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:46 np0005532585.localdomain sudo[153783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozyxtuheadwjpvknswfoleiauqxcalms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889585.665493-1488-258998997551398/AnsiballZ_copy.py
Nov 23 09:19:46 np0005532585.localdomain sudo[153783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:47 np0005532585.localdomain sudo[153783]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40794 DF PROTO=TCP SPT=52980 DPT=9102 SEQ=3629331533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C68A200000000001030307) 
Nov 23 09:19:47 np0005532585.localdomain sudo[153875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewhstgbfzwdfllovnoitjugnyxxppxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889587.5117831-1539-185231977698661/AnsiballZ_container_config_data.py
Nov 23 09:19:47 np0005532585.localdomain sudo[153875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:48 np0005532585.localdomain python3.9[153877]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 23 09:19:48 np0005532585.localdomain sudo[153875]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:49 np0005532585.localdomain sudo[153967]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whjlwoftvfsohcaptbchxwwzfytsppaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889589.1029148-1566-63877778906763/AnsiballZ_container_config_hash.py
Nov 23 09:19:49 np0005532585.localdomain sudo[153967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:49 np0005532585.localdomain python3.9[153969]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:19:49 np0005532585.localdomain sudo[153967]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:50 np0005532585.localdomain sudo[154059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kounfhdivaupkerybgvaeujihorlnuzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889590.0424874-1593-276902807672837/AnsiballZ_podman_container_info.py
Nov 23 09:19:50 np0005532585.localdomain sudo[154059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:50 np0005532585.localdomain python3.9[154061]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 09:19:51 np0005532585.localdomain sudo[154059]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55080 DF PROTO=TCP SPT=52796 DPT=9102 SEQ=3989621481 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C69FA00000000001030307) 
Nov 23 09:19:54 np0005532585.localdomain sudo[154180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-civhzoqclluvdmmnpkmojultwfmzkmfe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763889594.1612775-1632-52841600588251/AnsiballZ_edpm_container_manage.py
Nov 23 09:19:54 np0005532585.localdomain sudo[154180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:54 np0005532585.localdomain python3[154182]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:19:55 np0005532585.localdomain python3[154182]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c",
                                                                    "Digest": "sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:40:43.504967825Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 345731014,
                                                                    "VirtualSize": 345731014,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:2e0f9ca9a8387a3566096aacaecfe5797e3fc2585f07cb97a1706897fa1a86a3",
                                                                              "sha256:db37b2d335b44e6a9cb2eb88713051bc469233d1e0a06670f1303bc9539b97a0"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:24.710862694Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:21.866159039Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:23.706789841Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:40:08.281534836Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:40:43.503408191Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:40:44.731010683Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 09:19:55 np0005532585.localdomain podman[154232]: 2025-11-23 09:19:55.346636879 +0000 UTC m=+0.092538965 container remove 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:19:55 np0005532585.localdomain python3[154182]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Nov 23 09:19:55 np0005532585.localdomain podman[154246]: 
Nov 23 09:19:55 np0005532585.localdomain podman[154246]: 2025-11-23 09:19:55.456348282 +0000 UTC m=+0.091462912 container create 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:19:55 np0005532585.localdomain podman[154246]: 2025-11-23 09:19:55.413827684 +0000 UTC m=+0.048942354 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 09:19:55 np0005532585.localdomain python3[154182]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 09:19:55 np0005532585.localdomain sudo[154180]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:56 np0005532585.localdomain sudo[154372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvatisaebnaeywggqgyshgbvwgblxqoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889596.0964549-1656-204112809310356/AnsiballZ_stat.py
Nov 23 09:19:56 np0005532585.localdomain sudo[154372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:56 np0005532585.localdomain python3.9[154374]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:19:56 np0005532585.localdomain sudo[154372]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2265 DF PROTO=TCP SPT=44926 DPT=9882 SEQ=3902380751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6AE210000000001030307) 
Nov 23 09:19:59 np0005532585.localdomain sudo[154466]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhrucicqvvboquzycieoydsstcoaozbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889598.7479842-1683-90079153814174/AnsiballZ_file.py
Nov 23 09:19:59 np0005532585.localdomain sudo[154466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:59 np0005532585.localdomain python3.9[154468]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:19:59 np0005532585.localdomain sudo[154466]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:59 np0005532585.localdomain sudo[154512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxskzwxnwancjicyasdahbadowxzfbxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889598.7479842-1683-90079153814174/AnsiballZ_stat.py
Nov 23 09:19:59 np0005532585.localdomain sudo[154512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:19:59 np0005532585.localdomain python3.9[154514]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:19:59 np0005532585.localdomain sudo[154512]: pam_unix(sudo:session): session closed for user root
Nov 23 09:19:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29676 DF PROTO=TCP SPT=51840 DPT=9101 SEQ=568539744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6B8AA0000000001030307) 
Nov 23 09:19:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5710 DF PROTO=TCP SPT=59366 DPT=9105 SEQ=1415999924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6B95B0000000001030307) 
Nov 23 09:20:00 np0005532585.localdomain sudo[154603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrobjjgqkcarauwsqzuraazvajgfsryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889599.659171-1683-122901520042008/AnsiballZ_copy.py
Nov 23 09:20:00 np0005532585.localdomain sudo[154603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:00 np0005532585.localdomain python3.9[154605]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763889599.659171-1683-122901520042008/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:00 np0005532585.localdomain sudo[154603]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:00 np0005532585.localdomain sudo[154649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bojdhjwcvuoeokdfregxcgvfelaipukf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889599.659171-1683-122901520042008/AnsiballZ_systemd.py
Nov 23 09:20:00 np0005532585.localdomain sudo[154649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:01 np0005532585.localdomain python3.9[154651]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:20:01 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:20:01 np0005532585.localdomain systemd-sysv-generator[154679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:20:01 np0005532585.localdomain systemd-rc-local-generator[154674]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:20:01 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:20:01 np0005532585.localdomain sudo[154649]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:01 np0005532585.localdomain sudo[154731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqxsbrsuegbhexybocgkiyeceuybicdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889599.659171-1683-122901520042008/AnsiballZ_systemd.py
Nov 23 09:20:01 np0005532585.localdomain sudo[154731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:02 np0005532585.localdomain python3.9[154733]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:20:02 np0005532585.localdomain systemd-rc-local-generator[154759]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:20:02 np0005532585.localdomain systemd-sysv-generator[154765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Starting ovn_controller container...
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:20:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b7408e2b9210e95a255d08712fe6a0aa83c4e2a632b9f510c8ec27c22bd8bbc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:20:02 np0005532585.localdomain podman[154774]: 2025-11-23 09:20:02.636359143 +0000 UTC m=+0.131960868 container init 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: tmp-crun.wolQKz.mount: Deactivated successfully.
Nov 23 09:20:02 np0005532585.localdomain ovn_controller[154788]: + sudo -E kolla_set_configs
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:20:02 np0005532585.localdomain podman[154774]: 2025-11-23 09:20:02.669459806 +0000 UTC m=+0.165061491 container start 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:20:02 np0005532585.localdomain edpm-start-podman-container[154774]: ovn_controller
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 23 09:20:02 np0005532585.localdomain edpm-start-podman-container[154773]: Creating additional drop-in dependency for "ovn_controller" (2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543)
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:20:02 np0005532585.localdomain podman[154795]: 2025-11-23 09:20:02.827451552 +0000 UTC m=+0.152550712 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:20:02 np0005532585.localdomain podman[154795]: 2025-11-23 09:20:02.843147468 +0000 UTC m=+0.168246668 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:20:02 np0005532585.localdomain podman[154795]: unhealthy
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Queued start job for default target Main User Target.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Created slice User Application Slice.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Reached target Paths.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Reached target Timers.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Starting D-Bus User Message Bus Socket...
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Starting Create User's Volatile Files and Directories...
Nov 23 09:20:02 np0005532585.localdomain systemd-rc-local-generator[154872]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Finished Create User's Volatile Files and Directories.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Listening on D-Bus User Message Bus Socket.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Reached target Sockets.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Reached target Basic System.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Reached target Main User Target.
Nov 23 09:20:02 np0005532585.localdomain systemd[154817]: Startup finished in 132ms.
Nov 23 09:20:02 np0005532585.localdomain systemd-sysv-generator[154875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:20:02 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:20:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5712 DF PROTO=TCP SPT=59366 DPT=9105 SEQ=1415999924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6C5600000000001030307) 
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: Started User Manager for UID 0.
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: Started ovn_controller container.
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Failed with result 'exit-code'.
Nov 23 09:20:03 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Nov 23 09:20:03 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 09:20:03 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:20:03 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: Started Session c12 of User root.
Nov 23 09:20:03 np0005532585.localdomain sudo[154731]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: INFO:__main__:Validating config file
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: INFO:__main__:Writing out command to execute
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: ++ cat /run_command
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + ARGS=
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + sudo kolla_copy_cacerts
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: Started Session c13 of User root.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + [[ ! -n '' ]]
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + . kolla_extend_start
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + umask 0022
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Nov 23 09:20:03 np0005532585.localdomain systemd[1]: session-c13.scope: Deactivated successfully.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00013|main|INFO|OVS feature set changed, force recompute.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00021|main|INFO|OVS feature set changed, force recompute.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-c237ed-0
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-49b8a0-0
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-b7d5b3-0
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00026|binding|INFO|Claiming lport d3912d14-a3e0-4df9-b811-f3bd90f44559 for this chassis.
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00027|binding|INFO|d3912d14-a3e0-4df9-b811-f3bd90f44559: Claiming fa:16:3e:cf:aa:3b 192.168.0.77
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00028|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00029|binding|INFO|Removing lport d3912d14-a3e0-4df9-b811-f3bd90f44559 ovn-installed in OVS
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-c237ed-0
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-49b8a0-0
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-b7d5b3-0
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00033|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00034|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00035|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00036|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00037|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:03Z|00038|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:04Z|00039|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:04Z|00040|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:04 np0005532585.localdomain sudo[154988]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fueicshiucvapcsqjjqcgilrfaojgzrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889604.5126362-1767-115967579215400/AnsiballZ_command.py
Nov 23 09:20:04 np0005532585.localdomain sudo[154988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:04 np0005532585.localdomain python3.9[154990]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:20:05 np0005532585.localdomain ovs-vsctl[154991]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 23 09:20:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:05Z|00041|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:05 np0005532585.localdomain sudo[154988]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:05Z|00042|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:20:05 np0005532585.localdomain sudo[155081]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koaestksyhbehtqdrcplovsitdnsfiki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889605.2047837-1791-18930102888705/AnsiballZ_command.py
Nov 23 09:20:05 np0005532585.localdomain sudo[155081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:05 np0005532585.localdomain python3.9[155083]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:20:05 np0005532585.localdomain ovs-vsctl[155085]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 23 09:20:05 np0005532585.localdomain sudo[155081]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31385 DF PROTO=TCP SPT=45264 DPT=9100 SEQ=3512850857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6D2210000000001030307) 
Nov 23 09:20:06 np0005532585.localdomain sudo[155176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ousgooqhzimbjnjochtyfuzpfbxcuzki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889606.1824112-1833-52926978765004/AnsiballZ_command.py
Nov 23 09:20:06 np0005532585.localdomain sudo[155176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:06 np0005532585.localdomain python3.9[155178]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:20:06 np0005532585.localdomain ovs-vsctl[155179]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 23 09:20:06 np0005532585.localdomain sudo[155176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:07 np0005532585.localdomain sshd[148292]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:20:07 np0005532585.localdomain systemd-logind[761]: Session 49 logged out. Waiting for processes to exit.
Nov 23 09:20:07 np0005532585.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Nov 23 09:20:07 np0005532585.localdomain systemd[1]: session-49.scope: Consumed 39.948s CPU time.
Nov 23 09:20:07 np0005532585.localdomain systemd-logind[761]: Removed session 49.
Nov 23 09:20:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40269 DF PROTO=TCP SPT=32894 DPT=9100 SEQ=2119202913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6DDA00000000001030307) 
Nov 23 09:20:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:11Z|00043|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 ovn-installed in OVS
Nov 23 09:20:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:11Z|00044|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 up in Southbound
Nov 23 09:20:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59514 DF PROTO=TCP SPT=32770 DPT=9882 SEQ=1739129171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6E7D70000000001030307) 
Nov 23 09:20:13 np0005532585.localdomain sshd[155195]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:20:13 np0005532585.localdomain sshd[155195]: Accepted publickey for zuul from 192.168.122.30 port 46156 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:20:13 np0005532585.localdomain systemd-logind[761]: New session 51 of user zuul.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: Started Session 51 of User zuul.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Activating special unit Exit the Session...
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped target Main User Target.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped target Basic System.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped target Paths.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped target Sockets.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped target Timers.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Closed D-Bus User Message Bus Socket.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Stopped Create User's Volatile Files and Directories.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Removed slice User Application Slice.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Reached target Shutdown.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Finished Exit the Session.
Nov 23 09:20:13 np0005532585.localdomain systemd[154817]: Reached target Exit the Session.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 09:20:13 np0005532585.localdomain sshd[155195]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 09:20:13 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 23 09:20:14 np0005532585.localdomain python3.9[155289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:20:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59516 DF PROTO=TCP SPT=32770 DPT=9882 SEQ=1739129171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6F3E00000000001030307) 
Nov 23 09:20:15 np0005532585.localdomain sudo[155383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmxbpmizcklscqsriolfkepyhjsysckv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889614.9199157-63-147427246240298/AnsiballZ_file.py
Nov 23 09:20:15 np0005532585.localdomain sudo[155383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:15 np0005532585.localdomain python3.9[155385]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:15 np0005532585.localdomain sudo[155383]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:15 np0005532585.localdomain sudo[155475]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgkirflsdrcvgimsdmkmzbbaoxlewwrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889615.6386752-63-56247816935851/AnsiballZ_file.py
Nov 23 09:20:15 np0005532585.localdomain sudo[155475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:16 np0005532585.localdomain python3.9[155477]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:16 np0005532585.localdomain sudo[155475]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:16 np0005532585.localdomain sudo[155567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsxsnurtywmbmelqojsdjvqvgytltkut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889616.160995-63-69692824109318/AnsiballZ_file.py
Nov 23 09:20:16 np0005532585.localdomain sudo[155567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:16 np0005532585.localdomain python3.9[155569]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:16 np0005532585.localdomain sudo[155567]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:16 np0005532585.localdomain sudo[155659]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okzbyvnvdeoxjzgbaizazszjmouvwyqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889616.6869621-63-154692715862684/AnsiballZ_file.py
Nov 23 09:20:16 np0005532585.localdomain sudo[155659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:17 np0005532585.localdomain python3.9[155661]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:17 np0005532585.localdomain sudo[155659]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:17 np0005532585.localdomain sudo[155751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zymxbtdaqswqzgssdzlyuixdrcftwiqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889617.1833951-63-157768054028662/AnsiballZ_file.py
Nov 23 09:20:17 np0005532585.localdomain sudo[155751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:17 np0005532585.localdomain python3.9[155753]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:17 np0005532585.localdomain sudo[155751]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55082 DF PROTO=TCP SPT=52796 DPT=9102 SEQ=3989621481 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C700210000000001030307) 
Nov 23 09:20:18 np0005532585.localdomain python3.9[155843]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:20:19 np0005532585.localdomain sudo[155933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sruveyfsxgvdeqvwbhrwpimgctchljtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889619.0277877-195-268629014103660/AnsiballZ_seboolean.py
Nov 23 09:20:19 np0005532585.localdomain sudo[155933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:19 np0005532585.localdomain python3.9[155935]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 09:20:19 np0005532585.localdomain sudo[155933]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:20 np0005532585.localdomain python3.9[156025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:21 np0005532585.localdomain python3.9[156098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889619.9685419-219-65904435578925/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:21 np0005532585.localdomain python3.9[156189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:22 np0005532585.localdomain python3.9[156262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889621.4241328-264-30766783328851/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:22 np0005532585.localdomain sudo[156352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agwiuuevihwhqlnbusgknlonqvimsgbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889622.6891172-315-55457128349375/AnsiballZ_setup.py
Nov 23 09:20:22 np0005532585.localdomain sudo[156352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:23 np0005532585.localdomain python3.9[156354]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:20:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57049 DF PROTO=TCP SPT=41364 DPT=9102 SEQ=4036052507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C714E00000000001030307) 
Nov 23 09:20:23 np0005532585.localdomain sudo[156352]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:24 np0005532585.localdomain sudo[156406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gicndcywiiybovjohczvnealyldxqrml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889622.6891172-315-55457128349375/AnsiballZ_dnf.py
Nov 23 09:20:24 np0005532585.localdomain sudo[156406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:24 np0005532585.localdomain python3.9[156408]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:20:25 np0005532585.localdomain sudo[156411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:20:25 np0005532585.localdomain sudo[156411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:20:25 np0005532585.localdomain sudo[156411]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:25 np0005532585.localdomain sudo[156426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:20:25 np0005532585.localdomain sudo[156426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:20:26 np0005532585.localdomain sudo[156426]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:27 np0005532585.localdomain sudo[156472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:20:27 np0005532585.localdomain sudo[156472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:20:27 np0005532585.localdomain sudo[156472]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59518 DF PROTO=TCP SPT=32770 DPT=9882 SEQ=1739129171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C724200000000001030307) 
Nov 23 09:20:27 np0005532585.localdomain sudo[156406]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:28 np0005532585.localdomain sudo[156576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoksaghdrfyadqjblvgdxllpokddfsbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889628.2211733-351-131027018923840/AnsiballZ_systemd.py
Nov 23 09:20:28 np0005532585.localdomain sudo[156576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:29 np0005532585.localdomain python3.9[156578]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:20:29 np0005532585.localdomain sudo[156576]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27028 DF PROTO=TCP SPT=51376 DPT=9101 SEQ=1016082811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C72DDA0000000001030307) 
Nov 23 09:20:29 np0005532585.localdomain python3.9[156671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27969 DF PROTO=TCP SPT=33576 DPT=9105 SEQ=1273410697 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C72E8B0000000001030307) 
Nov 23 09:20:30 np0005532585.localdomain python3.9[156742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889629.414031-375-44334779818017/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:30 np0005532585.localdomain python3.9[156832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:31 np0005532585.localdomain python3.9[156903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889630.449876-375-191040011858470/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27030 DF PROTO=TCP SPT=51376 DPT=9101 SEQ=1016082811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C739E10000000001030307) 
Nov 23 09:20:33 np0005532585.localdomain python3.9[156993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:33 np0005532585.localdomain python3.9[157064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889632.6645074-507-2814951783767/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:20:34 np0005532585.localdomain podman[157155]: 2025-11-23 09:20:34.034509806 +0000 UTC m=+0.085221728 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:20:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:34Z|00045|memory|INFO|16912 kB peak resident set size after 30.8 seconds
Nov 23 09:20:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:34Z|00046|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:288 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:153 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67
Nov 23 09:20:34 np0005532585.localdomain python3.9[157154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:34 np0005532585.localdomain podman[157155]: 2025-11-23 09:20:34.107243646 +0000 UTC m=+0.157955548 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:20:34 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:20:34 np0005532585.localdomain python3.9[157250]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889633.6388807-507-194698171034653/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:35 np0005532585.localdomain python3.9[157340]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:20:35 np0005532585.localdomain sudo[157432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmoznwknzcszjjdkehznckipsmbhfeye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889635.6195085-621-118230267816228/AnsiballZ_file.py
Nov 23 09:20:35 np0005532585.localdomain sudo[157432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53350 DF PROTO=TCP SPT=38346 DPT=9100 SEQ=2464325010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C746200000000001030307) 
Nov 23 09:20:36 np0005532585.localdomain python3.9[157434]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:36 np0005532585.localdomain sudo[157432]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:36 np0005532585.localdomain sudo[157524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-embwbqrxcjgedgfvqdseasrnmijiaovt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889636.2974892-645-208964027320599/AnsiballZ_stat.py
Nov 23 09:20:36 np0005532585.localdomain sudo[157524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:36 np0005532585.localdomain python3.9[157526]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:36 np0005532585.localdomain sudo[157524]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:36 np0005532585.localdomain sudo[157572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohaffnlcoweisswdhrzfwqixuaatqzno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889636.2974892-645-208964027320599/AnsiballZ_file.py
Nov 23 09:20:36 np0005532585.localdomain sudo[157572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:37 np0005532585.localdomain python3.9[157574]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:37 np0005532585.localdomain sudo[157572]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:37 np0005532585.localdomain sudo[157664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdipixfcnbheyogtimurkwhdejgtpuiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889637.306476-645-9923095180200/AnsiballZ_stat.py
Nov 23 09:20:37 np0005532585.localdomain sudo[157664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:37 np0005532585.localdomain python3.9[157666]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:37 np0005532585.localdomain sudo[157664]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:38 np0005532585.localdomain sudo[157712]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpakqzvyxiycayatawdfbhstledfnmvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889637.306476-645-9923095180200/AnsiballZ_file.py
Nov 23 09:20:38 np0005532585.localdomain sudo[157712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:38 np0005532585.localdomain python3.9[157714]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:38 np0005532585.localdomain sudo[157712]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:39 np0005532585.localdomain sudo[157804]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hupvkuphqbzbzpwecynytpkdomffgufz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889638.8352234-714-26957630133391/AnsiballZ_file.py
Nov 23 09:20:39 np0005532585.localdomain sudo[157804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:39 np0005532585.localdomain python3.9[157806]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39601 DF PROTO=TCP SPT=47358 DPT=9100 SEQ=2691335382 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C752E10000000001030307) 
Nov 23 09:20:39 np0005532585.localdomain sudo[157804]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:39 np0005532585.localdomain sudo[157896]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsdvhtkklwnqvvbihxcwcmiyzdafuiec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889639.4557662-738-255616319576861/AnsiballZ_stat.py
Nov 23 09:20:39 np0005532585.localdomain sudo[157896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:39 np0005532585.localdomain python3.9[157898]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:39 np0005532585.localdomain sudo[157896]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:40 np0005532585.localdomain sudo[157944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zojnicseayiaiifjljuxpcbqwduicqhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889639.4557662-738-255616319576861/AnsiballZ_file.py
Nov 23 09:20:40 np0005532585.localdomain sudo[157944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:40 np0005532585.localdomain python3.9[157946]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:40 np0005532585.localdomain sudo[157944]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:41 np0005532585.localdomain sudo[158036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpildevkjupjczejeyhwmwuclewlbzqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889640.9273767-774-53838325880493/AnsiballZ_stat.py
Nov 23 09:20:41 np0005532585.localdomain sudo[158036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:41 np0005532585.localdomain python3.9[158038]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:20:41Z|00047|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Nov 23 09:20:41 np0005532585.localdomain sudo[158036]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:41 np0005532585.localdomain sudo[158084]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlzzdaoikipuwwzwwwniiiwnltapelqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889640.9273767-774-53838325880493/AnsiballZ_file.py
Nov 23 09:20:41 np0005532585.localdomain sudo[158084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:41 np0005532585.localdomain python3.9[158086]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:41 np0005532585.localdomain sudo[158084]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10290 DF PROTO=TCP SPT=34348 DPT=9882 SEQ=1978032639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C75D060000000001030307) 
Nov 23 09:20:42 np0005532585.localdomain sudo[158176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kayzcjxulhqpxinxtvialalwurotoeds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889642.0337548-810-139942519821150/AnsiballZ_systemd.py
Nov 23 09:20:42 np0005532585.localdomain sudo[158176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:42 np0005532585.localdomain python3.9[158178]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:20:42 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:20:42 np0005532585.localdomain systemd-rc-local-generator[158204]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:20:42 np0005532585.localdomain systemd-sysv-generator[158210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:20:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:20:42 np0005532585.localdomain sudo[158176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:43 np0005532585.localdomain sudo[158307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcvwbrghaeuhuugayxbgtgqfjarniofg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889643.1880326-834-192547898119791/AnsiballZ_stat.py
Nov 23 09:20:43 np0005532585.localdomain sudo[158307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:43 np0005532585.localdomain python3.9[158309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:43 np0005532585.localdomain sudo[158307]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:43 np0005532585.localdomain sudo[158355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztcplmxfaqeqltmevltcfyofscebqsyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889643.1880326-834-192547898119791/AnsiballZ_file.py
Nov 23 09:20:43 np0005532585.localdomain sudo[158355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:43 np0005532585.localdomain python3.9[158357]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:44 np0005532585.localdomain sudo[158355]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:44 np0005532585.localdomain sudo[158447]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfnnczfsvqssiqrtzwwbktkfpzvvxwdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889644.3250446-870-205500591599526/AnsiballZ_stat.py
Nov 23 09:20:44 np0005532585.localdomain sudo[158447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:44 np0005532585.localdomain python3.9[158449]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:44 np0005532585.localdomain sudo[158447]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:44 np0005532585.localdomain sudo[158495]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmqpdkahltkxvailvyesbshbkqzuidct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889644.3250446-870-205500591599526/AnsiballZ_file.py
Nov 23 09:20:44 np0005532585.localdomain sudo[158495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10292 DF PROTO=TCP SPT=34348 DPT=9882 SEQ=1978032639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C769200000000001030307) 
Nov 23 09:20:45 np0005532585.localdomain python3.9[158497]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:45 np0005532585.localdomain sudo[158495]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:46 np0005532585.localdomain sudo[158587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuxbixyqojfovogzsnnrbzpbpdjlftdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889646.4782903-906-227525699342637/AnsiballZ_systemd.py
Nov 23 09:20:46 np0005532585.localdomain sudo[158587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:47 np0005532585.localdomain python3.9[158589]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:20:47 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:20:47 np0005532585.localdomain systemd-rc-local-generator[158611]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:20:47 np0005532585.localdomain systemd-sysv-generator[158615]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:20:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:20:47 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:20:47 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:20:47 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:20:47 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:20:47 np0005532585.localdomain sudo[158587]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:48 np0005532585.localdomain sudo[158721]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yimxvbdylynsfqhpaxvlmlcdtrlyneto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889647.8641076-936-182095688003417/AnsiballZ_file.py
Nov 23 09:20:48 np0005532585.localdomain sudo[158721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:48 np0005532585.localdomain python3.9[158723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57051 DF PROTO=TCP SPT=41364 DPT=9102 SEQ=4036052507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C776200000000001030307) 
Nov 23 09:20:48 np0005532585.localdomain sudo[158721]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:48 np0005532585.localdomain sudo[158813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwfxeutydfgupmmvlnmoojkxtbenpcmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889648.5463421-960-192936594959607/AnsiballZ_stat.py
Nov 23 09:20:48 np0005532585.localdomain sudo[158813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:49 np0005532585.localdomain python3.9[158815]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:49 np0005532585.localdomain sudo[158813]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:49 np0005532585.localdomain sudo[158886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxujkevknyfxhhmmkhvfharkkwzmgskj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889648.5463421-960-192936594959607/AnsiballZ_copy.py
Nov 23 09:20:49 np0005532585.localdomain sudo[158886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:49 np0005532585.localdomain python3.9[158888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889648.5463421-960-192936594959607/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:49 np0005532585.localdomain sudo[158886]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:50 np0005532585.localdomain sudo[158978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atdrhdwslloqhvryvjtojvbftixnyavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889650.017828-1011-84847969690875/AnsiballZ_file.py
Nov 23 09:20:50 np0005532585.localdomain sudo[158978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:50 np0005532585.localdomain python3.9[158980]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:20:50 np0005532585.localdomain sudo[158978]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:51 np0005532585.localdomain sudo[159070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhpiomyqasqpdrcfgjkejvhygohcimfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889650.7320113-1035-275893690886433/AnsiballZ_stat.py
Nov 23 09:20:51 np0005532585.localdomain sudo[159070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:51 np0005532585.localdomain python3.9[159072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:20:51 np0005532585.localdomain sudo[159070]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:51 np0005532585.localdomain sudo[159145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrrlgtkjjgsbeovmaebfesqkxapeyssf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889650.7320113-1035-275893690886433/AnsiballZ_copy.py
Nov 23 09:20:51 np0005532585.localdomain sudo[159145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:52 np0005532585.localdomain python3.9[159147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889650.7320113-1035-275893690886433/.source.json _original_basename=.631w_e77 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:52 np0005532585.localdomain sudo[159145]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:52 np0005532585.localdomain sudo[159237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sddgwohdbjroddtappeihynekimnoqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889652.4015856-1080-1657881977498/AnsiballZ_file.py
Nov 23 09:20:52 np0005532585.localdomain sudo[159237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:52 np0005532585.localdomain python3.9[159239]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:20:52 np0005532585.localdomain sudo[159237]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53478 DF PROTO=TCP SPT=38202 DPT=9102 SEQ=3774868968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C78A210000000001030307) 
Nov 23 09:20:53 np0005532585.localdomain sudo[159329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqebjahmxlnmytmloogczpdyutjmtnws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889653.5894437-1104-29328874186721/AnsiballZ_stat.py
Nov 23 09:20:53 np0005532585.localdomain sudo[159329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:54 np0005532585.localdomain sudo[159329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:54 np0005532585.localdomain sudo[159402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdzadxpekyqvoovfmritpjsuwzyxttvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889653.5894437-1104-29328874186721/AnsiballZ_copy.py
Nov 23 09:20:54 np0005532585.localdomain sudo[159402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:54 np0005532585.localdomain sudo[159402]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:55 np0005532585.localdomain sudo[159494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrxxnfslggwzghglqievwkgfsruidiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889654.960415-1155-160394471490097/AnsiballZ_container_config_data.py
Nov 23 09:20:55 np0005532585.localdomain sudo[159494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:55 np0005532585.localdomain python3.9[159496]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 23 09:20:55 np0005532585.localdomain sudo[159494]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:56 np0005532585.localdomain sudo[159586]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcndicumvwuwjfbhristixzutieengrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889655.7952914-1182-139438624352815/AnsiballZ_container_config_hash.py
Nov 23 09:20:56 np0005532585.localdomain sudo[159586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:56 np0005532585.localdomain python3.9[159588]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:20:56 np0005532585.localdomain sudo[159586]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:57 np0005532585.localdomain sudo[159678]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izekxhshlyenjdthvmxlbdkbyupxbffh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889656.6642241-1209-218436824943088/AnsiballZ_podman_container_info.py
Nov 23 09:20:57 np0005532585.localdomain sudo[159678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:20:57 np0005532585.localdomain python3.9[159680]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 09:20:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10294 DF PROTO=TCP SPT=34348 DPT=9882 SEQ=1978032639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C79A210000000001030307) 
Nov 23 09:20:57 np0005532585.localdomain sudo[159678]: pam_unix(sudo:session): session closed for user root
Nov 23 09:20:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14322 DF PROTO=TCP SPT=45352 DPT=9101 SEQ=1901109381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7A30A0000000001030307) 
Nov 23 09:20:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29006 DF PROTO=TCP SPT=59772 DPT=9105 SEQ=2659339466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7A3BB0000000001030307) 
Nov 23 09:21:01 np0005532585.localdomain sudo[159797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvrpyhrxmktwcbmvpruwgxdkwcyclnit ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763889660.6918678-1248-178962896251614/AnsiballZ_edpm_container_manage.py
Nov 23 09:21:01 np0005532585.localdomain sudo[159797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:01 np0005532585.localdomain python3[159799]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:21:01 np0005532585.localdomain python3[159799]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9",
                                                                    "Digest": "sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:31:40.431364621Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784198911,
                                                                    "VirtualSize": 784198911,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc/diff:/var/lib/containers/storage/overlay/cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",
                                                                              "sha256:03228f16e908b0892695bcc077f4378f9669ff86bd51a3747df5ce9269c56477",
                                                                              "sha256:1bc9c5b4c351caaeaa6b900805b43669e78b079f06d9048393517dd05690b8dc",
                                                                              "sha256:83d6638c009d9ced6da21e0f659e23221a9a8d7c283582e370f21a7551100a49"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:19.349843192Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:59.347040136Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:02.744397841Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:22.172613539Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:23.378330332Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:54.286044475Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:59.88193659Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:21:10.229245412Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:21:18.925309245Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:29:36.981771672Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:30:52.373347656Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:30:56.391767756Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:30:59.591401711Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:31:40.430358392Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:31:40.430397643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:31:43.914387084Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:21:01 np0005532585.localdomain podman[159851]: 2025-11-23 09:21:01.813237534 +0000 UTC m=+0.065844018 container remove 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 23 09:21:01 np0005532585.localdomain python3[159799]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Nov 23 09:21:01 np0005532585.localdomain podman[159864]: 
Nov 23 09:21:01 np0005532585.localdomain podman[159864]: 2025-11-23 09:21:01.916592011 +0000 UTC m=+0.087201968 container create 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 09:21:01 np0005532585.localdomain podman[159864]: 2025-11-23 09:21:01.874142098 +0000 UTC m=+0.044752105 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:21:01 np0005532585.localdomain python3[159799]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:21:02 np0005532585.localdomain sudo[159797]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:02 np0005532585.localdomain sudo[159992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpgoeieifvrninkewqtnqxhopukwicoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889662.3927925-1272-192295403090267/AnsiballZ_stat.py
Nov 23 09:21:02 np0005532585.localdomain sudo[159992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:02 np0005532585.localdomain python3.9[159994]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:21:02 np0005532585.localdomain sudo[159992]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14324 DF PROTO=TCP SPT=45352 DPT=9101 SEQ=1901109381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7AF200000000001030307) 
Nov 23 09:21:03 np0005532585.localdomain sudo[160086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrbvcwsuenljnhrhlgrtpozrfqgglvzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889663.595632-1299-78725436935360/AnsiballZ_file.py
Nov 23 09:21:03 np0005532585.localdomain sudo[160086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:04 np0005532585.localdomain python3.9[160088]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:04 np0005532585.localdomain sudo[160086]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:04 np0005532585.localdomain sudo[160132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnncenywwayjjyzuqugoksmvjgdnotwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889663.595632-1299-78725436935360/AnsiballZ_stat.py
Nov 23 09:21:04 np0005532585.localdomain sudo[160132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:21:04 np0005532585.localdomain podman[160135]: 2025-11-23 09:21:04.349908626 +0000 UTC m=+0.087058805 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 23 09:21:04 np0005532585.localdomain podman[160135]: 2025-11-23 09:21:04.416412283 +0000 UTC m=+0.153562452 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 23 09:21:04 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:21:04 np0005532585.localdomain python3.9[160134]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:21:04 np0005532585.localdomain sudo[160132]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:04 np0005532585.localdomain sudo[160249]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaafmzsuxphznwwtocclhrqtluhjglnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889664.5191727-1299-147141301336279/AnsiballZ_copy.py
Nov 23 09:21:04 np0005532585.localdomain sudo[160249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:05 np0005532585.localdomain python3.9[160251]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763889664.5191727-1299-147141301336279/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:05 np0005532585.localdomain sudo[160249]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:05 np0005532585.localdomain sudo[160295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oagngusvkaipqddbqxvthllidvvybrym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889664.5191727-1299-147141301336279/AnsiballZ_systemd.py
Nov 23 09:21:05 np0005532585.localdomain sudo[160295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:06 np0005532585.localdomain python3.9[160297]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:21:06 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:21:06 np0005532585.localdomain systemd-rc-local-generator[160318]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:21:06 np0005532585.localdomain systemd-sysv-generator[160321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:21:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40272 DF PROTO=TCP SPT=32894 DPT=9100 SEQ=2119202913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7BC200000000001030307) 
Nov 23 09:21:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:06 np0005532585.localdomain sudo[160295]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:06 np0005532585.localdomain sudo[160376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwcwtgkpcqqelnurbprirmtlojhnzdwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889664.5191727-1299-147141301336279/AnsiballZ_systemd.py
Nov 23 09:21:06 np0005532585.localdomain sudo[160376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:07 np0005532585.localdomain python3.9[160378]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:21:07 np0005532585.localdomain systemd-rc-local-generator[160405]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:21:07 np0005532585.localdomain systemd-sysv-generator[160408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Starting ovn_metadata_agent container...
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: tmp-crun.bYdoGC.mount: Deactivated successfully.
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:21:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77181651e1832fcae1b91722194a2a35e769c1643eafd1391688458a62aabf61/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 23 09:21:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77181651e1832fcae1b91722194a2a35e769c1643eafd1391688458a62aabf61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:21:07 np0005532585.localdomain podman[160420]: 2025-11-23 09:21:07.588933646 +0000 UTC m=+0.147056671 container init 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + sudo -E kolla_set_configs
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:21:07 np0005532585.localdomain podman[160420]: 2025-11-23 09:21:07.619826842 +0000 UTC m=+0.177949787 container start 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:21:07 np0005532585.localdomain edpm-start-podman-container[160420]: ovn_metadata_agent
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Validating config file
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Copying service configuration files
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Writing out command to execute
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: ++ cat /run_command
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + CMD=neutron-ovn-metadata-agent
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + ARGS=
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + sudo kolla_copy_cacerts
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + [[ ! -n '' ]]
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + . kolla_extend_start
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: Running command: 'neutron-ovn-metadata-agent'
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + umask 0022
Nov 23 09:21:07 np0005532585.localdomain ovn_metadata_agent[160434]: + exec neutron-ovn-metadata-agent
Nov 23 09:21:07 np0005532585.localdomain edpm-start-podman-container[160419]: Creating additional drop-in dependency for "ovn_metadata_agent" (9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e)
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:21:07 np0005532585.localdomain systemd-sysv-generator[160510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:21:07 np0005532585.localdomain systemd-rc-local-generator[160504]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:21:07 np0005532585.localdomain podman[160443]: 2025-11-23 09:21:07.771860665 +0000 UTC m=+0.146391260 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 09:21:07 np0005532585.localdomain podman[160443]: 2025-11-23 09:21:07.800446759 +0000 UTC m=+0.174977314 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:21:07 np0005532585.localdomain systemd[1]: Started ovn_metadata_agent container.
Nov 23 09:21:07 np0005532585.localdomain sudo[160376]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:08 np0005532585.localdomain sshd[155195]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:21:08 np0005532585.localdomain systemd-logind[761]: Session 51 logged out. Waiting for processes to exit.
Nov 23 09:21:08 np0005532585.localdomain systemd[1]: session-51.scope: Deactivated successfully.
Nov 23 09:21:08 np0005532585.localdomain systemd[1]: session-51.scope: Consumed 30.137s CPU time.
Nov 23 09:21:08 np0005532585.localdomain systemd-logind[761]: Removed session 51.
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.197 160439 INFO neutron.common.config [-] Logging enabled!
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 23 09:21:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52272 DF PROTO=TCP SPT=36808 DPT=9100 SEQ=427320919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7C7E00000000001030307) 
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.252 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 26f986a7-6ac7-4ec2-887b-8da6da04a661 (UUID: 26f986a7-6ac7-4ec2-887b-8da6da04a661) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.269 160439 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.269 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.269 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.270 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.271 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.275 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.287 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:aa:3b 192.168.0.77'], port_security=['fa:16:3e:cf:aa:3b 192.168.0.77'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.77/24', 'neutron:device_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005532585.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '1915d3e5d4254231a0517e2dcf35848f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4fe931b0-155c-49e2-b5a5-44d74fa72e9e 6afcee2e-50ee-4b3c-9d1f-24ea7a5a850b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ca41f3-3e94-4959-b1b5-1e81bd2c9bc1, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d3912d14-a3e0-4df9-b811-f3bd90f44559) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.288 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '26f986a7-6ac7-4ec2-887b-8da6da04a661'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], external_ids={'neutron:ovn-metadata-id': '1338671a-acb4-5368-872c-4c2204284319', 'neutron:ovn-metadata-sb-cfg': '1'}, name=26f986a7-6ac7-4ec2-887b-8da6da04a661, nb_cfg_timestamp=1763889611858, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.289 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d3912d14-a3e0-4df9-b811-f3bd90f44559 in datapath bcac49fc-c589-475a-91a8-00a0ba9c2b33 bound to our chassis on insert
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.289 160439 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f08bf8f1b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 INFO oslo_service.service [-] Starting 1 workers
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.293 160439 DEBUG oslo_service.service [-] Started child 160537 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.295 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bcac49fc-c589-475a-91a8-00a0ba9c2b33
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.296 160537 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-242845'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.296 160439 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpkqnv75g4/privsep.sock']
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.310 160537 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.311 160537 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.311 160537 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.316 160537 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.322 160537 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.331 160537 INFO eventlet.wsgi.server [-] (160537) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.894 160439 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.895 160439 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkqnv75g4/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.797 160542 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.802 160542 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.805 160542 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.806 160542 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160542
Nov 23 09:21:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:09.897 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3e87f35a-0aa3-4e6e-b78d-5b2fe079c1ab]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:10.280 160542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:21:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:10.280 160542 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:21:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:10.280 160542 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:21:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:10.731 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3ce10a-f0e4-4307-bde3-3c39859cff0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:10.732 160439 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpicyi7c_h/privsep.sock']
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.362 160439 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.363 160439 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpicyi7c_h/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.255 160553 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.261 160553 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.264 160553 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.264 160553 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160553
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.366 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[154dcbcb-2615-429f-b846-df0a4dc7b9f6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.831 160553 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.831 160553 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:21:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:11.831 160553 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:21:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13325 DF PROTO=TCP SPT=35752 DPT=9882 SEQ=1833969550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7D2360000000001030307) 
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.292 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[0c48477b-0dd2-4fc7-a2cd-7b8676cec6c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.295 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[1972be4d-4812-484a-bb9b-7fe832cf6220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.310 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[db1fd844-2273-4905-8614-f065b2d1b2dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.323 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ad8c74-3e45-4b0e-83e8-7934ea0a9db1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcac49fc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:b2:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628644, 'reachable_time': 31397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 160563, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.336 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c5be5abd-97cb-4042-9d03-49f5e496d618]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbcac49fc-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628651, 'tstamp': 628651}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapbcac49fc-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628653, 'tstamp': 628653}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628654, 'tstamp': 628654}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:b28b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628644, 'tstamp': 628644}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.378 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[caedf812-a3ab-438c-a3eb-656ccd9d86be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.380 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcac49fc-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.417 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcac49fc-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.418 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.419 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbcac49fc-c0, col_values=(('external_ids', {'iface-id': '98ef2da5-f5cb-44e8-a4b2-f6178c6c8332'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.420 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:21:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.424 160439 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpbi_yr_1l/privsep.sock']
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.024 160439 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.025 160439 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbi_yr_1l/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.921 160573 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.925 160573 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.927 160573 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:12.927 160573 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160573
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.028 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[0d406e78-198b-496c-bb63-fe08f0476fd2]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.469 160573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.469 160573 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.469 160573 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:21:13 np0005532585.localdomain sshd[160578]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:21:13 np0005532585.localdomain sshd[160578]: Accepted publickey for zuul from 192.168.122.30 port 48096 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:21:13 np0005532585.localdomain systemd-logind[761]: New session 52 of user zuul.
Nov 23 09:21:13 np0005532585.localdomain systemd[1]: Started Session 52 of User zuul.
Nov 23 09:21:13 np0005532585.localdomain sshd[160578]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.925 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[f41794b4-b99c-4960-8787-6741fa1738fe]: (4, ['ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.928 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, column=external_ids, values=({'neutron:ovn-metadata-id': '1338671a-acb4-5368-872c-4c2204284319'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.929 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.930 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.943 160439 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.943 160439 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.943 160439 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.953 160439 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.955 160439 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.959 160439 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.960 160439 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.972 160439 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.972 160439 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.972 160439 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.975 160439 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.975 160439 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.975 160439 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:21:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:21:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13327 DF PROTO=TCP SPT=35752 DPT=9882 SEQ=1833969550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7DE200000000001030307) 
Nov 23 09:21:15 np0005532585.localdomain python3.9[160671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:21:17 np0005532585.localdomain sudo[160765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vywxactyscbdzjtzyfuvoumwnejnfxtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889675.9475257-63-11847725029962/AnsiballZ_command.py
Nov 23 09:21:17 np0005532585.localdomain sudo[160765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:17 np0005532585.localdomain python3.9[160767]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:17 np0005532585.localdomain sudo[160765]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53480 DF PROTO=TCP SPT=38202 DPT=9102 SEQ=3774868968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7EA200000000001030307) 
Nov 23 09:21:18 np0005532585.localdomain sudo[160870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myjptacpmzblqhxwxmmdqhqlhbchouay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889677.8986309-87-23941269097384/AnsiballZ_command.py
Nov 23 09:21:18 np0005532585.localdomain sudo[160870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:18 np0005532585.localdomain python3.9[160872]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:18 np0005532585.localdomain systemd[1]: libpod-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1.scope: Deactivated successfully.
Nov 23 09:21:18 np0005532585.localdomain podman[160873]: 2025-11-23 09:21:18.648504784 +0000 UTC m=+0.302301251 container died 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 23 09:21:18 np0005532585.localdomain systemd[1]: tmp-crun.fzYe6d.mount: Deactivated successfully.
Nov 23 09:21:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1-userdata-shm.mount: Deactivated successfully.
Nov 23 09:21:18 np0005532585.localdomain podman[160873]: 2025-11-23 09:21:18.686458789 +0000 UTC m=+0.340255166 container cleanup 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=)
Nov 23 09:21:18 np0005532585.localdomain sudo[160870]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:18 np0005532585.localdomain podman[160886]: 2025-11-23 09:21:18.733043421 +0000 UTC m=+0.078742958 container remove 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, release=1761123044)
Nov 23 09:21:18 np0005532585.localdomain systemd[1]: libpod-conmon-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1.scope: Deactivated successfully.
Nov 23 09:21:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf-merged.mount: Deactivated successfully.
Nov 23 09:21:20 np0005532585.localdomain sudo[160990]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmgwtddqkksbinmzfxslchfysgvmkbik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889679.6056263-117-13057696247914/AnsiballZ_systemd_service.py
Nov 23 09:21:20 np0005532585.localdomain sudo[160990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:20 np0005532585.localdomain python3.9[160992]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:21:20 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:21:20 np0005532585.localdomain systemd-rc-local-generator[161015]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:21:20 np0005532585.localdomain systemd-sysv-generator[161022]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:21:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:20 np0005532585.localdomain sudo[160990]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:21 np0005532585.localdomain python3.9[161118]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:21:21 np0005532585.localdomain network[161135]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:21:21 np0005532585.localdomain network[161136]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:21:21 np0005532585.localdomain network[161137]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:21:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3497 DF PROTO=TCP SPT=60650 DPT=9102 SEQ=2952448551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7FF600000000001030307) 
Nov 23 09:21:24 np0005532585.localdomain sshd[161261]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:21:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13329 DF PROTO=TCP SPT=35752 DPT=9882 SEQ=1833969550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C80E200000000001030307) 
Nov 23 09:21:27 np0005532585.localdomain sudo[161263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:21:27 np0005532585.localdomain sudo[161263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:21:27 np0005532585.localdomain sudo[161263]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:27 np0005532585.localdomain sudo[161278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:21:27 np0005532585.localdomain sudo[161278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:21:28 np0005532585.localdomain sudo[161278]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:28 np0005532585.localdomain sudo[161325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:21:28 np0005532585.localdomain sudo[161325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:21:28 np0005532585.localdomain sudo[161325]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:29 np0005532585.localdomain sudo[161415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkycpmktktprmoffzpnexedchivkabru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889689.295906-174-151371051193872/AnsiballZ_systemd_service.py
Nov 23 09:21:29 np0005532585.localdomain sudo[161415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35494 DF PROTO=TCP SPT=39434 DPT=9101 SEQ=550781359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8183A0000000001030307) 
Nov 23 09:21:29 np0005532585.localdomain python3.9[161417]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:29 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:21:30 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31760 DF PROTO=TCP SPT=56020 DPT=9105 SEQ=1694727563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C818FB0000000001030307) 
Nov 23 09:21:30 np0005532585.localdomain systemd-sysv-generator[161448]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:21:30 np0005532585.localdomain systemd-rc-local-generator[161441]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:21:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:30 np0005532585.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Nov 23 09:21:30 np0005532585.localdomain sudo[161415]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:30 np0005532585.localdomain sudo[161547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmxphgnkcnngjqyxuyhixxapgxfzvdui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889690.381207-174-252364177723419/AnsiballZ_systemd_service.py
Nov 23 09:21:30 np0005532585.localdomain sudo[161547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:30 np0005532585.localdomain python3.9[161549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:31 np0005532585.localdomain sudo[161547]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:32 np0005532585.localdomain sshd[161261]: Invalid user admin from 211.196.31.2 port 33405
Nov 23 09:21:32 np0005532585.localdomain sudo[161640]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwcbjofykllxldoaebrdgscfxitjqzlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889691.1077292-174-61695697266730/AnsiballZ_systemd_service.py
Nov 23 09:21:32 np0005532585.localdomain sudo[161640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:32 np0005532585.localdomain python3.9[161642]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:32 np0005532585.localdomain sudo[161640]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35496 DF PROTO=TCP SPT=39434 DPT=9101 SEQ=550781359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C824600000000001030307) 
Nov 23 09:21:32 np0005532585.localdomain sudo[161733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egrtmxgfoyhpdclglirapkdctqkwlyxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889692.6994855-174-248353846855191/AnsiballZ_systemd_service.py
Nov 23 09:21:32 np0005532585.localdomain sudo[161733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:33 np0005532585.localdomain sshd[161261]: Connection closed by invalid user admin 211.196.31.2 port 33405 [preauth]
Nov 23 09:21:33 np0005532585.localdomain python3.9[161735]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:33 np0005532585.localdomain sudo[161733]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:33 np0005532585.localdomain sudo[161826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwlngthzibpqrrwgsgnxtvlpjsuiqeuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889693.4199157-174-239589049536780/AnsiballZ_systemd_service.py
Nov 23 09:21:33 np0005532585.localdomain sudo[161826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:33 np0005532585.localdomain python3.9[161828]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:34 np0005532585.localdomain sudo[161826]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:34 np0005532585.localdomain sudo[161919]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmffsvjmqncmlfbcsxehwwxzdlzsrqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889694.1141427-174-28249140442893/AnsiballZ_systemd_service.py
Nov 23 09:21:34 np0005532585.localdomain sudo[161919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:34 np0005532585.localdomain python3.9[161921]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:21:34 np0005532585.localdomain sudo[161919]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:34 np0005532585.localdomain podman[161923]: 2025-11-23 09:21:34.836681041 +0000 UTC m=+0.086447188 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:21:34 np0005532585.localdomain podman[161923]: 2025-11-23 09:21:34.876562911 +0000 UTC m=+0.126329028 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 23 09:21:34 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:21:35 np0005532585.localdomain sudo[162035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yieuixtozsmqnbxybhmmuasluuhzxcwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889694.883135-174-94981936636759/AnsiballZ_systemd_service.py
Nov 23 09:21:35 np0005532585.localdomain sudo[162035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:35 np0005532585.localdomain python3.9[162037]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:21:35 np0005532585.localdomain sudo[162035]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39604 DF PROTO=TCP SPT=47358 DPT=9100 SEQ=2691335382 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C832200000000001030307) 
Nov 23 09:21:37 np0005532585.localdomain sudo[162128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqmzffkgitrqjvavonpvysfmvmgjifoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889697.1606047-331-168535373389293/AnsiballZ_file.py
Nov 23 09:21:37 np0005532585.localdomain sudo[162128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:37 np0005532585.localdomain python3.9[162130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:37 np0005532585.localdomain sudo[162128]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:38 np0005532585.localdomain sudo[162220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpbpdhndavyenxnbmdvrtgjnamsvyjhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889697.802483-331-193599161899626/AnsiballZ_file.py
Nov 23 09:21:38 np0005532585.localdomain sudo[162220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:21:38 np0005532585.localdomain systemd[1]: tmp-crun.CXIW7p.mount: Deactivated successfully.
Nov 23 09:21:38 np0005532585.localdomain podman[162223]: 2025-11-23 09:21:38.156869725 +0000 UTC m=+0.088208862 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 23 09:21:38 np0005532585.localdomain podman[162223]: 2025-11-23 09:21:38.190202341 +0000 UTC m=+0.121541498 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 09:21:38 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:21:38 np0005532585.localdomain python3.9[162222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:38 np0005532585.localdomain sudo[162220]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:38 np0005532585.localdomain sudo[162331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtvwwwcqpwiapseorrybjsiypmwnpyvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889698.3804083-331-34122886141807/AnsiballZ_file.py
Nov 23 09:21:38 np0005532585.localdomain sudo[162331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:38 np0005532585.localdomain python3.9[162333]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:38 np0005532585.localdomain sudo[162331]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:39 np0005532585.localdomain sudo[162423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxpkeronukerjfsgcgftjvieomzyhhcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889698.933938-331-219294233208741/AnsiballZ_file.py
Nov 23 09:21:39 np0005532585.localdomain sudo[162423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4052 DF PROTO=TCP SPT=40358 DPT=9100 SEQ=3390558412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C83D210000000001030307) 
Nov 23 09:21:39 np0005532585.localdomain python3.9[162425]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:39 np0005532585.localdomain sudo[162423]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:39 np0005532585.localdomain sudo[162515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgfdqpdztfgevaetszakbqlbxxalgluz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889699.4978049-331-237972066200591/AnsiballZ_file.py
Nov 23 09:21:39 np0005532585.localdomain sudo[162515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:39 np0005532585.localdomain python3.9[162517]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:39 np0005532585.localdomain sudo[162515]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:40 np0005532585.localdomain sudo[162607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byguemphpmrkyypzfncdrufzopxloiyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889700.0353787-331-247245567500570/AnsiballZ_file.py
Nov 23 09:21:40 np0005532585.localdomain sudo[162607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:40 np0005532585.localdomain python3.9[162609]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:40 np0005532585.localdomain sudo[162607]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:40 np0005532585.localdomain sudo[162699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdkohyuluwphdzygcabdjfvwejkbybxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889700.6080413-331-138835946654216/AnsiballZ_file.py
Nov 23 09:21:40 np0005532585.localdomain sudo[162699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:41 np0005532585.localdomain python3.9[162701]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:41 np0005532585.localdomain sudo[162699]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1755 DF PROTO=TCP SPT=53198 DPT=9882 SEQ=2658206013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C847660000000001030307) 
Nov 23 09:21:41 np0005532585.localdomain sudo[162791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpczqlduwzuwmobgobcglgbsullxvsya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889701.6457899-480-256939843281381/AnsiballZ_file.py
Nov 23 09:21:41 np0005532585.localdomain sudo[162791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:42 np0005532585.localdomain python3.9[162793]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:42 np0005532585.localdomain sudo[162791]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:42 np0005532585.localdomain sudo[162883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiirnyreevpfwvgabdijgvszmnxqmtop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889702.1710093-480-76627404831898/AnsiballZ_file.py
Nov 23 09:21:42 np0005532585.localdomain sudo[162883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:42 np0005532585.localdomain python3.9[162885]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:42 np0005532585.localdomain sudo[162883]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:43 np0005532585.localdomain sudo[162975]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpiusoihsrqwjrwnndskswzbvyfxabzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889702.736785-480-61495134456733/AnsiballZ_file.py
Nov 23 09:21:43 np0005532585.localdomain sudo[162975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:43 np0005532585.localdomain python3.9[162977]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:43 np0005532585.localdomain sudo[162975]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:44 np0005532585.localdomain sudo[163067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwgdytfprbkubvvifptdwsmonpmpnapv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889703.9891346-480-272830530482275/AnsiballZ_file.py
Nov 23 09:21:44 np0005532585.localdomain sudo[163067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:44 np0005532585.localdomain python3.9[163069]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:44 np0005532585.localdomain sudo[163067]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:44 np0005532585.localdomain sudo[163159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdyloyacutseahjuvxoqzgzrwbnqixda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889704.5530858-480-277746687254412/AnsiballZ_file.py
Nov 23 09:21:44 np0005532585.localdomain sudo[163159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:44 np0005532585.localdomain python3.9[163161]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:44 np0005532585.localdomain sudo[163159]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1757 DF PROTO=TCP SPT=53198 DPT=9882 SEQ=2658206013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C853600000000001030307) 
Nov 23 09:21:45 np0005532585.localdomain sudo[163251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxnmkzncibnlcmqwpeljwzyixwgwobaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889705.0164976-480-258207121853873/AnsiballZ_file.py
Nov 23 09:21:45 np0005532585.localdomain sudo[163251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:45 np0005532585.localdomain python3.9[163253]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:45 np0005532585.localdomain sudo[163251]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:46 np0005532585.localdomain sudo[163343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klnblwacvyjyujpyimhtmspuztyjpkmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889705.601383-480-94611852477895/AnsiballZ_file.py
Nov 23 09:21:46 np0005532585.localdomain sudo[163343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:46 np0005532585.localdomain python3.9[163345]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:21:46 np0005532585.localdomain sudo[163343]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:47 np0005532585.localdomain sudo[163435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibhhvaqaklymxndhttzdpceqqjtrwqcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889707.383404-634-65593389440399/AnsiballZ_command.py
Nov 23 09:21:47 np0005532585.localdomain sudo[163435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:47 np0005532585.localdomain python3.9[163437]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:47 np0005532585.localdomain sudo[163435]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3499 DF PROTO=TCP SPT=60650 DPT=9102 SEQ=2952448551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C860210000000001030307) 
Nov 23 09:21:48 np0005532585.localdomain python3.9[163529]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:21:49 np0005532585.localdomain sudo[163619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbucqbgxbjnsptjiyudokumxjndumios ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889708.8863225-687-1648166659616/AnsiballZ_systemd_service.py
Nov 23 09:21:49 np0005532585.localdomain sudo[163619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:49 np0005532585.localdomain python3.9[163621]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:21:49 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:21:49 np0005532585.localdomain systemd-sysv-generator[163651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:21:49 np0005532585.localdomain systemd-rc-local-generator[163642]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:21:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:21:49 np0005532585.localdomain sudo[163619]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:50 np0005532585.localdomain sudo[163747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceqshwwnlaudyzghyioscbcwndjgvvkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889710.0160916-711-231897771100429/AnsiballZ_command.py
Nov 23 09:21:50 np0005532585.localdomain sudo[163747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:50 np0005532585.localdomain python3.9[163749]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:50 np0005532585.localdomain sudo[163747]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:50 np0005532585.localdomain sudo[163840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygqxtynmcxowyygrhcbvmazlalxanmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889710.5433679-711-76696670363312/AnsiballZ_command.py
Nov 23 09:21:50 np0005532585.localdomain sudo[163840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:50 np0005532585.localdomain python3.9[163842]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:52 np0005532585.localdomain sudo[163840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:52 np0005532585.localdomain sudo[163933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlsplusfdtfuovbjcvmiukrgsswgrjmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889712.1013536-711-11633158391451/AnsiballZ_command.py
Nov 23 09:21:52 np0005532585.localdomain sudo[163933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:52 np0005532585.localdomain python3.9[163935]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:52 np0005532585.localdomain sudo[163933]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:52 np0005532585.localdomain sudo[164026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcsminhjznhpljsuochjpumnthkzdvjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889712.6627903-711-8053113592408/AnsiballZ_command.py
Nov 23 09:21:52 np0005532585.localdomain sudo[164026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:53 np0005532585.localdomain python3.9[164028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:53 np0005532585.localdomain sudo[164026]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:53 np0005532585.localdomain sudo[164119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjuqdebljtbkfowzpfxbazgvimylggvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889713.1641004-711-162204844808568/AnsiballZ_command.py
Nov 23 09:21:53 np0005532585.localdomain sudo[164119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55191 DF PROTO=TCP SPT=60090 DPT=9102 SEQ=1164980385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C874600000000001030307) 
Nov 23 09:21:53 np0005532585.localdomain python3.9[164121]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:53 np0005532585.localdomain sudo[164119]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:53 np0005532585.localdomain sudo[164212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aouazapeoiofksbkpnotikrxpwkajwnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889713.6592584-711-164984465962038/AnsiballZ_command.py
Nov 23 09:21:53 np0005532585.localdomain sudo[164212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:54 np0005532585.localdomain python3.9[164214]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:54 np0005532585.localdomain sudo[164212]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:54 np0005532585.localdomain sudo[164305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zljmwpzpahouyvaotkcuotyjnnxxhywa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889714.2026005-711-192439901770756/AnsiballZ_command.py
Nov 23 09:21:54 np0005532585.localdomain sudo[164305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:54 np0005532585.localdomain python3.9[164307]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:21:54 np0005532585.localdomain sudo[164305]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:57 np0005532585.localdomain sudo[164398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogrymnphbhxtjralxjfasffvlfnnphse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889716.8905473-873-122209049208536/AnsiballZ_getent.py
Nov 23 09:21:57 np0005532585.localdomain sudo[164398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1759 DF PROTO=TCP SPT=53198 DPT=9882 SEQ=2658206013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C884210000000001030307) 
Nov 23 09:21:57 np0005532585.localdomain python3.9[164400]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 23 09:21:57 np0005532585.localdomain sudo[164398]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:58 np0005532585.localdomain sudo[164491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzliartnkfenletzrszflubxbmzylzre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889717.6853638-897-3311712695064/AnsiballZ_group.py
Nov 23 09:21:58 np0005532585.localdomain sudo[164491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:58 np0005532585.localdomain python3.9[164493]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 09:21:58 np0005532585.localdomain groupadd[164494]: group added to /etc/group: name=libvirt, GID=42473
Nov 23 09:21:58 np0005532585.localdomain groupadd[164494]: group added to /etc/gshadow: name=libvirt
Nov 23 09:21:58 np0005532585.localdomain groupadd[164494]: new group: name=libvirt, GID=42473
Nov 23 09:21:58 np0005532585.localdomain sudo[164491]: pam_unix(sudo:session): session closed for user root
Nov 23 09:21:59 np0005532585.localdomain sudo[164589]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okzvdwewojyfzffjfrypkubemejdplye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889719.2536757-921-97387624610838/AnsiballZ_user.py
Nov 23 09:21:59 np0005532585.localdomain sudo[164589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:21:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26471 DF PROTO=TCP SPT=51708 DPT=9101 SEQ=510542425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C88D690000000001030307) 
Nov 23 09:21:59 np0005532585.localdomain python3.9[164591]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 09:21:59 np0005532585.localdomain useradd[164593]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Nov 23 09:21:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37785 DF PROTO=TCP SPT=56578 DPT=9105 SEQ=30648972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C88E1A0000000001030307) 
Nov 23 09:22:00 np0005532585.localdomain sudo[164589]: pam_unix(sudo:session): session closed for user root
Nov 23 09:22:00 np0005532585.localdomain sudo[164689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jplqpoxszoshjslhxgeesohfilkpdpli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889720.465589-954-90563512622933/AnsiballZ_setup.py
Nov 23 09:22:00 np0005532585.localdomain sudo[164689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:22:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:22:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b8127610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 09:22:01 np0005532585.localdomain python3.9[164691]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:22:01 np0005532585.localdomain sudo[164689]: pam_unix(sudo:session): session closed for user root
Nov 23 09:22:01 np0005532585.localdomain sudo[164743]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twuyqcfsunxzdpuowxryoqlngtsxcjdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889720.465589-954-90563512622933/AnsiballZ_dnf.py
Nov 23 09:22:01 np0005532585.localdomain sudo[164743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:22:02 np0005532585.localdomain python3.9[164745]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:22:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37787 DF PROTO=TCP SPT=56578 DPT=9105 SEQ=30648972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C89A210000000001030307) 
Nov 23 09:22:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:22:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dff610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 23 09:22:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:22:05 np0005532585.localdomain podman[164748]: 2025-11-23 09:22:05.024546282 +0000 UTC m=+0.082242767 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:22:05 np0005532585.localdomain podman[164748]: 2025-11-23 09:22:05.058211429 +0000 UTC m=+0.115907974 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 23 09:22:05 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:22:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52275 DF PROTO=TCP SPT=36808 DPT=9100 SEQ=427320919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8A6200000000001030307) 
Nov 23 09:22:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:22:09 np0005532585.localdomain podman[164838]: 2025-11-23 09:22:09.017388717 +0000 UTC m=+0.077427007 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:22:09 np0005532585.localdomain podman[164838]: 2025-11-23 09:22:09.047182814 +0000 UTC m=+0.107221124 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 23 09:22:09 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:22:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:22:09.232 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:22:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:22:09.232 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:22:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:22:09.233 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:22:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46843 DF PROTO=TCP SPT=35548 DPT=9100 SEQ=608490598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8B2600000000001030307) 
Nov 23 09:22:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28026 DF PROTO=TCP SPT=46152 DPT=9882 SEQ=2891797171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8BC960000000001030307) 
Nov 23 09:22:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28028 DF PROTO=TCP SPT=46152 DPT=9882 SEQ=2891797171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8C8A10000000001030307) 
Nov 23 09:22:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55193 DF PROTO=TCP SPT=60090 DPT=9102 SEQ=1164980385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8D4200000000001030307) 
Nov 23 09:22:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47656 DF PROTO=TCP SPT=40302 DPT=9102 SEQ=2900724902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8E9A00000000001030307) 
Nov 23 09:22:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28030 DF PROTO=TCP SPT=46152 DPT=9882 SEQ=2891797171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8F8210000000001030307) 
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  Converting 2758 SID table entries...
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:22:27 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:22:28 np0005532585.localdomain sudo[165630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:22:28 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=19 res=1
Nov 23 09:22:28 np0005532585.localdomain sudo[165630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:22:28 np0005532585.localdomain sudo[165630]: pam_unix(sudo:session): session closed for user root
Nov 23 09:22:29 np0005532585.localdomain sudo[165716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:22:29 np0005532585.localdomain sudo[165716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:22:29 np0005532585.localdomain sudo[165716]: pam_unix(sudo:session): session closed for user root
Nov 23 09:22:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29977 DF PROTO=TCP SPT=57732 DPT=9101 SEQ=168636228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9029A0000000001030307) 
Nov 23 09:22:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49170 DF PROTO=TCP SPT=58140 DPT=9105 SEQ=3084951181 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9034B0000000001030307) 
Nov 23 09:22:30 np0005532585.localdomain sudo[165964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:22:30 np0005532585.localdomain sudo[165964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:22:30 np0005532585.localdomain sudo[165964]: pam_unix(sudo:session): session closed for user root
Nov 23 09:22:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29979 DF PROTO=TCP SPT=57732 DPT=9101 SEQ=168636228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C90EA00000000001030307) 
Nov 23 09:22:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:22:36 np0005532585.localdomain systemd[1]: tmp-crun.gERxBu.mount: Deactivated successfully.
Nov 23 09:22:36 np0005532585.localdomain podman[165983]: 2025-11-23 09:22:36.069054941 +0000 UTC m=+0.106506460 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 23 09:22:36 np0005532585.localdomain podman[165983]: 2025-11-23 09:22:36.136503218 +0000 UTC m=+0.173954707 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:22:36 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:22:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4055 DF PROTO=TCP SPT=40358 DPT=9100 SEQ=3390558412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C91C210000000001030307) 
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  Converting 2761 SID table entries...
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:22:38 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:22:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25888 DF PROTO=TCP SPT=46556 DPT=9100 SEQ=2284853579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C927A00000000001030307) 
Nov 23 09:22:39 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Nov 23 09:22:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:22:40 np0005532585.localdomain podman[166014]: 2025-11-23 09:22:40.019560797 +0000 UTC m=+0.070931887 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:22:40 np0005532585.localdomain podman[166014]: 2025-11-23 09:22:40.054203131 +0000 UTC m=+0.105574241 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 23 09:22:40 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:22:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18373 DF PROTO=TCP SPT=49144 DPT=9882 SEQ=2198007924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C931C70000000001030307) 
Nov 23 09:22:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18375 DF PROTO=TCP SPT=49144 DPT=9882 SEQ=2198007924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C93DE00000000001030307) 
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  Converting 2761 SID table entries...
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:22:46 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:22:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47658 DF PROTO=TCP SPT=40302 DPT=9102 SEQ=2900724902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C94A200000000001030307) 
Nov 23 09:22:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62044 DF PROTO=TCP SPT=35598 DPT=9102 SEQ=3206288983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C95EE00000000001030307) 
Nov 23 09:22:54 np0005532585.localdomain kernel: SELinux:  Converting 2761 SID table entries...
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:22:55 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:22:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18377 DF PROTO=TCP SPT=49144 DPT=9882 SEQ=2198007924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C96E210000000001030307) 
Nov 23 09:22:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31582 DF PROTO=TCP SPT=52862 DPT=9101 SEQ=3900244220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C977C90000000001030307) 
Nov 23 09:22:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5733 DF PROTO=TCP SPT=59552 DPT=9105 SEQ=880615477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9787B0000000001030307) 
Nov 23 09:23:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31584 DF PROTO=TCP SPT=52862 DPT=9101 SEQ=3900244220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C983E00000000001030307) 
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  Converting 2761 SID table entries...
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:23:04 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:23:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46846 DF PROTO=TCP SPT=35548 DPT=9100 SEQ=608490598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C990210000000001030307) 
Nov 23 09:23:06 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Nov 23 09:23:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:23:07 np0005532585.localdomain podman[166062]: 2025-11-23 09:23:07.03483678 +0000 UTC m=+0.085263450 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:23:07 np0005532585.localdomain podman[166062]: 2025-11-23 09:23:07.107047393 +0000 UTC m=+0.157474133 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 23 09:23:07 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:23:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:23:09.233 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:23:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:23:09.233 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:23:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:23:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:23:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54931 DF PROTO=TCP SPT=53956 DPT=9100 SEQ=1081423421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C99CA00000000001030307) 
Nov 23 09:23:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:23:11 np0005532585.localdomain podman[166087]: 2025-11-23 09:23:11.027827125 +0000 UTC m=+0.085156715 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 23 09:23:11 np0005532585.localdomain podman[166087]: 2025-11-23 09:23:11.056506819 +0000 UTC m=+0.113836409 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:23:11 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:23:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19877 DF PROTO=TCP SPT=46712 DPT=9882 SEQ=2888539505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9A6F70000000001030307) 
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  Converting 2761 SID table entries...
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:23:12 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:23:13 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:23:13 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Nov 23 09:23:13 np0005532585.localdomain systemd-sysv-generator[166140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:23:13 np0005532585.localdomain systemd-rc-local-generator[166135]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:23:13 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:23:13 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:23:13 np0005532585.localdomain systemd-rc-local-generator[166176]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:23:13 np0005532585.localdomain systemd-sysv-generator[166179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:23:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:23:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19879 DF PROTO=TCP SPT=46712 DPT=9882 SEQ=2888539505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9B2E10000000001030307) 
Nov 23 09:23:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62046 DF PROTO=TCP SPT=35598 DPT=9102 SEQ=3206288983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9C0200000000001030307) 
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 09:23:22 np0005532585.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 09:23:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56073 DF PROTO=TCP SPT=42040 DPT=9102 SEQ=1421973694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9D4210000000001030307) 
Nov 23 09:23:23 np0005532585.localdomain groupadd[166201]: group added to /etc/group: name=clevis, GID=985
Nov 23 09:23:23 np0005532585.localdomain groupadd[166201]: group added to /etc/gshadow: name=clevis
Nov 23 09:23:23 np0005532585.localdomain groupadd[166201]: new group: name=clevis, GID=985
Nov 23 09:23:23 np0005532585.localdomain useradd[166208]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 23 09:23:23 np0005532585.localdomain usermod[166218]: add 'clevis' to group 'tss'
Nov 23 09:23:23 np0005532585.localdomain usermod[166218]: add 'clevis' to shadow group 'tss'
Nov 23 09:23:26 np0005532585.localdomain groupadd[166240]: group added to /etc/group: name=dnsmasq, GID=984
Nov 23 09:23:26 np0005532585.localdomain groupadd[166240]: group added to /etc/gshadow: name=dnsmasq
Nov 23 09:23:26 np0005532585.localdomain groupadd[166240]: new group: name=dnsmasq, GID=984
Nov 23 09:23:26 np0005532585.localdomain useradd[166247]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 23 09:23:26 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 09:23:26 np0005532585.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Nov 23 09:23:26 np0005532585.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 09:23:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19881 DF PROTO=TCP SPT=46712 DPT=9882 SEQ=2888539505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9E2210000000001030307) 
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Reloading rules
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Collecting garbage unconditionally...
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Finished loading, compiling and executing 5 rules
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Reloading rules
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Collecting garbage unconditionally...
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Loading rules from directory /etc/polkit-1/rules.d
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 23 09:23:28 np0005532585.localdomain polkitd[1037]: Finished loading, compiling and executing 5 rules
Nov 23 09:23:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33924 DF PROTO=TCP SPT=46538 DPT=9101 SEQ=4155092695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9ECFA0000000001030307) 
Nov 23 09:23:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32017 DF PROTO=TCP SPT=55642 DPT=9105 SEQ=149405895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9EDAB0000000001030307) 
Nov 23 09:23:30 np0005532585.localdomain sudo[166429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:23:30 np0005532585.localdomain sudo[166429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:23:30 np0005532585.localdomain sudo[166429]: pam_unix(sudo:session): session closed for user root
Nov 23 09:23:30 np0005532585.localdomain sudo[166447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:23:30 np0005532585.localdomain sudo[166447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:23:31 np0005532585.localdomain sudo[166447]: pam_unix(sudo:session): session closed for user root
Nov 23 09:23:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33926 DF PROTO=TCP SPT=46538 DPT=9101 SEQ=4155092695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9F9200000000001030307) 
Nov 23 09:23:33 np0005532585.localdomain sudo[166496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:23:33 np0005532585.localdomain sudo[166496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:23:33 np0005532585.localdomain sudo[166496]: pam_unix(sudo:session): session closed for user root
Nov 23 09:23:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25891 DF PROTO=TCP SPT=46556 DPT=9100 SEQ=2284853579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA06210000000001030307) 
Nov 23 09:23:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:23:38 np0005532585.localdomain systemd[1]: tmp-crun.MCZC0x.mount: Deactivated successfully.
Nov 23 09:23:38 np0005532585.localdomain podman[166514]: 2025-11-23 09:23:38.059188179 +0000 UTC m=+0.109371221 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 23 09:23:38 np0005532585.localdomain podman[166514]: 2025-11-23 09:23:38.172073651 +0000 UTC m=+0.222256673 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:23:38 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:23:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41340 DF PROTO=TCP SPT=56312 DPT=9100 SEQ=2314062371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA11E00000000001030307) 
Nov 23 09:23:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=183 DF PROTO=TCP SPT=35542 DPT=9882 SEQ=1332333384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA1C260000000001030307) 
Nov 23 09:23:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:23:42 np0005532585.localdomain systemd[1]: tmp-crun.fmW4Zq.mount: Deactivated successfully.
Nov 23 09:23:42 np0005532585.localdomain podman[167798]: 2025-11-23 09:23:42.035061345 +0000 UTC m=+0.081769002 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:23:42 np0005532585.localdomain podman[167798]: 2025-11-23 09:23:42.041521225 +0000 UTC m=+0.088228912 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 23 09:23:42 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:23:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=185 DF PROTO=TCP SPT=35542 DPT=9882 SEQ=1332333384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA28200000000001030307) 
Nov 23 09:23:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56075 DF PROTO=TCP SPT=42040 DPT=9102 SEQ=1421973694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA34200000000001030307) 
Nov 23 09:23:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36347 DF PROTO=TCP SPT=49520 DPT=9102 SEQ=3145602236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA49200000000001030307) 
Nov 23 09:23:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=187 DF PROTO=TCP SPT=35542 DPT=9882 SEQ=1332333384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA58200000000001030307) 
Nov 23 09:23:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25764 DF PROTO=TCP SPT=40478 DPT=9101 SEQ=2537767173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA622A0000000001030307) 
Nov 23 09:23:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5139 DF PROTO=TCP SPT=47666 DPT=9105 SEQ=3170194503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA62DB0000000001030307) 
Nov 23 09:24:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5141 DF PROTO=TCP SPT=47666 DPT=9105 SEQ=3170194503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA6EE00000000001030307) 
Nov 23 09:24:04 np0005532585.localdomain groupadd[183372]: group added to /etc/group: name=ceph, GID=167
Nov 23 09:24:04 np0005532585.localdomain groupadd[183372]: group added to /etc/gshadow: name=ceph
Nov 23 09:24:04 np0005532585.localdomain groupadd[183372]: new group: name=ceph, GID=167
Nov 23 09:24:04 np0005532585.localdomain useradd[183378]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 23 09:24:05 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54934 DF PROTO=TCP SPT=53956 DPT=9100 SEQ=1081423421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA7A200000000001030307) 
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 23 09:24:07 np0005532585.localdomain sshd[120095]: Received signal 15; terminating.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: sshd.service: Consumed 1.215s CPU time, read 32.0K from disk, written 0B to disk.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 23 09:24:07 np0005532585.localdomain sshd[184045]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:24:07 np0005532585.localdomain sshd[184045]: Server listening on 0.0.0.0 port 22.
Nov 23 09:24:07 np0005532585.localdomain sshd[184045]: Server listening on :: port 22.
Nov 23 09:24:07 np0005532585.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:24:08 np0005532585.localdomain podman[184073]: 2025-11-23 09:24:08.3170077 +0000 UTC m=+0.086887140 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain podman[184073]: 2025-11-23 09:24:08.435432161 +0000 UTC m=+0.205311631 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:08 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:24:09.234 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:24:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:24:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:24:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:24:09.236 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28877 DF PROTO=TCP SPT=55906 DPT=9100 SEQ=14938984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA87200000000001030307) 
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 09:24:09 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:10 np0005532585.localdomain systemd-sysv-generator[184304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:10 np0005532585.localdomain systemd-rc-local-generator[184298]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 09:24:10 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 09:24:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64444 DF PROTO=TCP SPT=58034 DPT=9882 SEQ=1980338375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA91570000000001030307) 
Nov 23 09:24:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:24:12 np0005532585.localdomain systemd[1]: tmp-crun.hra9Ch.mount: Deactivated successfully.
Nov 23 09:24:12 np0005532585.localdomain podman[187632]: 2025-11-23 09:24:12.5482311 +0000 UTC m=+0.100654795 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 09:24:12 np0005532585.localdomain podman[187632]: 2025-11-23 09:24:12.585484058 +0000 UTC m=+0.137907753 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:24:12 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:24:13 np0005532585.localdomain sudo[164743]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64446 DF PROTO=TCP SPT=58034 DPT=9882 SEQ=1980338375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA9D600000000001030307) 
Nov 23 09:24:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36349 DF PROTO=TCP SPT=49520 DPT=9102 SEQ=3145602236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAAA210000000001030307) 
Nov 23 09:24:21 np0005532585.localdomain sudo[192899]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anypisntayaljfbxkzfbepvgjyjgqcxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889860.5654926-990-13552589728392/AnsiballZ_systemd.py
Nov 23 09:24:21 np0005532585.localdomain sudo[192899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Consumed 13.822s CPU time.
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: run-rfd34d12b1e6647b585e2c3a7a1d21fe5.service: Deactivated successfully.
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: run-r90e79007a5d8497ca59a5df0a1fbebd9.service: Deactivated successfully.
Nov 23 09:24:21 np0005532585.localdomain python3.9[192901]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:21 np0005532585.localdomain systemd-rc-local-generator[192931]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:21 np0005532585.localdomain systemd-sysv-generator[192935]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:21 np0005532585.localdomain sudo[192899]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:22 np0005532585.localdomain sudo[193049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-surhumlytsczzkccqqfnacqastkqygbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889861.9681342-990-172801758175749/AnsiballZ_systemd.py
Nov 23 09:24:22 np0005532585.localdomain sudo[193049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:22 np0005532585.localdomain python3.9[193051]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:22 np0005532585.localdomain systemd-rc-local-generator[193077]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:22 np0005532585.localdomain systemd-sysv-generator[193084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:22 np0005532585.localdomain sudo[193049]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27512 DF PROTO=TCP SPT=60458 DPT=9102 SEQ=2990301430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CABE600000000001030307) 
Nov 23 09:24:24 np0005532585.localdomain sudo[193198]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vstizghkosobbxmgjnbctatdbwpqljcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889863.6813722-990-7716968002733/AnsiballZ_systemd.py
Nov 23 09:24:24 np0005532585.localdomain sudo[193198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:24 np0005532585.localdomain python3.9[193200]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:24 np0005532585.localdomain systemd-rc-local-generator[193225]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:24 np0005532585.localdomain systemd-sysv-generator[193228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:24 np0005532585.localdomain sudo[193198]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:25 np0005532585.localdomain sudo[193347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghzpqakptfuckkuvljvjkrsxtqdoatlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889864.9439638-990-4828104233671/AnsiballZ_systemd.py
Nov 23 09:24:25 np0005532585.localdomain sudo[193347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:25 np0005532585.localdomain python3.9[193349]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:26 np0005532585.localdomain systemd-sysv-generator[193383]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:26 np0005532585.localdomain systemd-rc-local-generator[193380]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:26 np0005532585.localdomain sudo[193347]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64448 DF PROTO=TCP SPT=58034 DPT=9882 SEQ=1980338375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CACE200000000001030307) 
Nov 23 09:24:27 np0005532585.localdomain sudo[193496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgjdyirhhrfksfyllmhhypdmecoobmnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889867.4749072-1077-56634745227691/AnsiballZ_systemd.py
Nov 23 09:24:27 np0005532585.localdomain sudo[193496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:28 np0005532585.localdomain python3.9[193498]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:28 np0005532585.localdomain systemd-rc-local-generator[193526]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:28 np0005532585.localdomain systemd-sysv-generator[193532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:28 np0005532585.localdomain sudo[193496]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:28 np0005532585.localdomain sudo[193645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpkpauwpmfsirwllfchuutpzbgemqbte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889868.534921-1077-113890016418370/AnsiballZ_systemd.py
Nov 23 09:24:28 np0005532585.localdomain sudo[193645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:29 np0005532585.localdomain python3.9[193647]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:29 np0005532585.localdomain systemd-sysv-generator[193676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:29 np0005532585.localdomain systemd-rc-local-generator[193672]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:29 np0005532585.localdomain sudo[193645]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38193 DF PROTO=TCP SPT=34058 DPT=9101 SEQ=1688640829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAD7590000000001030307) 
Nov 23 09:24:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34530 DF PROTO=TCP SPT=39396 DPT=9105 SEQ=3013725068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAD80B0000000001030307) 
Nov 23 09:24:30 np0005532585.localdomain sudo[193794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxmonlwrpevfyckbnsgqbnhtjsvmayvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889869.825553-1077-189264143097561/AnsiballZ_systemd.py
Nov 23 09:24:30 np0005532585.localdomain sudo[193794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:30 np0005532585.localdomain python3.9[193796]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:30 np0005532585.localdomain systemd-rc-local-generator[193823]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:30 np0005532585.localdomain systemd-sysv-generator[193828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:30 np0005532585.localdomain sudo[193794]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:31 np0005532585.localdomain sudo[193943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgrlzttksmuxttwndsrkrmpymjdsmvjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889870.9092586-1077-19829767281413/AnsiballZ_systemd.py
Nov 23 09:24:31 np0005532585.localdomain sudo[193943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:31 np0005532585.localdomain python3.9[193945]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:31 np0005532585.localdomain sudo[193943]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:31 np0005532585.localdomain sudo[194056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqprrcuirosikjblkyscnsqupeyqaywn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889871.6282096-1077-54273217243131/AnsiballZ_systemd.py
Nov 23 09:24:31 np0005532585.localdomain sudo[194056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:32 np0005532585.localdomain python3.9[194058]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:32 np0005532585.localdomain systemd-sysv-generator[194087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:32 np0005532585.localdomain systemd-rc-local-generator[194084]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:32 np0005532585.localdomain sudo[194056]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38195 DF PROTO=TCP SPT=34058 DPT=9101 SEQ=1688640829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAE3600000000001030307) 
Nov 23 09:24:33 np0005532585.localdomain sudo[194205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prftqqxowygtmjrcpzngekpclwpsfipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889873.0086272-1185-270283593001939/AnsiballZ_systemd.py
Nov 23 09:24:33 np0005532585.localdomain sudo[194205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:33 np0005532585.localdomain python3.9[194207]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:24:33 np0005532585.localdomain sudo[194211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:24:33 np0005532585.localdomain systemd-rc-local-generator[194251]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:24:33 np0005532585.localdomain systemd-sysv-generator[194256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:24:33 np0005532585.localdomain sudo[194211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:24:33 np0005532585.localdomain sudo[194211]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:34 np0005532585.localdomain sudo[194205]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:34 np0005532585.localdomain sudo[194265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:24:34 np0005532585.localdomain sudo[194265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:24:34 np0005532585.localdomain sudo[194404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqeqwsjsszbobhamsdsqnxrqzenleczb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889874.2378192-1209-195744993877402/AnsiballZ_systemd.py
Nov 23 09:24:34 np0005532585.localdomain sudo[194404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:34 np0005532585.localdomain sudo[194265]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:34 np0005532585.localdomain python3.9[194409]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:34 np0005532585.localdomain sudo[194404]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:35 np0005532585.localdomain sudo[194482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:24:35 np0005532585.localdomain sudo[194482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:24:35 np0005532585.localdomain sudo[194482]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:36 np0005532585.localdomain sudo[194552]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dssizylkiminhewxrvrqabsiggfabjma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889875.0280044-1209-99978762086710/AnsiballZ_systemd.py
Nov 23 09:24:36 np0005532585.localdomain sudo[194552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41343 DF PROTO=TCP SPT=56312 DPT=9100 SEQ=2314062371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAF0210000000001030307) 
Nov 23 09:24:36 np0005532585.localdomain python3.9[194554]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:36 np0005532585.localdomain sudo[194552]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:36 np0005532585.localdomain sudo[194665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwfmzlrwtnlhzunzttvkzakjoktbxjlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889876.6613555-1209-270987093166008/AnsiballZ_systemd.py
Nov 23 09:24:36 np0005532585.localdomain sudo[194665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:37 np0005532585.localdomain python3.9[194667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:38 np0005532585.localdomain sudo[194665]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:38 np0005532585.localdomain sudo[194778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxwdpmxbrhqrybqumdmxwfetvgaalvcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889878.421619-1209-239809841756669/AnsiballZ_systemd.py
Nov 23 09:24:38 np0005532585.localdomain sudo[194778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:24:38 np0005532585.localdomain systemd[1]: tmp-crun.rbHpv5.mount: Deactivated successfully.
Nov 23 09:24:38 np0005532585.localdomain podman[194780]: 2025-11-23 09:24:38.787918201 +0000 UTC m=+0.097776246 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:24:38 np0005532585.localdomain podman[194780]: 2025-11-23 09:24:38.864195592 +0000 UTC m=+0.174053677 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:24:38 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:24:38 np0005532585.localdomain python3.9[194781]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:39 np0005532585.localdomain sudo[194778]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61661 DF PROTO=TCP SPT=44452 DPT=9100 SEQ=3680421593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAFC600000000001030307) 
Nov 23 09:24:39 np0005532585.localdomain sudo[194917]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qocvxvriegqansboblqcjmnbjsnbnole ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889879.151492-1209-256605339370663/AnsiballZ_systemd.py
Nov 23 09:24:39 np0005532585.localdomain sudo[194917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:39 np0005532585.localdomain python3.9[194919]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:39 np0005532585.localdomain sudo[194917]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:40 np0005532585.localdomain sudo[195030]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qschcdkuuwfvpyzzkvbnuypxbxmlcnti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889879.9876058-1209-161689639244708/AnsiballZ_systemd.py
Nov 23 09:24:40 np0005532585.localdomain sudo[195030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:40 np0005532585.localdomain python3.9[195032]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:40 np0005532585.localdomain sudo[195030]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:40 np0005532585.localdomain sudo[195143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svgbxajjyyllzmxcnryntbhrhphxqatb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889880.7249038-1209-145052066081591/AnsiballZ_systemd.py
Nov 23 09:24:40 np0005532585.localdomain sudo[195143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:41 np0005532585.localdomain python3.9[195145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63862 DF PROTO=TCP SPT=40028 DPT=9882 SEQ=4134728545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB06860000000001030307) 
Nov 23 09:24:42 np0005532585.localdomain sudo[195143]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:42 np0005532585.localdomain sudo[195256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzykqiflxbagmsofnveocdwklywxieyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889882.4766855-1209-56441375156335/AnsiballZ_systemd.py
Nov 23 09:24:42 np0005532585.localdomain sudo[195256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:24:42 np0005532585.localdomain podman[195259]: 2025-11-23 09:24:42.841444171 +0000 UTC m=+0.084099074 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:24:42 np0005532585.localdomain podman[195259]: 2025-11-23 09:24:42.87545951 +0000 UTC m=+0.118114453 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:24:42 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:24:43 np0005532585.localdomain python3.9[195258]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:43 np0005532585.localdomain sudo[195256]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:43 np0005532585.localdomain sudo[195388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgdklhmqdhejulhimwavwgfvoxiqwuod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889883.2377942-1209-107243806371804/AnsiballZ_systemd.py
Nov 23 09:24:43 np0005532585.localdomain sudo[195388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:43 np0005532585.localdomain python3.9[195390]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:44 np0005532585.localdomain sudo[195388]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63864 DF PROTO=TCP SPT=40028 DPT=9882 SEQ=4134728545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB12A10000000001030307) 
Nov 23 09:24:45 np0005532585.localdomain sudo[195501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhojuadverrsiyujqxdjczsiseoxqdry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889885.0194588-1209-57643806679865/AnsiballZ_systemd.py
Nov 23 09:24:45 np0005532585.localdomain sudo[195501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:45 np0005532585.localdomain python3.9[195503]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:45 np0005532585.localdomain sudo[195501]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:46 np0005532585.localdomain sudo[195614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpokslossvmkwprivngoeazzrpgtyprs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889885.7588305-1209-187575731743538/AnsiballZ_systemd.py
Nov 23 09:24:46 np0005532585.localdomain sudo[195614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:46 np0005532585.localdomain python3.9[195616]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:46 np0005532585.localdomain sudo[195614]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:46 np0005532585.localdomain sudo[195727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipkoapzazmstpepsqozqjjmyalktyesp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889886.4848404-1209-234257146746194/AnsiballZ_systemd.py
Nov 23 09:24:46 np0005532585.localdomain sudo[195727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:47 np0005532585.localdomain python3.9[195729]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:47 np0005532585.localdomain sudo[195727]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:47 np0005532585.localdomain sudo[195840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cppsswleftvbhzmptykhrpcesxycjctx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889887.2561684-1209-188383527687354/AnsiballZ_systemd.py
Nov 23 09:24:47 np0005532585.localdomain sudo[195840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:47 np0005532585.localdomain python3.9[195842]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:47 np0005532585.localdomain sudo[195840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27514 DF PROTO=TCP SPT=60458 DPT=9102 SEQ=2990301430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB1E200000000001030307) 
Nov 23 09:24:48 np0005532585.localdomain sudo[195953]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnjnzyqalottqcevkakykrvfbjoefgdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889887.9782789-1209-49562711211930/AnsiballZ_systemd.py
Nov 23 09:24:48 np0005532585.localdomain sudo[195953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:49 np0005532585.localdomain python3.9[195955]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 09:24:49 np0005532585.localdomain sudo[195953]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:51 np0005532585.localdomain sudo[196066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-matgorxlmtwkuzjzblbikctwrjbphclz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889891.6993935-1515-205910065940178/AnsiballZ_file.py
Nov 23 09:24:51 np0005532585.localdomain sudo[196066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:52 np0005532585.localdomain python3.9[196068]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:24:52 np0005532585.localdomain sudo[196066]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:52 np0005532585.localdomain sudo[196176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuhsxcjbrsbdjznmrqkbrlveanhzbsdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889892.2855024-1515-27494891502799/AnsiballZ_file.py
Nov 23 09:24:52 np0005532585.localdomain sudo[196176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:52 np0005532585.localdomain python3.9[196178]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:24:52 np0005532585.localdomain sudo[196176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:53 np0005532585.localdomain sudo[196286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xokecgrqwizglxanjzufwsbewentkavw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889892.8829825-1515-116472599983487/AnsiballZ_file.py
Nov 23 09:24:53 np0005532585.localdomain sudo[196286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:53 np0005532585.localdomain python3.9[196288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:24:53 np0005532585.localdomain sudo[196286]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59770 DF PROTO=TCP SPT=42568 DPT=9102 SEQ=694892868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB33A10000000001030307) 
Nov 23 09:24:53 np0005532585.localdomain sudo[196396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiizvakzgsqejizpcsvpriozsbydjgee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889893.5002463-1515-22733762325047/AnsiballZ_file.py
Nov 23 09:24:53 np0005532585.localdomain sudo[196396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:53 np0005532585.localdomain python3.9[196398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:24:53 np0005532585.localdomain sudo[196396]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:54 np0005532585.localdomain sudo[196506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keewwiczspvbnvimstjqzihrnyvrbzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889894.1008275-1515-255462335064780/AnsiballZ_file.py
Nov 23 09:24:54 np0005532585.localdomain sudo[196506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:54 np0005532585.localdomain python3.9[196508]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:24:54 np0005532585.localdomain sudo[196506]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:54 np0005532585.localdomain sudo[196616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aucgeehjmmkyvwpoysipjbnqpamsakdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889894.7248676-1515-177029276215649/AnsiballZ_file.py
Nov 23 09:24:54 np0005532585.localdomain sudo[196616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:55 np0005532585.localdomain python3.9[196618]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:24:55 np0005532585.localdomain sudo[196616]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:55 np0005532585.localdomain sudo[196726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcwumxviaahxdoixicybvtwlpzdevijl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889895.512462-1644-192067208845909/AnsiballZ_stat.py
Nov 23 09:24:55 np0005532585.localdomain sudo[196726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:56 np0005532585.localdomain python3.9[196728]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:24:56 np0005532585.localdomain sudo[196726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:56 np0005532585.localdomain sudo[196816]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upljrqnspejrvbcjnfjrbgqpfznexifx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889895.512462-1644-192067208845909/AnsiballZ_copy.py
Nov 23 09:24:56 np0005532585.localdomain sudo[196816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:56 np0005532585.localdomain python3.9[196818]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889895.512462-1644-192067208845909/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:24:56 np0005532585.localdomain sudo[196816]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63866 DF PROTO=TCP SPT=40028 DPT=9882 SEQ=4134728545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB42200000000001030307) 
Nov 23 09:24:57 np0005532585.localdomain sudo[196926]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adwonlvsekeiiufocskpwlptpofbkukk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889897.0026946-1644-165844142534658/AnsiballZ_stat.py
Nov 23 09:24:57 np0005532585.localdomain sudo[196926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:57 np0005532585.localdomain python3.9[196928]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:24:57 np0005532585.localdomain sudo[196926]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:57 np0005532585.localdomain sudo[197016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbgvilddquhgueirtqmjvakwuznlpfoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889897.0026946-1644-165844142534658/AnsiballZ_copy.py
Nov 23 09:24:57 np0005532585.localdomain sudo[197016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:58 np0005532585.localdomain python3.9[197018]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889897.0026946-1644-165844142534658/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:24:58 np0005532585.localdomain sudo[197016]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:58 np0005532585.localdomain sudo[197126]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drkowvizsvwxhffvgpexiejivyraqhqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889898.2616239-1644-132602676044445/AnsiballZ_stat.py
Nov 23 09:24:58 np0005532585.localdomain sudo[197126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:58 np0005532585.localdomain python3.9[197128]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:24:58 np0005532585.localdomain sudo[197126]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:59 np0005532585.localdomain sudo[197216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghuzusvwhvthpvptrkmhopqxsibtftzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889898.2616239-1644-132602676044445/AnsiballZ_copy.py
Nov 23 09:24:59 np0005532585.localdomain sudo[197216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:59 np0005532585.localdomain python3.9[197218]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889898.2616239-1644-132602676044445/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:24:59 np0005532585.localdomain sudo[197216]: pam_unix(sudo:session): session closed for user root
Nov 23 09:24:59 np0005532585.localdomain sudo[197326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhdiurdabvchqcitooykajmexyeozryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889899.4260101-1644-180176614613347/AnsiballZ_stat.py
Nov 23 09:24:59 np0005532585.localdomain sudo[197326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:24:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54055 DF PROTO=TCP SPT=34536 DPT=9101 SEQ=3284786878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB4C8A0000000001030307) 
Nov 23 09:24:59 np0005532585.localdomain python3.9[197328]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:24:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45780 DF PROTO=TCP SPT=48102 DPT=9105 SEQ=4169266320 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB4D3B0000000001030307) 
Nov 23 09:25:00 np0005532585.localdomain sudo[197326]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:01 np0005532585.localdomain sudo[197416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylxsassodyshrrzxwbbqabvkjnrfdgvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889899.4260101-1644-180176614613347/AnsiballZ_copy.py
Nov 23 09:25:01 np0005532585.localdomain sudo[197416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:01 np0005532585.localdomain python3.9[197418]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889899.4260101-1644-180176614613347/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:01 np0005532585.localdomain sudo[197416]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:02 np0005532585.localdomain sudo[197526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otitgefkhaiwonfiobqeqsadohzlpapk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889901.5670555-1644-13305777703161/AnsiballZ_stat.py
Nov 23 09:25:02 np0005532585.localdomain sudo[197526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:02 np0005532585.localdomain python3.9[197528]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:02 np0005532585.localdomain sudo[197526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54057 DF PROTO=TCP SPT=34536 DPT=9101 SEQ=3284786878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB58A00000000001030307) 
Nov 23 09:25:03 np0005532585.localdomain sudo[197616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgniimzymdlxoeblkvgkpyqtiyocnlnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889901.5670555-1644-13305777703161/AnsiballZ_copy.py
Nov 23 09:25:03 np0005532585.localdomain sudo[197616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:03 np0005532585.localdomain python3.9[197618]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889901.5670555-1644-13305777703161/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:03 np0005532585.localdomain sudo[197616]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:03 np0005532585.localdomain sudo[197726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxabnptlthbpdgoqalfylnorjhauzcml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889903.4291651-1644-231606674667162/AnsiballZ_stat.py
Nov 23 09:25:03 np0005532585.localdomain sudo[197726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:03 np0005532585.localdomain python3.9[197728]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:03 np0005532585.localdomain sudo[197726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:04 np0005532585.localdomain sudo[197816]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmpemulbqmlrdrlrklobmhzofqocguso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889903.4291651-1644-231606674667162/AnsiballZ_copy.py
Nov 23 09:25:04 np0005532585.localdomain sudo[197816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:04 np0005532585.localdomain python3.9[197818]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889903.4291651-1644-231606674667162/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:04 np0005532585.localdomain sudo[197816]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:05 np0005532585.localdomain sudo[197926]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eybsmikzlbohxncljnugaplkrqrhwmtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889905.0879934-1644-129158892957846/AnsiballZ_stat.py
Nov 23 09:25:05 np0005532585.localdomain sudo[197926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:05 np0005532585.localdomain python3.9[197928]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:05 np0005532585.localdomain sudo[197926]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:05 np0005532585.localdomain sudo[198014]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivhkqsksghhbcfjogzpwzleiibsavxjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889905.0879934-1644-129158892957846/AnsiballZ_copy.py
Nov 23 09:25:05 np0005532585.localdomain sudo[198014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:06 np0005532585.localdomain python3.9[198016]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889905.0879934-1644-129158892957846/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:06 np0005532585.localdomain sudo[198014]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28880 DF PROTO=TCP SPT=55906 DPT=9100 SEQ=14938984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB66210000000001030307) 
Nov 23 09:25:06 np0005532585.localdomain sudo[198124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgumtqsvlesxatayiozoumqmkmebmsrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889906.1396778-1644-139627550053092/AnsiballZ_stat.py
Nov 23 09:25:06 np0005532585.localdomain sudo[198124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:06 np0005532585.localdomain python3.9[198126]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:06 np0005532585.localdomain sudo[198124]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:06 np0005532585.localdomain sudo[198214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgaqvmzvxtoqxbjqylnashiykgszbooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889906.1396778-1644-139627550053092/AnsiballZ_copy.py
Nov 23 09:25:06 np0005532585.localdomain sudo[198214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:07 np0005532585.localdomain python3.9[198216]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889906.1396778-1644-139627550053092/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:07 np0005532585.localdomain sudo[198214]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:07 np0005532585.localdomain sudo[198324]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvgypkjnojghddtsytjpbhgsxnjpletu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889907.4427216-1986-198125866579309/AnsiballZ_file.py
Nov 23 09:25:07 np0005532585.localdomain sudo[198324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:07 np0005532585.localdomain python3.9[198326]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:07 np0005532585.localdomain sudo[198324]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:08 np0005532585.localdomain sudo[198434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hshifibyeywlvcxcrvqsabsmpgohcfya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889908.1336527-2010-146479908019682/AnsiballZ_file.py
Nov 23 09:25:08 np0005532585.localdomain sudo[198434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:08 np0005532585.localdomain python3.9[198436]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:08 np0005532585.localdomain sudo[198434]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:25:09 np0005532585.localdomain systemd[1]: tmp-crun.cpQIjq.mount: Deactivated successfully.
Nov 23 09:25:09 np0005532585.localdomain podman[198508]: 2025-11-23 09:25:09.045121762 +0000 UTC m=+0.098267469 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:25:09 np0005532585.localdomain sudo[198562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uftbnqgytamfqqugmpvuhvywrgtgayio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889908.7806478-2010-78665643116467/AnsiballZ_file.py
Nov 23 09:25:09 np0005532585.localdomain sudo[198562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:09 np0005532585.localdomain podman[198508]: 2025-11-23 09:25:09.114313889 +0000 UTC m=+0.167459596 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:25:09 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:25:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:25:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:25:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:25:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:25:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:25:09.237 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:25:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48974 DF PROTO=TCP SPT=35424 DPT=9100 SEQ=3493553969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB71600000000001030307) 
Nov 23 09:25:09 np0005532585.localdomain python3.9[198567]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:09 np0005532585.localdomain sudo[198562]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:09 np0005532585.localdomain sudo[198680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjakrmbhaookqyqagzudtckcxhuqatws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889909.4232385-2010-233485547315968/AnsiballZ_file.py
Nov 23 09:25:09 np0005532585.localdomain sudo[198680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:09 np0005532585.localdomain python3.9[198682]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:09 np0005532585.localdomain sudo[198680]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:10 np0005532585.localdomain sudo[198790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwwllwtuispxxvtplejpupphcxivxltu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889910.1007423-2010-183194384216058/AnsiballZ_file.py
Nov 23 09:25:10 np0005532585.localdomain sudo[198790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:10 np0005532585.localdomain python3.9[198792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:10 np0005532585.localdomain sudo[198790]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:10 np0005532585.localdomain sudo[198900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnhdbeembtrfdjmkcooxaeeyxcktogkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889910.7082238-2010-190251333924290/AnsiballZ_file.py
Nov 23 09:25:10 np0005532585.localdomain sudo[198900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:11 np0005532585.localdomain python3.9[198902]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:11 np0005532585.localdomain sudo[198900]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:11 np0005532585.localdomain sudo[199010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sndcbxjfxillagfymfzuaiilzphmczgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889911.3038952-2010-165991886961817/AnsiballZ_file.py
Nov 23 09:25:11 np0005532585.localdomain sudo[199010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:11 np0005532585.localdomain python3.9[199012]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:11 np0005532585.localdomain sudo[199010]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64472 DF PROTO=TCP SPT=52564 DPT=9882 SEQ=2627936758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB7BB70000000001030307) 
Nov 23 09:25:12 np0005532585.localdomain sudo[199120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quzynvcikkqmqlbmedcvluqjnuzjywvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889911.9252546-2010-130924342646831/AnsiballZ_file.py
Nov 23 09:25:12 np0005532585.localdomain sudo[199120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:12 np0005532585.localdomain python3.9[199122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:12 np0005532585.localdomain sudo[199120]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:12 np0005532585.localdomain sudo[199230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abhbpgkkrquplgoowtdwwjsxetazuado ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889912.604202-2010-57585998999084/AnsiballZ_file.py
Nov 23 09:25:12 np0005532585.localdomain sudo[199230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:25:13 np0005532585.localdomain systemd[1]: tmp-crun.BUgRhf.mount: Deactivated successfully.
Nov 23 09:25:13 np0005532585.localdomain podman[199233]: 2025-11-23 09:25:13.022073822 +0000 UTC m=+0.081892956 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 09:25:13 np0005532585.localdomain podman[199233]: 2025-11-23 09:25:13.031322218 +0000 UTC m=+0.091141342 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 09:25:13 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:25:13 np0005532585.localdomain python3.9[199232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:13 np0005532585.localdomain sudo[199230]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:13 np0005532585.localdomain sudo[199357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpktxbuqhppqyikirkgvptgodtygvtjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889913.2017689-2010-156141534814172/AnsiballZ_file.py
Nov 23 09:25:13 np0005532585.localdomain sudo[199357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:13 np0005532585.localdomain python3.9[199359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:13 np0005532585.localdomain sudo[199357]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:14 np0005532585.localdomain sudo[199467]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmosvrtlgntsoqwairozarlolflvqwpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889913.7793915-2010-180197957541399/AnsiballZ_file.py
Nov 23 09:25:14 np0005532585.localdomain sudo[199467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:14 np0005532585.localdomain python3.9[199469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:14 np0005532585.localdomain sudo[199467]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:14 np0005532585.localdomain sudo[199577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzviiniokfvrfeyzixcruwqovhhsxudz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889914.3138192-2010-80490047313840/AnsiballZ_file.py
Nov 23 09:25:14 np0005532585.localdomain sudo[199577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:14 np0005532585.localdomain python3.9[199579]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:14 np0005532585.localdomain sudo[199577]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64474 DF PROTO=TCP SPT=52564 DPT=9882 SEQ=2627936758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB87A00000000001030307) 
Nov 23 09:25:15 np0005532585.localdomain sudo[199687]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imiaewuueqdcvcpcudrxfuvwlccltfyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889914.9202263-2010-124417370827721/AnsiballZ_file.py
Nov 23 09:25:15 np0005532585.localdomain sudo[199687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:15 np0005532585.localdomain python3.9[199689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:15 np0005532585.localdomain sudo[199687]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:15 np0005532585.localdomain sudo[199797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chzdhfaxbsxktqdudqrqzrvzhzjqnqyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889915.4925456-2010-47602848662165/AnsiballZ_file.py
Nov 23 09:25:15 np0005532585.localdomain sudo[199797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:15 np0005532585.localdomain python3.9[199799]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:15 np0005532585.localdomain sudo[199797]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:16 np0005532585.localdomain sudo[199907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdqvdmwljicpqfqzanpfrxlmthujxfcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889916.1139226-2010-128414327012426/AnsiballZ_file.py
Nov 23 09:25:16 np0005532585.localdomain sudo[199907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:16 np0005532585.localdomain python3.9[199909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:16 np0005532585.localdomain sudo[199907]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59772 DF PROTO=TCP SPT=42568 DPT=9102 SEQ=694892868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB94200000000001030307) 
Nov 23 09:25:18 np0005532585.localdomain sudo[200017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aexmwzjbkvxosavftrpwsrxjigjvhyye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889918.3684776-2307-174594629548743/AnsiballZ_stat.py
Nov 23 09:25:18 np0005532585.localdomain sudo[200017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:18 np0005532585.localdomain python3.9[200019]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:18 np0005532585.localdomain sudo[200017]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:19 np0005532585.localdomain sudo[200105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spjsuvfuczjzyiczqotsdwxdtbltvzdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889918.3684776-2307-174594629548743/AnsiballZ_copy.py
Nov 23 09:25:19 np0005532585.localdomain sudo[200105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:19 np0005532585.localdomain python3.9[200107]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889918.3684776-2307-174594629548743/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:19 np0005532585.localdomain sudo[200105]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:19 np0005532585.localdomain sudo[200215]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khrfbvdewcqpnnpxpturbiugutsxotpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889919.5529187-2307-91126729994897/AnsiballZ_stat.py
Nov 23 09:25:19 np0005532585.localdomain sudo[200215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:20 np0005532585.localdomain python3.9[200217]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:20 np0005532585.localdomain sudo[200215]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:20 np0005532585.localdomain sudo[200303]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjvqdssfsfmoyrimhnjrfzfxzowruliv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889919.5529187-2307-91126729994897/AnsiballZ_copy.py
Nov 23 09:25:20 np0005532585.localdomain sudo[200303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:21 np0005532585.localdomain python3.9[200305]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889919.5529187-2307-91126729994897/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:21 np0005532585.localdomain sudo[200303]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:21 np0005532585.localdomain sudo[200413]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hylrmgvhiukbklezmaunqgjezxijadbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889921.321161-2307-96917649562328/AnsiballZ_stat.py
Nov 23 09:25:21 np0005532585.localdomain sudo[200413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:21 np0005532585.localdomain python3.9[200415]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:21 np0005532585.localdomain sudo[200413]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:22 np0005532585.localdomain sudo[200501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-novgvcapebyhwlzdaupszryhucccvack ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889921.321161-2307-96917649562328/AnsiballZ_copy.py
Nov 23 09:25:22 np0005532585.localdomain sudo[200501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:22 np0005532585.localdomain python3.9[200503]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889921.321161-2307-96917649562328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:22 np0005532585.localdomain sudo[200501]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:22 np0005532585.localdomain sudo[200611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvpbmwvqsmiqmgvwqnfvsxprvmmvlpvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889922.4447055-2307-86066353550641/AnsiballZ_stat.py
Nov 23 09:25:22 np0005532585.localdomain sudo[200611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:22 np0005532585.localdomain python3.9[200613]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:22 np0005532585.localdomain sudo[200611]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:23 np0005532585.localdomain sudo[200699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjjaaryvcnwtwyyubmacbkkfhmgpwyoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889922.4447055-2307-86066353550641/AnsiballZ_copy.py
Nov 23 09:25:23 np0005532585.localdomain sudo[200699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43889 DF PROTO=TCP SPT=35166 DPT=9102 SEQ=1640571255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBA8E00000000001030307) 
Nov 23 09:25:23 np0005532585.localdomain python3.9[200701]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889922.4447055-2307-86066353550641/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:23 np0005532585.localdomain sudo[200699]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:23 np0005532585.localdomain sudo[200809]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qngbxsizgqfdklvnsvnhxreryqmnrfxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889923.5928783-2307-53087487242397/AnsiballZ_stat.py
Nov 23 09:25:23 np0005532585.localdomain sudo[200809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:24 np0005532585.localdomain python3.9[200811]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:24 np0005532585.localdomain sudo[200809]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:24 np0005532585.localdomain sudo[200897]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugbldqlcakiuuaunqnwfmurtgjjsfqwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889923.5928783-2307-53087487242397/AnsiballZ_copy.py
Nov 23 09:25:24 np0005532585.localdomain sudo[200897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:24 np0005532585.localdomain python3.9[200899]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889923.5928783-2307-53087487242397/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:24 np0005532585.localdomain sudo[200897]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:25 np0005532585.localdomain sudo[201007]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imzulosfnsxkhfohxlvdunrpfuhvkvos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889924.782969-2307-79215386343755/AnsiballZ_stat.py
Nov 23 09:25:25 np0005532585.localdomain sudo[201007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:25 np0005532585.localdomain python3.9[201009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:25 np0005532585.localdomain sudo[201007]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:25 np0005532585.localdomain sudo[201095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpoofvkujiqpypslaopjdszregmeqwzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889924.782969-2307-79215386343755/AnsiballZ_copy.py
Nov 23 09:25:25 np0005532585.localdomain sudo[201095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:25 np0005532585.localdomain python3.9[201097]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889924.782969-2307-79215386343755/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:25 np0005532585.localdomain sudo[201095]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:26 np0005532585.localdomain sudo[201205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iudbanxmnknvvqkzxflkxhgizstkmeit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889925.9307976-2307-23444864604633/AnsiballZ_stat.py
Nov 23 09:25:26 np0005532585.localdomain sudo[201205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:26 np0005532585.localdomain python3.9[201207]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:26 np0005532585.localdomain sudo[201205]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:26 np0005532585.localdomain sudo[201293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfqkyvhvxmyjhshsopayzfrmqtcfcfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889925.9307976-2307-23444864604633/AnsiballZ_copy.py
Nov 23 09:25:26 np0005532585.localdomain sudo[201293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:26 np0005532585.localdomain python3.9[201295]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889925.9307976-2307-23444864604633/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:26 np0005532585.localdomain sudo[201293]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:27 np0005532585.localdomain sudo[201403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byocujyplnylalhyzeuxldzduplhfnit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889927.045695-2307-154630365563603/AnsiballZ_stat.py
Nov 23 09:25:27 np0005532585.localdomain sudo[201403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64476 DF PROTO=TCP SPT=52564 DPT=9882 SEQ=2627936758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBB8200000000001030307) 
Nov 23 09:25:27 np0005532585.localdomain python3.9[201405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:27 np0005532585.localdomain sudo[201403]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:28 np0005532585.localdomain sudo[201491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbdxgqikthczubbfgmaebdebtjsoajsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889927.045695-2307-154630365563603/AnsiballZ_copy.py
Nov 23 09:25:28 np0005532585.localdomain sudo[201491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:28 np0005532585.localdomain python3.9[201493]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889927.045695-2307-154630365563603/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:28 np0005532585.localdomain sudo[201491]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:28 np0005532585.localdomain sudo[201601]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfgdtjiqoqlwujjkulakgwegeozwykpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889928.3427062-2307-201849953653328/AnsiballZ_stat.py
Nov 23 09:25:28 np0005532585.localdomain sudo[201601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:28 np0005532585.localdomain python3.9[201603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:28 np0005532585.localdomain sudo[201601]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:29 np0005532585.localdomain sudo[201689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdontucygtwxrgpabajhjomxtczoacio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889928.3427062-2307-201849953653328/AnsiballZ_copy.py
Nov 23 09:25:29 np0005532585.localdomain sudo[201689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:29 np0005532585.localdomain python3.9[201691]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889928.3427062-2307-201849953653328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:29 np0005532585.localdomain sudo[201689]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:29 np0005532585.localdomain sudo[201799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcnhdmcfldqjgqavylgzyijyyqswyqzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889929.4993634-2307-270798856805065/AnsiballZ_stat.py
Nov 23 09:25:29 np0005532585.localdomain sudo[201799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23741 DF PROTO=TCP SPT=46510 DPT=9101 SEQ=215047230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBC1BA0000000001030307) 
Nov 23 09:25:29 np0005532585.localdomain python3.9[201801]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59945 DF PROTO=TCP SPT=41060 DPT=9105 SEQ=469432071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBC26A0000000001030307) 
Nov 23 09:25:29 np0005532585.localdomain sudo[201799]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:30 np0005532585.localdomain sudo[201887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnwkfdrbjitmoctrndxqsnwftrlzavvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889929.4993634-2307-270798856805065/AnsiballZ_copy.py
Nov 23 09:25:30 np0005532585.localdomain sudo[201887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:30 np0005532585.localdomain python3.9[201889]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889929.4993634-2307-270798856805065/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:30 np0005532585.localdomain sudo[201887]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:30 np0005532585.localdomain sudo[201997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfsudkcmpjfjgdiojhhpmhzmoulfoyju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889930.6268172-2307-92035410105332/AnsiballZ_stat.py
Nov 23 09:25:30 np0005532585.localdomain sudo[201997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:31 np0005532585.localdomain python3.9[201999]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:31 np0005532585.localdomain sudo[201997]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:31 np0005532585.localdomain sudo[202085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixlzltvnomlgxlotqdadsbgmtebmbzwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889930.6268172-2307-92035410105332/AnsiballZ_copy.py
Nov 23 09:25:31 np0005532585.localdomain sudo[202085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:32 np0005532585.localdomain python3.9[202087]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889930.6268172-2307-92035410105332/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:32 np0005532585.localdomain sudo[202085]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:32 np0005532585.localdomain sudo[202195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppxrydvajegnnrjocrolpsjosgewzlwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889932.2339375-2307-142074041105601/AnsiballZ_stat.py
Nov 23 09:25:32 np0005532585.localdomain sudo[202195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:32 np0005532585.localdomain python3.9[202197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:32 np0005532585.localdomain sudo[202195]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23743 DF PROTO=TCP SPT=46510 DPT=9101 SEQ=215047230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBCDE00000000001030307) 
Nov 23 09:25:33 np0005532585.localdomain sudo[202283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozhwygytwlllfasgircwagykhdnxshgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889932.2339375-2307-142074041105601/AnsiballZ_copy.py
Nov 23 09:25:33 np0005532585.localdomain sudo[202283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:33 np0005532585.localdomain python3.9[202285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889932.2339375-2307-142074041105601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:33 np0005532585.localdomain sudo[202283]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:34 np0005532585.localdomain sudo[202393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpcsstvbvsegsfxdcyhacqnbdywzfwku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889933.9871204-2307-172036676834250/AnsiballZ_stat.py
Nov 23 09:25:34 np0005532585.localdomain sudo[202393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:34 np0005532585.localdomain python3.9[202395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:34 np0005532585.localdomain sudo[202393]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:34 np0005532585.localdomain sudo[202481]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rankbaibvenqrzqfrnjofwxvtdyaxmuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889933.9871204-2307-172036676834250/AnsiballZ_copy.py
Nov 23 09:25:34 np0005532585.localdomain sudo[202481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:34 np0005532585.localdomain python3.9[202483]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889933.9871204-2307-172036676834250/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:34 np0005532585.localdomain sudo[202481]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:35 np0005532585.localdomain sudo[202591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zexucahqzjguvaigykqbzhucjsaeidfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889935.0426493-2307-17196693756386/AnsiballZ_stat.py
Nov 23 09:25:35 np0005532585.localdomain sudo[202591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:35 np0005532585.localdomain python3.9[202593]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:35 np0005532585.localdomain sudo[202591]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:35 np0005532585.localdomain sudo[202679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oueinyxlegiahrjxmqvjdzshswhytqnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889935.0426493-2307-17196693756386/AnsiballZ_copy.py
Nov 23 09:25:35 np0005532585.localdomain sudo[202679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:35 np0005532585.localdomain sudo[202682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:25:35 np0005532585.localdomain sudo[202682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:25:35 np0005532585.localdomain sudo[202682]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:35 np0005532585.localdomain sudo[202700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 09:25:35 np0005532585.localdomain sudo[202700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:25:36 np0005532585.localdomain python3.9[202681]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889935.0426493-2307-17196693756386/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:36 np0005532585.localdomain sudo[202679]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61664 DF PROTO=TCP SPT=44452 DPT=9100 SEQ=3680421593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBDA210000000001030307) 
Nov 23 09:25:36 np0005532585.localdomain sudo[202700]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:36 np0005532585.localdomain sudo[202805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:25:36 np0005532585.localdomain sudo[202805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:25:36 np0005532585.localdomain sudo[202805]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:36 np0005532585.localdomain sudo[202845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:25:36 np0005532585.localdomain sudo[202845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:25:36 np0005532585.localdomain python3.9[202881]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:25:37 np0005532585.localdomain sudo[202845]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:37 np0005532585.localdomain sudo[203026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qboetavitsxrnulqxgtlcbwejrlkilpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889937.0037165-2925-134267132178526/AnsiballZ_seboolean.py
Nov 23 09:25:37 np0005532585.localdomain sudo[203026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:37 np0005532585.localdomain python3.9[203028]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 23 09:25:37 np0005532585.localdomain sudo[203026]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:37 np0005532585.localdomain sudo[203029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:25:37 np0005532585.localdomain sudo[203029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:25:37 np0005532585.localdomain sudo[203029]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:38 np0005532585.localdomain sudo[203154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqravvongtxibupqojsxiahuvcaetirq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889938.2692714-2955-43354769878645/AnsiballZ_systemd.py
Nov 23 09:25:38 np0005532585.localdomain sudo[203154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:38 np0005532585.localdomain python3.9[203156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:25:38 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:25:38 np0005532585.localdomain systemd-sysv-generator[203183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:25:38 np0005532585.localdomain systemd-rc-local-generator[203179]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Starting libvirt logging daemon socket...
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Starting libvirt logging daemon...
Nov 23 09:25:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17159 DF PROTO=TCP SPT=40974 DPT=9100 SEQ=2781183349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBE6A00000000001030307) 
Nov 23 09:25:39 np0005532585.localdomain podman[203194]: 2025-11-23 09:25:39.29378761 +0000 UTC m=+0.101478542 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: Started libvirt logging daemon.
Nov 23 09:25:39 np0005532585.localdomain sudo[203154]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:39 np0005532585.localdomain podman[203194]: 2025-11-23 09:25:39.401304514 +0000 UTC m=+0.208995456 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:25:39 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:25:39 np0005532585.localdomain sudo[203331]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpwtmwzyzatmyyncrpyohldcxuduhdlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889939.4992776-2955-245596542823568/AnsiballZ_systemd.py
Nov 23 09:25:39 np0005532585.localdomain sudo[203331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:40 np0005532585.localdomain python3.9[203333]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:25:40 np0005532585.localdomain systemd-sysv-generator[203357]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:25:40 np0005532585.localdomain systemd-rc-local-generator[203354]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 23 09:25:40 np0005532585.localdomain systemd[1]: Started libvirt nodedev daemon.
Nov 23 09:25:40 np0005532585.localdomain sudo[203331]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:40 np0005532585.localdomain sudo[203506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiomwilmixcsjnsyzsxjbazajbrkzzzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889940.6326966-2955-182171821637457/AnsiballZ_systemd.py
Nov 23 09:25:40 np0005532585.localdomain sudo[203506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:41 np0005532585.localdomain python3.9[203508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:25:41 np0005532585.localdomain systemd-rc-local-generator[203534]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:25:41 np0005532585.localdomain systemd-sysv-generator[203539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Started libvirt proxy daemon.
Nov 23 09:25:41 np0005532585.localdomain sudo[203506]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:41 np0005532585.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 23 09:25:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17078 DF PROTO=TCP SPT=48576 DPT=9882 SEQ=27232340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBF0E60000000001030307) 
Nov 23 09:25:41 np0005532585.localdomain setroubleshoot[203546]: Deleting alert a701375f-2314-41e7-b9ff-3bc5dfd0e157, it is allowed in current policy
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Nov 23 09:25:42 np0005532585.localdomain sudo[203685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijspymtugftelnmmnkdmzvaaxjinofsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889941.754094-2955-230158388193245/AnsiballZ_systemd.py
Nov 23 09:25:42 np0005532585.localdomain sudo[203685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:42 np0005532585.localdomain python3.9[203687]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:25:42 np0005532585.localdomain systemd-rc-local-generator[203711]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:25:42 np0005532585.localdomain systemd-sysv-generator[203717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 23 09:25:42 np0005532585.localdomain systemd[1]: Started libvirt QEMU daemon.
Nov 23 09:25:42 np0005532585.localdomain sudo[203685]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:42 np0005532585.localdomain setroubleshoot[203546]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 14acc193-8e99-4b7b-bc7e-04795d26bb69
Nov 23 09:25:42 np0005532585.localdomain setroubleshoot[203546]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:25:43 np0005532585.localdomain sudo[203875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptonsaicjpcdohdxwcgaxggkrheanzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889942.8582451-2955-76053477447669/AnsiballZ_systemd.py
Nov 23 09:25:43 np0005532585.localdomain sudo[203875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: tmp-crun.HITXMo.mount: Deactivated successfully.
Nov 23 09:25:43 np0005532585.localdomain podman[203854]: 2025-11-23 09:25:43.156842342 +0000 UTC m=+0.079263001 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:25:43 np0005532585.localdomain podman[203854]: 2025-11-23 09:25:43.166308104 +0000 UTC m=+0.088728753 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:25:43 np0005532585.localdomain python3.9[203879]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:25:43 np0005532585.localdomain systemd-rc-local-generator[203919]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:25:43 np0005532585.localdomain systemd-sysv-generator[203924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Starting libvirt secret daemon socket...
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 23 09:25:43 np0005532585.localdomain systemd[1]: Started libvirt secret daemon.
Nov 23 09:25:43 np0005532585.localdomain sudo[203875]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:44 np0005532585.localdomain sudo[204071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cthnzyugumxsaevwkuzmfbuhzhductir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889944.411609-3066-117667120693805/AnsiballZ_file.py
Nov 23 09:25:44 np0005532585.localdomain sudo[204071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:44 np0005532585.localdomain python3.9[204073]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:44 np0005532585.localdomain sudo[204071]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17080 DF PROTO=TCP SPT=48576 DPT=9882 SEQ=27232340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBFCE10000000001030307) 
Nov 23 09:25:45 np0005532585.localdomain sudo[204181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dksdtqeydasaovczrahebdfswmsbqfep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889945.101062-3090-193153420535891/AnsiballZ_find.py
Nov 23 09:25:45 np0005532585.localdomain sudo[204181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:45 np0005532585.localdomain python3.9[204183]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:25:45 np0005532585.localdomain sudo[204181]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:46 np0005532585.localdomain sudo[204291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpfmnnevcloajmdwecykorendsdxmrnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889945.7918973-3114-206743491167936/AnsiballZ_command.py
Nov 23 09:25:46 np0005532585.localdomain sudo[204291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:46 np0005532585.localdomain python3.9[204293]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:25:46 np0005532585.localdomain sudo[204291]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:47 np0005532585.localdomain python3.9[204405]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:25:48 np0005532585.localdomain python3.9[204513]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43891 DF PROTO=TCP SPT=35166 DPT=9102 SEQ=1640571255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC0A210000000001030307) 
Nov 23 09:25:48 np0005532585.localdomain python3.9[204599]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889947.6084235-3171-134549237374095/.source.xml follow=False _original_basename=secret.xml.j2 checksum=08854374a51612ae60ccb5be5d56c7ff5bc71f08 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:49 np0005532585.localdomain sudo[204707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awfacjbimoijwuzaeqghfglsjzznnrme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889948.7646894-3216-262809699472411/AnsiballZ_command.py
Nov 23 09:25:49 np0005532585.localdomain sudo[204707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:49 np0005532585.localdomain python3.9[204709]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:25:49 np0005532585.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:204711:989241 (system bus name :1.2856 [pkttyagent --process 204711 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 23 09:25:49 np0005532585.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:204711:989241 (system bus name :1.2856, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 23 09:25:49 np0005532585.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:204710:989241 (system bus name :1.2857 [pkttyagent --process 204710 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 23 09:25:49 np0005532585.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:204710:989241 (system bus name :1.2857, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 23 09:25:49 np0005532585.localdomain sudo[204707]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:50 np0005532585.localdomain python3.9[204829]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:50 np0005532585.localdomain sudo[204937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhulbifvlybvobnkkwlvpngwkcshdkkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889950.6273184-3264-67064292067486/AnsiballZ_command.py
Nov 23 09:25:50 np0005532585.localdomain sudo[204937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:51 np0005532585.localdomain sudo[204937]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:51 np0005532585.localdomain sudo[205048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kupiyvjnoycvjqsyxlnhymctusqzyboz ; FSID=46550e70-79cb-5f55-bf6d-1204b97e083b KEY=AQDxuiJpAAAAABAAP98WAgQc+4XAoh/vjuoPnQ== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889951.276749-3288-200590474050635/AnsiballZ_command.py
Nov 23 09:25:51 np0005532585.localdomain sudo[205048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:51 np0005532585.localdomain polkitd[1037]: Registered Authentication Agent for unix-process:205051:989490 (system bus name :1.2860 [pkttyagent --process 205051 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 23 09:25:51 np0005532585.localdomain polkitd[1037]: Unregistered Authentication Agent for unix-process:205051:989490 (system bus name :1.2860, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 23 09:25:51 np0005532585.localdomain sudo[205048]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:52 np0005532585.localdomain sudo[205164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlfxgsutnefxmcniksiufhrddkpommam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889952.0327716-3312-92476584604435/AnsiballZ_copy.py
Nov 23 09:25:52 np0005532585.localdomain sudo[205164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:52 np0005532585.localdomain python3.9[205166]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:52 np0005532585.localdomain sudo[205164]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:52 np0005532585.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Nov 23 09:25:52 np0005532585.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 23 09:25:53 np0005532585.localdomain sudo[205274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gucfgavxewonbcqshqirezlmeafwkyco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889952.9015393-3336-138180641096976/AnsiballZ_stat.py
Nov 23 09:25:53 np0005532585.localdomain sudo[205274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:53 np0005532585.localdomain python3.9[205276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:53 np0005532585.localdomain sudo[205274]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43572 DF PROTO=TCP SPT=43778 DPT=9102 SEQ=609621929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC1DE00000000001030307) 
Nov 23 09:25:53 np0005532585.localdomain sudo[205362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxxyfrcmxheedtojlohlwtwyghtdamfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889952.9015393-3336-138180641096976/AnsiballZ_copy.py
Nov 23 09:25:53 np0005532585.localdomain sudo[205362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:53 np0005532585.localdomain python3.9[205364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889952.9015393-3336-138180641096976/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:53 np0005532585.localdomain sudo[205362]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:54 np0005532585.localdomain sudo[205472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugtwxozajfsvjsiyrqwduaxdkxbgkpbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889954.2994764-3384-157710457892483/AnsiballZ_file.py
Nov 23 09:25:54 np0005532585.localdomain sudo[205472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:54 np0005532585.localdomain python3.9[205474]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:54 np0005532585.localdomain sudo[205472]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:55 np0005532585.localdomain sudo[205582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpmbshbkvmjnahhzltmbpjbagsolvsln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889955.013719-3408-223459550275085/AnsiballZ_stat.py
Nov 23 09:25:55 np0005532585.localdomain sudo[205582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:55 np0005532585.localdomain python3.9[205584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:55 np0005532585.localdomain sudo[205582]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:55 np0005532585.localdomain sudo[205639]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whygtwgvbywsttizsatiueoauxhsaivu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889955.013719-3408-223459550275085/AnsiballZ_file.py
Nov 23 09:25:55 np0005532585.localdomain sudo[205639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:55 np0005532585.localdomain python3.9[205641]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:55 np0005532585.localdomain sudo[205639]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:56 np0005532585.localdomain sudo[205749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvgrjgiousnmbdhrzbvhapltqhclyzjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889956.214353-3444-39257925319992/AnsiballZ_stat.py
Nov 23 09:25:56 np0005532585.localdomain sudo[205749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:56 np0005532585.localdomain python3.9[205751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:56 np0005532585.localdomain sudo[205749]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:56 np0005532585.localdomain sudo[205806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhxnxgrmtndmteesvofhealwbennings ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889956.214353-3444-39257925319992/AnsiballZ_file.py
Nov 23 09:25:56 np0005532585.localdomain sudo[205806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17082 DF PROTO=TCP SPT=48576 DPT=9882 SEQ=27232340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC2C210000000001030307) 
Nov 23 09:25:57 np0005532585.localdomain python3.9[205808]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._3f6l7jn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:57 np0005532585.localdomain sudo[205806]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:57 np0005532585.localdomain sudo[205916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylgvekmqetucfapyihhwxaflhqznwuvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889957.3522804-3481-82526383967926/AnsiballZ_stat.py
Nov 23 09:25:57 np0005532585.localdomain sudo[205916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:57 np0005532585.localdomain python3.9[205918]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:25:57 np0005532585.localdomain sudo[205916]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:58 np0005532585.localdomain sudo[205973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpqydwpdtfogkewcvpingppldpijvazb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889957.3522804-3481-82526383967926/AnsiballZ_file.py
Nov 23 09:25:58 np0005532585.localdomain sudo[205973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:58 np0005532585.localdomain python3.9[205975]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:25:58 np0005532585.localdomain sudo[205973]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:58 np0005532585.localdomain sudo[206083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-corvetixkazinsbmrnvesvhjlwpkcgni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889958.5324974-3520-101242170753312/AnsiballZ_command.py
Nov 23 09:25:58 np0005532585.localdomain sudo[206083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:58 np0005532585.localdomain python3.9[206085]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:25:59 np0005532585.localdomain sudo[206083]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:59 np0005532585.localdomain sudo[206194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmxlanlskiuyfgnvecutpjptmjzdohwr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763889959.2241151-3543-253253998134856/AnsiballZ_edpm_nftables_from_files.py
Nov 23 09:25:59 np0005532585.localdomain sudo[206194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:25:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3152 DF PROTO=TCP SPT=45864 DPT=9101 SEQ=2081057054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC36E90000000001030307) 
Nov 23 09:25:59 np0005532585.localdomain python3[206196]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 09:25:59 np0005532585.localdomain sudo[206194]: pam_unix(sudo:session): session closed for user root
Nov 23 09:25:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62158 DF PROTO=TCP SPT=33372 DPT=9105 SEQ=1694433680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC379B0000000001030307) 
Nov 23 09:26:00 np0005532585.localdomain sudo[206304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxpkkyfnwesntnunlolwwgfbxfcmhauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889960.043203-3567-157540136494299/AnsiballZ_stat.py
Nov 23 09:26:00 np0005532585.localdomain sudo[206304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:00 np0005532585.localdomain python3.9[206306]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:00 np0005532585.localdomain sudo[206304]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:00 np0005532585.localdomain sudo[206361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjhagdmcfvymjvcjpabikzocktdjlaoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889960.043203-3567-157540136494299/AnsiballZ_file.py
Nov 23 09:26:00 np0005532585.localdomain sudo[206361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:01 np0005532585.localdomain python3.9[206363]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:01 np0005532585.localdomain sudo[206361]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:01 np0005532585.localdomain sudo[206471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeshzjchgwdpppgwjbpvufuakkhxkiwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889961.1590366-3603-34034499840793/AnsiballZ_stat.py
Nov 23 09:26:01 np0005532585.localdomain sudo[206471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:01 np0005532585.localdomain python3.9[206473]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:01 np0005532585.localdomain sudo[206471]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:01 np0005532585.localdomain sudo[206528]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imrgtprwvycuafvbafgdfmvfwklbjfnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889961.1590366-3603-34034499840793/AnsiballZ_file.py
Nov 23 09:26:01 np0005532585.localdomain sudo[206528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:02 np0005532585.localdomain python3.9[206530]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:02 np0005532585.localdomain sudo[206528]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:02 np0005532585.localdomain sudo[206638]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlmnkhkmgtfpsswxkiagowkzrkckpuuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889962.294225-3639-43749790910799/AnsiballZ_stat.py
Nov 23 09:26:02 np0005532585.localdomain sudo[206638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:02 np0005532585.localdomain python3.9[206640]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:02 np0005532585.localdomain sudo[206638]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:02 np0005532585.localdomain sudo[206695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnyxckwhiswmoysirlkwlahtkqkwzomr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889962.294225-3639-43749790910799/AnsiballZ_file.py
Nov 23 09:26:02 np0005532585.localdomain sudo[206695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62160 DF PROTO=TCP SPT=33372 DPT=9105 SEQ=1694433680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC43A10000000001030307) 
Nov 23 09:26:03 np0005532585.localdomain python3.9[206697]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:03 np0005532585.localdomain sudo[206695]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:04 np0005532585.localdomain sudo[206805]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmerhvsgmckrxfylysbjyshnwjavjevd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889964.4334116-3675-231486216364267/AnsiballZ_stat.py
Nov 23 09:26:04 np0005532585.localdomain sudo[206805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:04 np0005532585.localdomain python3.9[206807]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:04 np0005532585.localdomain sudo[206805]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:05 np0005532585.localdomain sudo[206862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubkwfgswtutkgvxnswccxuncpjraeuqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889964.4334116-3675-231486216364267/AnsiballZ_file.py
Nov 23 09:26:05 np0005532585.localdomain sudo[206862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:05 np0005532585.localdomain python3.9[206864]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:05 np0005532585.localdomain sudo[206862]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48977 DF PROTO=TCP SPT=35424 DPT=9100 SEQ=3493553969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC50210000000001030307) 
Nov 23 09:26:06 np0005532585.localdomain sudo[206972]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqdjjxvszahbshcffyzmauyvsmbnxyvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889965.6177292-3711-208855973769798/AnsiballZ_stat.py
Nov 23 09:26:06 np0005532585.localdomain sudo[206972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:06 np0005532585.localdomain python3.9[206974]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:06 np0005532585.localdomain sudo[206972]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:06 np0005532585.localdomain sudo[207062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrvfmthmkpfoqcmxvjmihcovqsawnnto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889965.6177292-3711-208855973769798/AnsiballZ_copy.py
Nov 23 09:26:06 np0005532585.localdomain sudo[207062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:07 np0005532585.localdomain python3.9[207064]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889965.6177292-3711-208855973769798/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:07 np0005532585.localdomain sudo[207062]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:07 np0005532585.localdomain sudo[207172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcqqwfutgtsuewyzzctcsrfoylbluylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889967.4578843-3756-124527344572213/AnsiballZ_file.py
Nov 23 09:26:07 np0005532585.localdomain sudo[207172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:07 np0005532585.localdomain python3.9[207174]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:07 np0005532585.localdomain sudo[207172]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:08 np0005532585.localdomain sudo[207282]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlmjfzcbgutlvdxvhkvwtticcgyelxox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889968.1061683-3780-37022738441448/AnsiballZ_command.py
Nov 23 09:26:08 np0005532585.localdomain sudo[207282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:08 np0005532585.localdomain python3.9[207284]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:26:08 np0005532585.localdomain sudo[207282]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:26:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:26:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:26:09.236 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:26:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:26:09.238 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:26:09 np0005532585.localdomain sudo[207395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftdoacbxjrxzijvgpohvwwgxjiqjyfrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889968.8076112-3804-272194514202831/AnsiballZ_blockinfile.py
Nov 23 09:26:09 np0005532585.localdomain sudo[207395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59666 DF PROTO=TCP SPT=55348 DPT=9100 SEQ=4199510865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC5BE00000000001030307) 
Nov 23 09:26:09 np0005532585.localdomain python3.9[207397]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:09 np0005532585.localdomain sudo[207395]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:26:10 np0005532585.localdomain podman[207472]: 2025-11-23 09:26:10.040268997 +0000 UTC m=+0.092951941 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:26:10 np0005532585.localdomain sudo[207516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuxnujjbacupthzqqgajoincgzhjzokt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889969.7854502-3831-3107606339033/AnsiballZ_command.py
Nov 23 09:26:10 np0005532585.localdomain sudo[207516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:10 np0005532585.localdomain podman[207472]: 2025-11-23 09:26:10.105620608 +0000 UTC m=+0.158303582 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 09:26:10 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:26:10 np0005532585.localdomain python3.9[207525]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:26:10 np0005532585.localdomain sudo[207516]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:11 np0005532585.localdomain sudo[207641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmhjtqmdgwyjdmvowkmtzfatyxbsqirv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889970.8137357-3855-58035297715937/AnsiballZ_stat.py
Nov 23 09:26:11 np0005532585.localdomain sudo[207641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:11 np0005532585.localdomain python3.9[207643]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:26:11 np0005532585.localdomain sudo[207641]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:11 np0005532585.localdomain sudo[207753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niqjopnnnjqjxjqhbreiypvvkglrovjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889971.5058863-3880-190702676251986/AnsiballZ_command.py
Nov 23 09:26:11 np0005532585.localdomain sudo[207753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44119 DF PROTO=TCP SPT=41746 DPT=9882 SEQ=3913687544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC66170000000001030307) 
Nov 23 09:26:11 np0005532585.localdomain python3.9[207755]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:26:11 np0005532585.localdomain sudo[207753]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:12 np0005532585.localdomain sudo[207866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeldsvkrtxwpjcwbpaxjsjginecrslzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889972.222775-3903-223840219076858/AnsiballZ_file.py
Nov 23 09:26:12 np0005532585.localdomain sudo[207866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:12 np0005532585.localdomain python3.9[207868]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:12 np0005532585.localdomain sudo[207866]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:13 np0005532585.localdomain sudo[207976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsdleajeublkrdilnvdclnxbandkulgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889972.8976505-3928-100005418120253/AnsiballZ_stat.py
Nov 23 09:26:13 np0005532585.localdomain sudo[207976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:13 np0005532585.localdomain python3.9[207978]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:13 np0005532585.localdomain sudo[207976]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:13 np0005532585.localdomain sudo[208064]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjnhhavqqmcptfjabshmdfrkddoyafut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889972.8976505-3928-100005418120253/AnsiballZ_copy.py
Nov 23 09:26:13 np0005532585.localdomain sudo[208064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:26:13 np0005532585.localdomain podman[208067]: 2025-11-23 09:26:13.657618691 +0000 UTC m=+0.063867016 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:26:13 np0005532585.localdomain podman[208067]: 2025-11-23 09:26:13.666244207 +0000 UTC m=+0.072492552 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:26:13 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:26:13 np0005532585.localdomain python3.9[208066]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889972.8976505-3928-100005418120253/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:13 np0005532585.localdomain sudo[208064]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:14 np0005532585.localdomain sudo[208192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojkmjsquzjvqemfkgtaehwmdttlrhvjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889974.023101-3972-166839050151418/AnsiballZ_stat.py
Nov 23 09:26:14 np0005532585.localdomain sudo[208192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:14 np0005532585.localdomain python3.9[208194]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:14 np0005532585.localdomain sudo[208192]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:14 np0005532585.localdomain sudo[208280]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buvnidyklvwheadvlbqwijcypllhtmui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889974.023101-3972-166839050151418/AnsiballZ_copy.py
Nov 23 09:26:14 np0005532585.localdomain sudo[208280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44121 DF PROTO=TCP SPT=41746 DPT=9882 SEQ=3913687544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC72200000000001030307) 
Nov 23 09:26:15 np0005532585.localdomain python3.9[208282]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889974.023101-3972-166839050151418/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:15 np0005532585.localdomain sudo[208280]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:15 np0005532585.localdomain sudo[208390]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppphiqcpbzziwvphndojzcfogrheggsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889975.1724162-4017-39600808219720/AnsiballZ_stat.py
Nov 23 09:26:15 np0005532585.localdomain sudo[208390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:15 np0005532585.localdomain python3.9[208392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:26:15 np0005532585.localdomain sudo[208390]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:15 np0005532585.localdomain sudo[208478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bislxwpgjkzdgfuojckhbwareydbnjcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889975.1724162-4017-39600808219720/AnsiballZ_copy.py
Nov 23 09:26:15 np0005532585.localdomain sudo[208478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:16 np0005532585.localdomain python3.9[208480]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889975.1724162-4017-39600808219720/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:16 np0005532585.localdomain sudo[208478]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:17 np0005532585.localdomain sudo[208588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgkfodfkftxutasajpyxsbkxxwwyyzzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889977.5055141-4062-188653772018706/AnsiballZ_systemd.py
Nov 23 09:26:17 np0005532585.localdomain sudo[208588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43574 DF PROTO=TCP SPT=43778 DPT=9102 SEQ=609621929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC7E200000000001030307) 
Nov 23 09:26:18 np0005532585.localdomain python3.9[208590]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:26:18 np0005532585.localdomain systemd-rc-local-generator[208617]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:26:18 np0005532585.localdomain systemd-sysv-generator[208620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:18 np0005532585.localdomain systemd[1]: Reached target edpm_libvirt.target.
Nov 23 09:26:18 np0005532585.localdomain sudo[208588]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:19 np0005532585.localdomain sudo[208738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsswqmdbfezqjwywuckkjilfqpdadflm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889979.315258-4086-195735276755273/AnsiballZ_systemd.py
Nov 23 09:26:19 np0005532585.localdomain sudo[208738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:19 np0005532585.localdomain python3.9[208740]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 09:26:19 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:26:20 np0005532585.localdomain systemd-rc-local-generator[208764]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:26:20 np0005532585.localdomain systemd-sysv-generator[208768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:26:20 np0005532585.localdomain systemd-rc-local-generator[208802]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:26:20 np0005532585.localdomain systemd-sysv-generator[208808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:20 np0005532585.localdomain sudo[208738]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:21 np0005532585.localdomain sshd[160578]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:26:21 np0005532585.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Nov 23 09:26:21 np0005532585.localdomain systemd[1]: session-52.scope: Consumed 3min 36.058s CPU time.
Nov 23 09:26:21 np0005532585.localdomain systemd-logind[761]: Session 52 logged out. Waiting for processes to exit.
Nov 23 09:26:21 np0005532585.localdomain systemd-logind[761]: Removed session 52.
Nov 23 09:26:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17754 DF PROTO=TCP SPT=57284 DPT=9102 SEQ=62392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC93200000000001030307) 
Nov 23 09:26:26 np0005532585.localdomain sshd[208831]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:26:26 np0005532585.localdomain sshd[208831]: Accepted publickey for zuul from 192.168.122.30 port 35600 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:26:26 np0005532585.localdomain systemd-logind[761]: New session 53 of user zuul.
Nov 23 09:26:26 np0005532585.localdomain systemd[1]: Started Session 53 of User zuul.
Nov 23 09:26:26 np0005532585.localdomain sshd[208831]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:26:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44123 DF PROTO=TCP SPT=41746 DPT=9882 SEQ=3913687544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCA2200000000001030307) 
Nov 23 09:26:27 np0005532585.localdomain python3.9[208942]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:26:28 np0005532585.localdomain python3.9[209054]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:26:29 np0005532585.localdomain network[209071]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:26:29 np0005532585.localdomain network[209072]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:26:29 np0005532585.localdomain network[209073]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:26:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13885 DF PROTO=TCP SPT=35964 DPT=9101 SEQ=104125501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCAC1A0000000001030307) 
Nov 23 09:26:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3582 DF PROTO=TCP SPT=53272 DPT=9105 SEQ=437809189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCACCB0000000001030307) 
Nov 23 09:26:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:26:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13887 DF PROTO=TCP SPT=35964 DPT=9101 SEQ=104125501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCB8200000000001030307) 
Nov 23 09:26:34 np0005532585.localdomain sudo[209303]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psonmvackamdcpkmfdoksitgumeekfsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889994.0268233-102-103281611191870/AnsiballZ_setup.py
Nov 23 09:26:34 np0005532585.localdomain sudo[209303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:34 np0005532585.localdomain python3.9[209305]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:26:34 np0005532585.localdomain sudo[209303]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:35 np0005532585.localdomain sudo[209366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvbqdwalnfdfrikbkmnhjydlysdrgutd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763889994.0268233-102-103281611191870/AnsiballZ_dnf.py
Nov 23 09:26:35 np0005532585.localdomain sudo[209366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:35 np0005532585.localdomain python3.9[209368]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:26:35 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17162 DF PROTO=TCP SPT=40974 DPT=9100 SEQ=2781183349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCC4200000000001030307) 
Nov 23 09:26:37 np0005532585.localdomain sudo[209371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:26:37 np0005532585.localdomain sudo[209371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:26:37 np0005532585.localdomain sudo[209371]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:38 np0005532585.localdomain sudo[209389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:26:38 np0005532585.localdomain sudo[209389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:26:38 np0005532585.localdomain sudo[209389]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53967 DF PROTO=TCP SPT=34390 DPT=9100 SEQ=2157483921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCD1200000000001030307) 
Nov 23 09:26:39 np0005532585.localdomain sudo[209439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:26:39 np0005532585.localdomain sudo[209439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:26:39 np0005532585.localdomain sudo[209439]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:26:41 np0005532585.localdomain podman[209457]: 2025-11-23 09:26:41.03201156 +0000 UTC m=+0.085671467 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 23 09:26:41 np0005532585.localdomain podman[209457]: 2025-11-23 09:26:41.087486568 +0000 UTC m=+0.141146505 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:26:41 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:26:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5687 DF PROTO=TCP SPT=56114 DPT=9882 SEQ=4269180401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCDB470000000001030307) 
Nov 23 09:26:42 np0005532585.localdomain sudo[209366]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:43 np0005532585.localdomain sudo[209590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtqktxjfjrwgbdbghnfbebpayhoxiwqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890002.7929208-138-216462912051197/AnsiballZ_stat.py
Nov 23 09:26:43 np0005532585.localdomain sudo[209590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:43 np0005532585.localdomain python3.9[209592]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:26:43 np0005532585.localdomain sudo[209590]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:26:44 np0005532585.localdomain podman[209666]: 2025-11-23 09:26:44.023619477 +0000 UTC m=+0.075471703 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:26:44 np0005532585.localdomain podman[209666]: 2025-11-23 09:26:44.052079483 +0000 UTC m=+0.103931679 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 23 09:26:44 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:26:44 np0005532585.localdomain sudo[209720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unznlpdiihdudotovprhgartzruhctis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890003.6756766-163-120633723343939/AnsiballZ_copy.py
Nov 23 09:26:44 np0005532585.localdomain sudo[209720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:44 np0005532585.localdomain sshd[209723]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:26:44 np0005532585.localdomain sshd[209723]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 09:26:44 np0005532585.localdomain sshd[209723]: Connection closed by 18.219.193.156 port 46490
Nov 23 09:26:44 np0005532585.localdomain python3.9[209722]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:44 np0005532585.localdomain sudo[209720]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5689 DF PROTO=TCP SPT=56114 DPT=9882 SEQ=4269180401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCE7600000000001030307) 
Nov 23 09:26:45 np0005532585.localdomain sudo[209831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yncigpkgemonfjxxdodeamoibpbqfexp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890004.9998329-186-99169203353383/AnsiballZ_command.py
Nov 23 09:26:45 np0005532585.localdomain sudo[209831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:45 np0005532585.localdomain python3.9[209833]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:26:45 np0005532585.localdomain sudo[209831]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:46 np0005532585.localdomain sudo[209942]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwgfucjqxcotwwlxkresvmouquwpvujn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890005.833096-210-249326631292360/AnsiballZ_command.py
Nov 23 09:26:46 np0005532585.localdomain sudo[209942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:46 np0005532585.localdomain python3.9[209944]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:26:46 np0005532585.localdomain sudo[209942]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:47 np0005532585.localdomain sudo[210053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmdxffuvniizzuajevpldfzrrsoqqpbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890007.311036-234-53337450649215/AnsiballZ_command.py
Nov 23 09:26:47 np0005532585.localdomain sudo[210053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:47 np0005532585.localdomain python3.9[210055]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:26:47 np0005532585.localdomain sudo[210053]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17756 DF PROTO=TCP SPT=57284 DPT=9102 SEQ=62392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCF4200000000001030307) 
Nov 23 09:26:48 np0005532585.localdomain sudo[210164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pblrfnatmsdxeajgficdbbfwgdkakwzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890008.1998243-261-32856258809919/AnsiballZ_stat.py
Nov 23 09:26:48 np0005532585.localdomain sudo[210164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:48 np0005532585.localdomain python3.9[210166]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:26:48 np0005532585.localdomain sudo[210164]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:49 np0005532585.localdomain sudo[210276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjaskzzfispmkfqztogrzrnqgdgpocxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890009.0873415-294-251610270066431/AnsiballZ_lineinfile.py
Nov 23 09:26:49 np0005532585.localdomain sudo[210276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:49 np0005532585.localdomain python3.9[210278]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:26:49 np0005532585.localdomain sudo[210276]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:50 np0005532585.localdomain sudo[210386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykjvsvmeorcoqmiylaanuwqsbgxelpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890010.0125003-321-199022500883910/AnsiballZ_systemd_service.py
Nov 23 09:26:50 np0005532585.localdomain sudo[210386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:50 np0005532585.localdomain python3.9[210388]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 23 09:26:51 np0005532585.localdomain sudo[210386]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:51 np0005532585.localdomain sudo[210500]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljmlvknxsupxxbbylyojlinbjvhxhayr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890011.178495-345-203997443433529/AnsiballZ_systemd_service.py
Nov 23 09:26:51 np0005532585.localdomain sudo[210500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:51 np0005532585.localdomain python3.9[210502]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:26:51 np0005532585.localdomain systemd-rc-local-generator[210533]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:26:51 np0005532585.localdomain systemd-sysv-generator[210536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:26:52 np0005532585.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 23 09:26:52 np0005532585.localdomain systemd[1]: Starting Open-iSCSI...
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: If using hardware iscsi like qla4xxx this message can be ignored.
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Nov 23 09:26:52 np0005532585.localdomain iscsid[210544]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Nov 23 09:26:52 np0005532585.localdomain systemd[1]: Started Open-iSCSI.
Nov 23 09:26:52 np0005532585.localdomain systemd[1]: Starting Logout of all iSCSI sessions on shutdown...
Nov 23 09:26:52 np0005532585.localdomain systemd[1]: Finished Logout of all iSCSI sessions on shutdown.
Nov 23 09:26:52 np0005532585.localdomain sudo[210500]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:52 np0005532585.localdomain sudo[210653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzhkgfcpmkqzidshoehmsegwzqxxgmjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890012.7028334-378-168187460162858/AnsiballZ_service_facts.py
Nov 23 09:26:52 np0005532585.localdomain sudo[210653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:53 np0005532585.localdomain python3.9[210655]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:26:53 np0005532585.localdomain network[210672]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:26:53 np0005532585.localdomain network[210673]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:26:53 np0005532585.localdomain network[210674]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:26:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42760 DF PROTO=TCP SPT=55756 DPT=9102 SEQ=2764835931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD08600000000001030307) 
Nov 23 09:26:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:26:54 np0005532585.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 23 09:26:55 np0005532585.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 23 09:26:55 np0005532585.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 22d5a650-a1e0-4691-a0c3-b98d96afcbd6
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default,
                                                                 then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 22d5a650-a1e0-4691-a0c3-b98d96afcbd6
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 22d5a650-a1e0-4691-a0c3-b98d96afcbd6
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 22d5a650-a1e0-4691-a0c3-b98d96afcbd6
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 22d5a650-a1e0-4691-a0c3-b98d96afcbd6
Nov 23 09:26:56 np0005532585.localdomain setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Nov 23 09:26:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5691 DF PROTO=TCP SPT=56114 DPT=9882 SEQ=4269180401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD18200000000001030307) 
Nov 23 09:26:58 np0005532585.localdomain sudo[210653]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:59 np0005532585.localdomain sudo[210921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmnacyxwksvyohivomeawadywcarzyms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890018.7496095-408-5290469435427/AnsiballZ_file.py
Nov 23 09:26:59 np0005532585.localdomain sudo[210921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:26:59 np0005532585.localdomain python3.9[210923]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 09:26:59 np0005532585.localdomain sudo[210921]: pam_unix(sudo:session): session closed for user root
Nov 23 09:26:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12056 DF PROTO=TCP SPT=47370 DPT=9101 SEQ=2871435611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD21490000000001030307) 
Nov 23 09:26:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30764 DF PROTO=TCP SPT=35804 DPT=9105 SEQ=677915327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD21FB0000000001030307) 
Nov 23 09:27:00 np0005532585.localdomain sudo[211031]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llmgujujqbdxuwmdehzicykgmfufhnxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890019.6368277-432-241930401058917/AnsiballZ_modprobe.py
Nov 23 09:27:00 np0005532585.localdomain sudo[211031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:00 np0005532585.localdomain python3.9[211033]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 23 09:27:00 np0005532585.localdomain sudo[211031]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:00 np0005532585.localdomain sudo[211145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khdnrwqfyfwwczchrzalusqatzdtjyti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890020.4672198-456-48508680459087/AnsiballZ_stat.py
Nov 23 09:27:00 np0005532585.localdomain sudo[211145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:00 np0005532585.localdomain python3.9[211147]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:00 np0005532585.localdomain sudo[211145]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:01 np0005532585.localdomain sudo[211233]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txolusomrvgunyklezffqcmnezrphlxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890020.4672198-456-48508680459087/AnsiballZ_copy.py
Nov 23 09:27:01 np0005532585.localdomain sudo[211233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:01 np0005532585.localdomain python3.9[211235]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890020.4672198-456-48508680459087/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:01 np0005532585.localdomain sudo[211233]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:02 np0005532585.localdomain sudo[211343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhwgcwitamyzmvgujxfgfhxxdvcvjnji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890021.8386202-504-98672806826006/AnsiballZ_lineinfile.py
Nov 23 09:27:02 np0005532585.localdomain sudo[211343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:02 np0005532585.localdomain python3.9[211345]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:02 np0005532585.localdomain sudo[211343]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12058 DF PROTO=TCP SPT=47370 DPT=9101 SEQ=2871435611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD2D600000000001030307) 
Nov 23 09:27:03 np0005532585.localdomain sudo[211453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjtzizrveguhmfrzpltqyjayqesjmqws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890022.5348547-528-58330151113624/AnsiballZ_systemd.py
Nov 23 09:27:03 np0005532585.localdomain sudo[211453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:03 np0005532585.localdomain python3.9[211455]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:27:03 np0005532585.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 09:27:03 np0005532585.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 23 09:27:03 np0005532585.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 23 09:27:03 np0005532585.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 23 09:27:03 np0005532585.localdomain systemd-modules-load[211459]: Module 'msr' is built in
Nov 23 09:27:03 np0005532585.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 23 09:27:03 np0005532585.localdomain sudo[211453]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:04 np0005532585.localdomain sudo[211567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aibgbcqaplrwvryqorvbdhmchdwmbkaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890023.740579-552-259427893793344/AnsiballZ_file.py
Nov 23 09:27:04 np0005532585.localdomain sudo[211567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:04 np0005532585.localdomain python3.9[211569]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:04 np0005532585.localdomain sudo[211567]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:04 np0005532585.localdomain sudo[211677]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmgpxsvuknehxkemktisynryjtjzlghj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890024.5995078-579-182777452887970/AnsiballZ_stat.py
Nov 23 09:27:04 np0005532585.localdomain sudo[211677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:05 np0005532585.localdomain python3.9[211679]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:27:05 np0005532585.localdomain sudo[211677]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:05 np0005532585.localdomain sudo[211787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlmmtinrwkqjkqkoqowyeylhlbfubsjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890025.428598-606-44427698560568/AnsiballZ_stat.py
Nov 23 09:27:05 np0005532585.localdomain sudo[211787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:05 np0005532585.localdomain python3.9[211789]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:27:05 np0005532585.localdomain sudo[211787]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59669 DF PROTO=TCP SPT=55348 DPT=9100 SEQ=4199510865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD3A200000000001030307) 
Nov 23 09:27:06 np0005532585.localdomain sudo[211897]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayagjemfatstmbjwfjfzlkecoowrowmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890026.1158402-630-233069566537776/AnsiballZ_stat.py
Nov 23 09:27:06 np0005532585.localdomain sudo[211897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:06 np0005532585.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Nov 23 09:27:06 np0005532585.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Consumed 1.004s CPU time.
Nov 23 09:27:06 np0005532585.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 23 09:27:06 np0005532585.localdomain python3.9[211899]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:06 np0005532585.localdomain sudo[211897]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:06 np0005532585.localdomain sudo[211985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khgbdnctpvitphqagzwacjoyxlkmjwjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890026.1158402-630-233069566537776/AnsiballZ_copy.py
Nov 23 09:27:06 np0005532585.localdomain sudo[211985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:07 np0005532585.localdomain python3.9[211987]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890026.1158402-630-233069566537776/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:07 np0005532585.localdomain sudo[211985]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:07 np0005532585.localdomain sudo[212095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgizjphszvfeiavhnexbkzluqhrckgvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890027.4138863-675-100410931356694/AnsiballZ_command.py
Nov 23 09:27:07 np0005532585.localdomain sudo[212095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:07 np0005532585.localdomain python3.9[212097]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:27:07 np0005532585.localdomain sudo[212095]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:08 np0005532585.localdomain sudo[212206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nenkijymtsswbkjsqjxadewsyvjkvocz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890028.0897326-699-199274666970337/AnsiballZ_lineinfile.py
Nov 23 09:27:08 np0005532585.localdomain sudo[212206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:08 np0005532585.localdomain python3.9[212208]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:08 np0005532585.localdomain sudo[212206]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:27:09.236 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:27:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:27:09.237 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:27:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:27:09.238 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:27:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57724 DF PROTO=TCP SPT=55228 DPT=9100 SEQ=2277522031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD46200000000001030307) 
Nov 23 09:27:09 np0005532585.localdomain sudo[212316]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbndrmqzwcoerksxjhksfxhflhuoawhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890028.8445768-723-178749366094821/AnsiballZ_replace.py
Nov 23 09:27:09 np0005532585.localdomain sudo[212316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:09 np0005532585.localdomain python3.9[212318]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:09 np0005532585.localdomain sudo[212316]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:10 np0005532585.localdomain sudo[212426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exgwrukuddcgvmotroqmcnnagtdljxvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890030.3190448-747-80282947335394/AnsiballZ_replace.py
Nov 23 09:27:10 np0005532585.localdomain sudo[212426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:10 np0005532585.localdomain python3.9[212428]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:10 np0005532585.localdomain sudo[212426]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:11 np0005532585.localdomain sudo[212536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqvcmtfekjtkbvuxspudibszrhismicc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890031.166772-774-252206911443340/AnsiballZ_lineinfile.py
Nov 23 09:27:11 np0005532585.localdomain sudo[212536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:27:11 np0005532585.localdomain systemd[1]: tmp-crun.G0QVP0.mount: Deactivated successfully.
Nov 23 09:27:11 np0005532585.localdomain podman[212539]: 2025-11-23 09:27:11.548076989 +0000 UTC m=+0.081987061 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller)
Nov 23 09:27:11 np0005532585.localdomain podman[212539]: 2025-11-23 09:27:11.584397514 +0000 UTC m=+0.118307646 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 09:27:11 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:27:11 np0005532585.localdomain python3.9[212538]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:11 np0005532585.localdomain sudo[212536]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21638 DF PROTO=TCP SPT=39518 DPT=9882 SEQ=4167220684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD50770000000001030307) 
Nov 23 09:27:12 np0005532585.localdomain sudo[212671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujsdgrtzqqmsqrhlgpohrtqquwksebat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890031.7688808-774-133673863573767/AnsiballZ_lineinfile.py
Nov 23 09:27:12 np0005532585.localdomain sudo[212671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:12 np0005532585.localdomain python3.9[212673]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:12 np0005532585.localdomain sudo[212671]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:13 np0005532585.localdomain sudo[212781]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpbxlhhjrgxbxvikvneuypeekwjuuiyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890032.8675907-774-266687016173809/AnsiballZ_lineinfile.py
Nov 23 09:27:13 np0005532585.localdomain sudo[212781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:13 np0005532585.localdomain python3.9[212783]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:13 np0005532585.localdomain sudo[212781]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:13 np0005532585.localdomain sudo[212891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nstufbgbzopuvvcwavpyzwwjzadunwkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890033.473042-774-69862383908097/AnsiballZ_lineinfile.py
Nov 23 09:27:13 np0005532585.localdomain sudo[212891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:13 np0005532585.localdomain python3.9[212893]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:13 np0005532585.localdomain sudo[212891]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:14 np0005532585.localdomain sudo[213001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogxiysjspfcqidyuuksuvoooidonbfva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890034.2375388-861-178002730676414/AnsiballZ_stat.py
Nov 23 09:27:14 np0005532585.localdomain sudo[213001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:27:14 np0005532585.localdomain podman[213004]: 2025-11-23 09:27:14.614761928 +0000 UTC m=+0.081252739 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 23 09:27:14 np0005532585.localdomain podman[213004]: 2025-11-23 09:27:14.64929891 +0000 UTC m=+0.115789751 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:27:14 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:27:14 np0005532585.localdomain python3.9[213003]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:27:14 np0005532585.localdomain sudo[213001]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21640 DF PROTO=TCP SPT=39518 DPT=9882 SEQ=4167220684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD5C600000000001030307) 
Nov 23 09:27:15 np0005532585.localdomain sudo[213131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzizqnzmzljtattpmanxtzmyatrmwrpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890034.9680016-885-259770800689972/AnsiballZ_file.py
Nov 23 09:27:15 np0005532585.localdomain sudo[213131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:15 np0005532585.localdomain python3.9[213133]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:15 np0005532585.localdomain sudo[213131]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:16 np0005532585.localdomain sudo[213241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbelfosjynjlgelqkdfnzetnngfxkfwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890035.8475673-913-212649966819311/AnsiballZ_file.py
Nov 23 09:27:16 np0005532585.localdomain sudo[213241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:16 np0005532585.localdomain python3.9[213243]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:16 np0005532585.localdomain sudo[213241]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:17 np0005532585.localdomain sudo[213351]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycfxphaxoxbpaqkmzvkofnolguhdyxfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890037.0593646-936-248360013039581/AnsiballZ_stat.py
Nov 23 09:27:17 np0005532585.localdomain sudo[213351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:17 np0005532585.localdomain python3.9[213353]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:17 np0005532585.localdomain sudo[213351]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:17 np0005532585.localdomain sudo[213408]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgomnrvinwvespqjnhvtvxhputwaoayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890037.0593646-936-248360013039581/AnsiballZ_file.py
Nov 23 09:27:17 np0005532585.localdomain sudo[213408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:17 np0005532585.localdomain python3.9[213410]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42762 DF PROTO=TCP SPT=55756 DPT=9102 SEQ=2764835931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD68200000000001030307) 
Nov 23 09:27:17 np0005532585.localdomain sudo[213408]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:18 np0005532585.localdomain sudo[213518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srgfzomrtwgapbmdeeivgrkmfumwrqfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890038.0615401-936-130258217935570/AnsiballZ_stat.py
Nov 23 09:27:18 np0005532585.localdomain sudo[213518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:18 np0005532585.localdomain python3.9[213520]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:18 np0005532585.localdomain sudo[213518]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:18 np0005532585.localdomain sudo[213575]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujndxxwulqumijgukvzggrywyndpoqbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890038.0615401-936-130258217935570/AnsiballZ_file.py
Nov 23 09:27:18 np0005532585.localdomain sudo[213575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:18 np0005532585.localdomain python3.9[213577]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:18 np0005532585.localdomain sudo[213575]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:19 np0005532585.localdomain sudo[213685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoxeleudlmirnoztgqiyijzvzruyxqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890039.244184-1006-166621798783924/AnsiballZ_file.py
Nov 23 09:27:19 np0005532585.localdomain sudo[213685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:19 np0005532585.localdomain python3.9[213687]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:19 np0005532585.localdomain sudo[213685]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:20 np0005532585.localdomain sudo[213795]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-legrwmytmrrjcdpsazvwpttnjopnwndt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890039.8808954-1029-180570170819916/AnsiballZ_stat.py
Nov 23 09:27:20 np0005532585.localdomain sudo[213795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:20 np0005532585.localdomain python3.9[213797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:20 np0005532585.localdomain sudo[213795]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:20 np0005532585.localdomain sudo[213852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvbpayqpthcmhyxyxuyamfygpxlhcbyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890039.8808954-1029-180570170819916/AnsiballZ_file.py
Nov 23 09:27:20 np0005532585.localdomain sudo[213852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:20 np0005532585.localdomain python3.9[213854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:20 np0005532585.localdomain sudo[213852]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:21 np0005532585.localdomain sudo[213962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjrbtkaiblxcgmosvygallfckzmfflxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890041.0736053-1066-5590507227171/AnsiballZ_stat.py
Nov 23 09:27:21 np0005532585.localdomain sudo[213962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:21 np0005532585.localdomain python3.9[213964]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:21 np0005532585.localdomain sudo[213962]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:21 np0005532585.localdomain sudo[214019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhycggndviqqjggwzbdgkbglcciseoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890041.0736053-1066-5590507227171/AnsiballZ_file.py
Nov 23 09:27:21 np0005532585.localdomain sudo[214019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:22 np0005532585.localdomain python3.9[214021]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:22 np0005532585.localdomain sudo[214019]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:22 np0005532585.localdomain sudo[214129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krlhlgkmmrzzpobdeyhqwqqpnaqrauep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890042.1989222-1101-270036084742975/AnsiballZ_systemd.py
Nov 23 09:27:22 np0005532585.localdomain sudo[214129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:22 np0005532585.localdomain python3.9[214131]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:27:22 np0005532585.localdomain systemd-rc-local-generator[214153]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:27:22 np0005532585.localdomain systemd-sysv-generator[214160]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:23 np0005532585.localdomain systemd[1]: Starting dnf makecache...
Nov 23 09:27:23 np0005532585.localdomain sudo[214129]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:23 np0005532585.localdomain dnf[214169]: Updating Subscription Management repositories.
Nov 23 09:27:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5839 DF PROTO=TCP SPT=55886 DPT=9102 SEQ=2200684048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD7DA10000000001030307) 
Nov 23 09:27:24 np0005532585.localdomain sudo[214278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewhumuxkfrnwsbmdamyfytxrxetfvlnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890043.3422177-1125-196503966813136/AnsiballZ_stat.py
Nov 23 09:27:24 np0005532585.localdomain sudo[214278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:24 np0005532585.localdomain python3.9[214280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:24 np0005532585.localdomain sudo[214278]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:25 np0005532585.localdomain dnf[214169]: Metadata cache refreshed recently.
Nov 23 09:27:25 np0005532585.localdomain sudo[214335]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlmwklnumccamzegoigbkxmcwsgwezol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890043.3422177-1125-196503966813136/AnsiballZ_file.py
Nov 23 09:27:25 np0005532585.localdomain sudo[214335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:25 np0005532585.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 23 09:27:25 np0005532585.localdomain systemd[1]: Finished dnf makecache.
Nov 23 09:27:25 np0005532585.localdomain systemd[1]: dnf-makecache.service: Consumed 1.990s CPU time.
Nov 23 09:27:25 np0005532585.localdomain python3.9[214337]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:25 np0005532585.localdomain sudo[214335]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:25 np0005532585.localdomain sudo[214445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkkwwjllwpurtsfujfyammrrwflftzbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890045.4808393-1161-1845077500937/AnsiballZ_stat.py
Nov 23 09:27:25 np0005532585.localdomain sudo[214445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:25 np0005532585.localdomain python3.9[214447]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:25 np0005532585.localdomain sudo[214445]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:27 np0005532585.localdomain sudo[214502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yblpntxsdbbpuzuqohzyhkzkipfkxumk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890045.4808393-1161-1845077500937/AnsiballZ_file.py
Nov 23 09:27:27 np0005532585.localdomain sudo[214502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21642 DF PROTO=TCP SPT=39518 DPT=9882 SEQ=4167220684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD8C210000000001030307) 
Nov 23 09:27:27 np0005532585.localdomain python3.9[214504]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:27 np0005532585.localdomain sudo[214502]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:27 np0005532585.localdomain sudo[214612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikdtkwpfbjqkbvsvovdoipmbcccoysre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890047.569524-1197-31080121255251/AnsiballZ_systemd.py
Nov 23 09:27:27 np0005532585.localdomain sudo[214612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:28 np0005532585.localdomain python3.9[214614]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:27:28 np0005532585.localdomain systemd-rc-local-generator[214641]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:27:28 np0005532585.localdomain systemd-sysv-generator[214645]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:27:28 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:27:28 np0005532585.localdomain sudo[214612]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:29 np0005532585.localdomain sudo[214764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnsmjoaufbkaabphavqtqcnuxwxenrxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890049.002511-1227-90911073539254/AnsiballZ_file.py
Nov 23 09:27:29 np0005532585.localdomain sudo[214764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:29 np0005532585.localdomain python3.9[214766]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:29 np0005532585.localdomain sudo[214764]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33997 DF PROTO=TCP SPT=38656 DPT=9101 SEQ=2531834913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD96790000000001030307) 
Nov 23 09:27:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7485 DF PROTO=TCP SPT=35368 DPT=9105 SEQ=1242860296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD972B0000000001030307) 
Nov 23 09:27:29 np0005532585.localdomain sudo[214874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyjtkqkakrucvjanuukszisbjmswvgdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890049.7123399-1251-87371241721467/AnsiballZ_stat.py
Nov 23 09:27:29 np0005532585.localdomain sudo[214874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:30 np0005532585.localdomain python3.9[214876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:30 np0005532585.localdomain sudo[214874]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:30 np0005532585.localdomain sudo[214962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yszrlaxcfrdgahlnwmwfpuzckjxvwoox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890049.7123399-1251-87371241721467/AnsiballZ_copy.py
Nov 23 09:27:30 np0005532585.localdomain sudo[214962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:30 np0005532585.localdomain python3.9[214964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890049.7123399-1251-87371241721467/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:30 np0005532585.localdomain sudo[214962]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:31 np0005532585.localdomain sudo[215072]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqfjtenezoiwpsbkwgmpirkxrxuegkqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890051.2365253-1303-42281437475746/AnsiballZ_file.py
Nov 23 09:27:31 np0005532585.localdomain sudo[215072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:31 np0005532585.localdomain python3.9[215074]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:27:31 np0005532585.localdomain sudo[215072]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:32 np0005532585.localdomain sudo[215182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knwmjsqoigquhopktimseubbwcgsbvua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890051.9704373-1326-157418401051098/AnsiballZ_stat.py
Nov 23 09:27:32 np0005532585.localdomain sudo[215182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:32 np0005532585.localdomain python3.9[215184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:32 np0005532585.localdomain sudo[215182]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:32 np0005532585.localdomain sudo[215270]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlejfuffvfeqqinnqqpdddjuwtlgjybx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890051.9704373-1326-157418401051098/AnsiballZ_copy.py
Nov 23 09:27:32 np0005532585.localdomain sudo[215270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33999 DF PROTO=TCP SPT=38656 DPT=9101 SEQ=2531834913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDA2A00000000001030307) 
Nov 23 09:27:33 np0005532585.localdomain python3.9[215272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890051.9704373-1326-157418401051098/.source.json _original_basename=.vusu7ets follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:33 np0005532585.localdomain sudo[215270]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:33 np0005532585.localdomain sudo[215380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elabgfmlkkaztcrhtnjjxlrqwjqmzkdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890053.2296932-1371-220348442972246/AnsiballZ_file.py
Nov 23 09:27:33 np0005532585.localdomain sudo[215380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:33 np0005532585.localdomain python3.9[215382]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:33 np0005532585.localdomain sudo[215380]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:34 np0005532585.localdomain sudo[215490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctdhkjyhpjlcxyoiehufbrzcdriijdyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890054.0021996-1395-149447797317188/AnsiballZ_stat.py
Nov 23 09:27:34 np0005532585.localdomain sudo[215490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:34 np0005532585.localdomain sudo[215490]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:34 np0005532585.localdomain sudo[215578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbceozalgjtivlkzacwnligtvajnzkye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890054.0021996-1395-149447797317188/AnsiballZ_copy.py
Nov 23 09:27:34 np0005532585.localdomain sudo[215578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:35 np0005532585.localdomain sudo[215578]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:35 np0005532585.localdomain sudo[215688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veesafqfqcxadjiebronwajdtedygiwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890055.4851384-1446-139486707589447/AnsiballZ_container_config_data.py
Nov 23 09:27:35 np0005532585.localdomain sudo[215688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:36 np0005532585.localdomain python3.9[215690]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 23 09:27:36 np0005532585.localdomain sudo[215688]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53970 DF PROTO=TCP SPT=34390 DPT=9100 SEQ=2157483921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDB0200000000001030307) 
Nov 23 09:27:36 np0005532585.localdomain sudo[215798]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twvmacuaxoopjrvxjjkgcqyhvafifwok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890056.4148157-1473-32108809688411/AnsiballZ_container_config_hash.py
Nov 23 09:27:36 np0005532585.localdomain sudo[215798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:37 np0005532585.localdomain python3.9[215800]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:27:37 np0005532585.localdomain sudo[215798]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:38 np0005532585.localdomain sudo[215908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hoffxheakywkosjeceravmkyzuqbnnmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890057.999107-1500-213536472458757/AnsiballZ_podman_container_info.py
Nov 23 09:27:38 np0005532585.localdomain sudo[215908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:38 np0005532585.localdomain python3.9[215910]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 09:27:39 np0005532585.localdomain sudo[215908]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33817 DF PROTO=TCP SPT=50194 DPT=9100 SEQ=1007322871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDBB600000000001030307) 
Nov 23 09:27:39 np0005532585.localdomain sudo[215955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:27:39 np0005532585.localdomain sudo[215955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:27:39 np0005532585.localdomain sudo[215955]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:39 np0005532585.localdomain sudo[215973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:27:39 np0005532585.localdomain sudo[215973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:27:40 np0005532585.localdomain podman[216065]: 2025-11-23 09:27:40.506164507 +0000 UTC m=+0.095234557 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main)
Nov 23 09:27:40 np0005532585.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 23 09:27:40 np0005532585.localdomain podman[216065]: 2025-11-23 09:27:40.621338548 +0000 UTC m=+0.210408588 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Nov 23 09:27:40 np0005532585.localdomain sudo[215973]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:41 np0005532585.localdomain sudo[216134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:27:41 np0005532585.localdomain sudo[216134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:27:41 np0005532585.localdomain sudo[216134]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:41 np0005532585.localdomain sudo[216152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:27:41 np0005532585.localdomain sudo[216152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:27:41 np0005532585.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 23 09:27:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:27:41 np0005532585.localdomain sudo[216152]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:41 np0005532585.localdomain podman[216197]: 2025-11-23 09:27:41.820166251 +0000 UTC m=+0.095788964 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:27:41 np0005532585.localdomain podman[216197]: 2025-11-23 09:27:41.862032353 +0000 UTC m=+0.137655125 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:27:41 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:27:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10891 DF PROTO=TCP SPT=55340 DPT=9882 SEQ=773630041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDC5A70000000001030307) 
Nov 23 09:27:42 np0005532585.localdomain sudo[216229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:27:42 np0005532585.localdomain sudo[216229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:27:42 np0005532585.localdomain sudo[216229]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:42 np0005532585.localdomain sudo[216337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypkgkmbqwabnkwkpknoytsslvztlobim ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890062.196525-1539-13855601973073/AnsiballZ_edpm_container_manage.py
Nov 23 09:27:42 np0005532585.localdomain sudo[216337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:42 np0005532585.localdomain python3[216339]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:27:44 np0005532585.localdomain podman[216352]: 2025-11-23 09:27:43.134101664 +0000 UTC m=+0.048400747 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 23 09:27:44 np0005532585.localdomain podman[216401]: 
Nov 23 09:27:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:27:44 np0005532585.localdomain podman[216401]: 2025-11-23 09:27:44.946131531 +0000 UTC m=+0.087433193 container create 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:27:44 np0005532585.localdomain podman[216401]: 2025-11-23 09:27:44.904692953 +0000 UTC m=+0.045994675 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 23 09:27:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10893 DF PROTO=TCP SPT=55340 DPT=9882 SEQ=773630041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDD1A00000000001030307) 
Nov 23 09:27:44 np0005532585.localdomain python3[216339]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 23 09:27:45 np0005532585.localdomain podman[216414]: 2025-11-23 09:27:45.049383517 +0000 UTC m=+0.095268658 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:27:45 np0005532585.localdomain podman[216414]: 2025-11-23 09:27:45.086267069 +0000 UTC m=+0.132152230 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 23 09:27:45 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:27:45 np0005532585.localdomain sudo[216337]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:45 np0005532585.localdomain sudo[216562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jenguephomxjbwnellzzfjytnvbjuspf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890065.5226557-1563-249438255190502/AnsiballZ_stat.py
Nov 23 09:27:45 np0005532585.localdomain sudo[216562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:45 np0005532585.localdomain python3.9[216564]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:27:46 np0005532585.localdomain sudo[216562]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:46 np0005532585.localdomain sudo[216674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvoaxumbdtmmamoqmpiwqpsebfeesgmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890066.4518325-1590-42423769936454/AnsiballZ_file.py
Nov 23 09:27:46 np0005532585.localdomain sudo[216674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:46 np0005532585.localdomain python3.9[216676]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:46 np0005532585.localdomain sudo[216674]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:47 np0005532585.localdomain sudo[216729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iebdlmnrjmlsyxiuxfvianujyjbthoxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890066.4518325-1590-42423769936454/AnsiballZ_stat.py
Nov 23 09:27:47 np0005532585.localdomain sudo[216729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:47 np0005532585.localdomain python3.9[216731]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:27:47 np0005532585.localdomain sudo[216729]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:47 np0005532585.localdomain sudo[216838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpeevcyykdnglnrjovviznxrgabqbdkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890067.4072886-1590-251256976916081/AnsiballZ_copy.py
Nov 23 09:27:47 np0005532585.localdomain sudo[216838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:48 np0005532585.localdomain python3.9[216840]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890067.4072886-1590-251256976916081/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:48 np0005532585.localdomain sudo[216838]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5841 DF PROTO=TCP SPT=55886 DPT=9102 SEQ=2200684048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDDE200000000001030307) 
Nov 23 09:27:48 np0005532585.localdomain sudo[216893]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuvysxwrsjgygsjdkwaukhizgorhnqev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890067.4072886-1590-251256976916081/AnsiballZ_systemd.py
Nov 23 09:27:48 np0005532585.localdomain sudo[216893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:48 np0005532585.localdomain python3.9[216895]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:27:48 np0005532585.localdomain systemd-sysv-generator[216923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:27:48 np0005532585.localdomain systemd-rc-local-generator[216918]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:48 np0005532585.localdomain sudo[216893]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:49 np0005532585.localdomain sudo[216984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-errgxcjupjnifjxzblphblhsjipgxgwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890067.4072886-1590-251256976916081/AnsiballZ_systemd.py
Nov 23 09:27:49 np0005532585.localdomain sudo[216984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:49 np0005532585.localdomain python3.9[216986]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:27:49 np0005532585.localdomain systemd-sysv-generator[217018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:27:49 np0005532585.localdomain systemd-rc-local-generator[217013]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:27:50 np0005532585.localdomain systemd[1]: Starting multipathd container...
Nov 23 09:27:50 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:27:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 09:27:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 09:27:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:27:50 np0005532585.localdomain podman[217027]: 2025-11-23 09:27:50.19473036 +0000 UTC m=+0.164751265 container init 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + sudo -E kolla_set_configs
Nov 23 09:27:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:27:50 np0005532585.localdomain sudo[217046]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 09:27:50 np0005532585.localdomain podman[217027]: 2025-11-23 09:27:50.229534169 +0000 UTC m=+0.199555024 container start 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 09:27:50 np0005532585.localdomain podman[217027]: multipathd
Nov 23 09:27:50 np0005532585.localdomain sudo[217046]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:27:50 np0005532585.localdomain sudo[217046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 09:27:50 np0005532585.localdomain systemd[1]: Started multipathd container.
Nov 23 09:27:50 np0005532585.localdomain sudo[216984]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: INFO:__main__:Validating config file
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: INFO:__main__:Writing out command to execute
Nov 23 09:27:50 np0005532585.localdomain sudo[217046]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: ++ cat /run_command
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + CMD='/usr/sbin/multipathd -d'
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + ARGS=
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + sudo kolla_copy_cacerts
Nov 23 09:27:50 np0005532585.localdomain sudo[217063]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 23 09:27:50 np0005532585.localdomain sudo[217063]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:27:50 np0005532585.localdomain sudo[217063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 09:27:50 np0005532585.localdomain sudo[217063]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + [[ ! -n '' ]]
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + . kolla_extend_start
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: Running command: '/usr/sbin/multipathd -d'
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + umask 0022
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: + exec /usr/sbin/multipathd -d
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: 10013.504916 | --------start up--------
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: 10013.504933 | read /etc/multipath.conf
Nov 23 09:27:50 np0005532585.localdomain podman[217048]: 2025-11-23 09:27:50.327120625 +0000 UTC m=+0.091617869 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:27:50 np0005532585.localdomain multipathd[217040]: 10013.508976 | path checkers start up
Nov 23 09:27:50 np0005532585.localdomain podman[217048]: 2025-11-23 09:27:50.346206636 +0000 UTC m=+0.110703900 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:27:50 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:27:51 np0005532585.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 09:27:52 np0005532585.localdomain python3.9[217187]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:27:52 np0005532585.localdomain sudo[217297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggekrthgpchqifxkuebjtdocudcslvib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890072.6552799-1698-277363553211301/AnsiballZ_command.py
Nov 23 09:27:52 np0005532585.localdomain sudo[217297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:53 np0005532585.localdomain python3.9[217299]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:27:53 np0005532585.localdomain sudo[217297]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9612 DF PROTO=TCP SPT=42272 DPT=9102 SEQ=4110621087 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDF2A00000000001030307) 
Nov 23 09:27:53 np0005532585.localdomain sudo[217420]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twokwprwbsofahofzqoftxsgidpsaksc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890073.6829786-1722-86480542100296/AnsiballZ_systemd.py
Nov 23 09:27:53 np0005532585.localdomain sudo[217420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:54 np0005532585.localdomain python3.9[217422]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Stopping multipathd container...
Nov 23 09:27:54 np0005532585.localdomain multipathd[217040]: 10017.578704 | exit (signal)
Nov 23 09:27:54 np0005532585.localdomain multipathd[217040]: 10017.579219 | --------shut down-------
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: libpod-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.scope: Deactivated successfully.
Nov 23 09:27:54 np0005532585.localdomain podman[217426]: 2025-11-23 09:27:54.43208744 +0000 UTC m=+0.097408261 container died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.timer: Deactivated successfully.
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8-userdata-shm.mount: Deactivated successfully.
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b-merged.mount: Deactivated successfully.
Nov 23 09:27:54 np0005532585.localdomain podman[217426]: 2025-11-23 09:27:54.637755546 +0000 UTC m=+0.303076297 container cleanup 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:27:54 np0005532585.localdomain podman[217426]: multipathd
Nov 23 09:27:54 np0005532585.localdomain podman[217452]: 2025-11-23 09:27:54.734049363 +0000 UTC m=+0.064421596 container cleanup 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:27:54 np0005532585.localdomain podman[217452]: multipathd
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Stopped multipathd container.
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Starting multipathd container...
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:27:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 09:27:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:27:54 np0005532585.localdomain podman[217464]: 2025-11-23 09:27:54.891633392 +0000 UTC m=+0.130277644 container init 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: + sudo -E kolla_set_configs
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:27:54 np0005532585.localdomain sudo[217485]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 09:27:54 np0005532585.localdomain sudo[217485]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:27:54 np0005532585.localdomain sudo[217485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 09:27:54 np0005532585.localdomain podman[217464]: 2025-11-23 09:27:54.926963348 +0000 UTC m=+0.165607550 container start 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 23 09:27:54 np0005532585.localdomain podman[217464]: multipathd
Nov 23 09:27:54 np0005532585.localdomain systemd[1]: Started multipathd container.
Nov 23 09:27:54 np0005532585.localdomain sudo[217420]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: INFO:__main__:Validating config file
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: INFO:__main__:Writing out command to execute
Nov 23 09:27:54 np0005532585.localdomain sudo[217485]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: ++ cat /run_command
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: + CMD='/usr/sbin/multipathd -d'
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: + ARGS=
Nov 23 09:27:54 np0005532585.localdomain multipathd[217479]: + sudo kolla_copy_cacerts
Nov 23 09:27:55 np0005532585.localdomain sudo[217501]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 23 09:27:55 np0005532585.localdomain sudo[217501]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:27:55 np0005532585.localdomain sudo[217501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 23 09:27:55 np0005532585.localdomain sudo[217501]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: + [[ ! -n '' ]]
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: + . kolla_extend_start
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: Running command: '/usr/sbin/multipathd -d'
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: + umask 0022
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: + exec /usr/sbin/multipathd -d
Nov 23 09:27:55 np0005532585.localdomain podman[217487]: 2025-11-23 09:27:55.017665038 +0000 UTC m=+0.083833016 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: 10018.196342 | --------start up--------
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: 10018.196359 | read /etc/multipath.conf
Nov 23 09:27:55 np0005532585.localdomain multipathd[217479]: 10018.200030 | path checkers start up
Nov 23 09:27:55 np0005532585.localdomain podman[217487]: 2025-11-23 09:27:55.028577944 +0000 UTC m=+0.094745922 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 23 09:27:55 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:27:56 np0005532585.localdomain sudo[217625]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgxxhzlaadthckytfktfusxfohfvftzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890076.0138087-1746-113507313217870/AnsiballZ_file.py
Nov 23 09:27:56 np0005532585.localdomain sudo[217625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:56 np0005532585.localdomain python3.9[217627]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:56 np0005532585.localdomain sudo[217625]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:57 np0005532585.localdomain sudo[217735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrrnhmioyszjefyzgyppjdaqknbgveih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890077.0057404-1783-240461529225521/AnsiballZ_file.py
Nov 23 09:27:57 np0005532585.localdomain sudo[217735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10895 DF PROTO=TCP SPT=55340 DPT=9882 SEQ=773630041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE02200000000001030307) 
Nov 23 09:27:57 np0005532585.localdomain python3.9[217737]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 09:27:57 np0005532585.localdomain sudo[217735]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:57 np0005532585.localdomain sudo[217845]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ychfgpktgqrpeyjkgxzezsmvrycevyxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890077.7210839-1806-177230581729084/AnsiballZ_modprobe.py
Nov 23 09:27:58 np0005532585.localdomain sudo[217845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:58 np0005532585.localdomain python3.9[217847]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 23 09:27:58 np0005532585.localdomain sudo[217845]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:58 np0005532585.localdomain sudo[217963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zstatlruuldysevbzmegxizphxdhdaqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890078.4519227-1830-235798883936037/AnsiballZ_stat.py
Nov 23 09:27:58 np0005532585.localdomain sudo[217963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:58 np0005532585.localdomain python3.9[217965]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:27:58 np0005532585.localdomain sudo[217963]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:59 np0005532585.localdomain sudo[218051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftslozkdwznzxqbihpnvdwtkgpecbonl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890078.4519227-1830-235798883936037/AnsiballZ_copy.py
Nov 23 09:27:59 np0005532585.localdomain sudo[218051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:27:59 np0005532585.localdomain python3.9[218053]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890078.4519227-1830-235798883936037/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:27:59 np0005532585.localdomain sudo[218051]: pam_unix(sudo:session): session closed for user root
Nov 23 09:27:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2307 DF PROTO=TCP SPT=33386 DPT=9101 SEQ=2251829978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE0BAA0000000001030307) 
Nov 23 09:27:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30949 DF PROTO=TCP SPT=58386 DPT=9105 SEQ=1378184660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE0C5D0000000001030307) 
Nov 23 09:28:00 np0005532585.localdomain sudo[218161]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcadezaqpxkquoonaldqtpingjccacow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890079.8780186-1878-171273939952761/AnsiballZ_lineinfile.py
Nov 23 09:28:00 np0005532585.localdomain sudo[218161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:00 np0005532585.localdomain python3.9[218163]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:00 np0005532585.localdomain sudo[218161]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:01 np0005532585.localdomain sudo[218271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkkkgfsmxgdcxmywldnwzwlbnihiveri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890080.7551444-1902-22669775564699/AnsiballZ_systemd.py
Nov 23 09:28:01 np0005532585.localdomain sudo[218271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:01 np0005532585.localdomain python3.9[218273]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:28:01 np0005532585.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 09:28:01 np0005532585.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 23 09:28:01 np0005532585.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 23 09:28:01 np0005532585.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 23 09:28:01 np0005532585.localdomain systemd-modules-load[218277]: Module 'msr' is built in
Nov 23 09:28:01 np0005532585.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 23 09:28:01 np0005532585.localdomain sudo[218271]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:02 np0005532585.localdomain sudo[218385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekvlibjfpojzftjluzsyhhsslaqkkipb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890082.2036808-1926-240515265209377/AnsiballZ_dnf.py
Nov 23 09:28:02 np0005532585.localdomain sudo[218385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:02 np0005532585.localdomain python3.9[218387]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:28:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30951 DF PROTO=TCP SPT=58386 DPT=9105 SEQ=1378184660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE18600000000001030307) 
Nov 23 09:28:04 np0005532585.localdomain sshd[218390]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:28:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57727 DF PROTO=TCP SPT=55228 DPT=9100 SEQ=2277522031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE24200000000001030307) 
Nov 23 09:28:06 np0005532585.localdomain sshd[218390]: Invalid user user from 185.156.73.233 port 38556
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:28:06 np0005532585.localdomain sshd[218390]: Connection closed by invalid user user 185.156.73.233 port 38556 [preauth]
Nov 23 09:28:06 np0005532585.localdomain systemd-rc-local-generator[218423]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:28:06 np0005532585.localdomain systemd-sysv-generator[218427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:28:06 np0005532585.localdomain systemd-rc-local-generator[218459]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:28:06 np0005532585.localdomain systemd-sysv-generator[218464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:06 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 23 09:28:07 np0005532585.localdomain systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 23 09:28:07 np0005532585.localdomain lvm[218510]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 09:28:07 np0005532585.localdomain lvm[218510]: VG ceph_vg1 finished
Nov 23 09:28:07 np0005532585.localdomain lvm[218509]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 09:28:07 np0005532585.localdomain lvm[218509]: VG ceph_vg0 finished
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:28:07 np0005532585.localdomain systemd-sysv-generator[218567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:28:07 np0005532585.localdomain systemd-rc-local-generator[218562]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:07 np0005532585.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 09:28:08 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 09:28:08 np0005532585.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 23 09:28:08 np0005532585.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.270s CPU time.
Nov 23 09:28:08 np0005532585.localdomain systemd[1]: run-r03827389933a4623bedccab4fa957ae7.service: Deactivated successfully.
Nov 23 09:28:08 np0005532585.localdomain sudo[218385]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:28:09.237 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:28:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:28:09.238 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:28:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:28:09.240 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:28:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43973 DF PROTO=TCP SPT=50680 DPT=9100 SEQ=2399587717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE30A00000000001030307) 
Nov 23 09:28:09 np0005532585.localdomain python3.9[219806]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:28:10 np0005532585.localdomain sudo[219918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rguhxjsdqkvmtdrpbgeffuwpperzokyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890090.3170302-1978-255461204870614/AnsiballZ_file.py
Nov 23 09:28:10 np0005532585.localdomain sudo[219918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:10 np0005532585.localdomain python3.9[219920]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:10 np0005532585.localdomain sudo[219918]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:11 np0005532585.localdomain sudo[220028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewpsdkrkokfbsjsdaozdnqjthxtlnrny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890091.2650692-2011-161213176429732/AnsiballZ_systemd_service.py
Nov 23 09:28:11 np0005532585.localdomain sudo[220028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:11 np0005532585.localdomain python3.9[220030]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:28:11 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:28:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42499 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE3AD60000000001030307) 
Nov 23 09:28:11 np0005532585.localdomain systemd-sysv-generator[220061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:28:11 np0005532585.localdomain systemd-rc-local-generator[220053]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:28:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:11 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:28:12 np0005532585.localdomain sudo[220028]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:12 np0005532585.localdomain podman[220067]: 2025-11-23 09:28:12.246977305 +0000 UTC m=+0.083405560 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:28:12 np0005532585.localdomain podman[220067]: 2025-11-23 09:28:12.284267357 +0000 UTC m=+0.120695652 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:28:12 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:28:13 np0005532585.localdomain python3.9[220199]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:28:13 np0005532585.localdomain network[220216]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:28:13 np0005532585.localdomain network[220217]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:28:13 np0005532585.localdomain network[220218]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:28:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42501 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE46E00000000001030307) 
Nov 23 09:28:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:28:15 np0005532585.localdomain podman[220246]: 2025-11-23 09:28:15.213862017 +0000 UTC m=+0.079571384 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:28:15 np0005532585.localdomain podman[220246]: 2025-11-23 09:28:15.222477025 +0000 UTC m=+0.088186402 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:28:15 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:28:15 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:28:17 np0005532585.localdomain sudo[220469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkkephardfvjjdvuxwzzbvpkhfdyhulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890097.5040665-2068-121367352658958/AnsiballZ_systemd_service.py
Nov 23 09:28:17 np0005532585.localdomain sudo[220469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:18 np0005532585.localdomain python3.9[220471]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:18 np0005532585.localdomain sudo[220469]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:18 np0005532585.localdomain sudo[220580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otxkklyalnwiigbxilsngdaqjickhemc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890098.2515202-2068-267712516111462/AnsiballZ_systemd_service.py
Nov 23 09:28:18 np0005532585.localdomain sudo[220580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:18 np0005532585.localdomain python3.9[220582]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:18 np0005532585.localdomain sudo[220580]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42502 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE56A00000000001030307) 
Nov 23 09:28:19 np0005532585.localdomain sudo[220691]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npfxyojefhcjzpbnjmszpuhxplsrckzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890099.0147293-2068-268532204291432/AnsiballZ_systemd_service.py
Nov 23 09:28:19 np0005532585.localdomain sudo[220691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:19 np0005532585.localdomain python3.9[220693]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:19 np0005532585.localdomain sudo[220691]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:20 np0005532585.localdomain sudo[220802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgecgfdpidjcdpmirqikuvkkcqdnxtcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890099.7914858-2068-166959549482153/AnsiballZ_systemd_service.py
Nov 23 09:28:20 np0005532585.localdomain sudo[220802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:20 np0005532585.localdomain python3.9[220804]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:21 np0005532585.localdomain sudo[220802]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:21 np0005532585.localdomain sudo[220913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vutkrwrkryqxaoupidgglqzuqguesrer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890101.5594578-2068-63356460513463/AnsiballZ_systemd_service.py
Nov 23 09:28:21 np0005532585.localdomain sudo[220913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:22 np0005532585.localdomain python3.9[220915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:22 np0005532585.localdomain sudo[220913]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:22 np0005532585.localdomain sudo[221024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnqdfgqwwtymqbogldypwtplmxbnthqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890102.3166595-2068-141765292225896/AnsiballZ_systemd_service.py
Nov 23 09:28:22 np0005532585.localdomain sudo[221024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:22 np0005532585.localdomain python3.9[221026]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:22 np0005532585.localdomain sudo[221024]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:23 np0005532585.localdomain sudo[221135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysinpowmbnbiqtowjaebvsejjgqjmsai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890103.0513155-2068-105191200430679/AnsiballZ_systemd_service.py
Nov 23 09:28:23 np0005532585.localdomain sudo[221135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16865 DF PROTO=TCP SPT=51904 DPT=9102 SEQ=718674353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE67E00000000001030307) 
Nov 23 09:28:23 np0005532585.localdomain python3.9[221137]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:23 np0005532585.localdomain sudo[221135]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:24 np0005532585.localdomain sudo[221246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnmgtheckaitziqcgrpnlesdwsfirxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890103.7836268-2068-37503244786397/AnsiballZ_systemd_service.py
Nov 23 09:28:24 np0005532585.localdomain sudo[221246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:24 np0005532585.localdomain python3.9[221248]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:28:24 np0005532585.localdomain sudo[221246]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:28:26 np0005532585.localdomain podman[221267]: 2025-11-23 09:28:26.026126209 +0000 UTC m=+0.082504464 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:28:26 np0005532585.localdomain podman[221267]: 2025-11-23 09:28:26.037759631 +0000 UTC m=+0.094137896 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:28:26 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:28:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42503 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE76200000000001030307) 
Nov 23 09:28:28 np0005532585.localdomain sudo[221375]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgxpfvjwfdconhzuvjwskjpaedmyxmbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890107.9437845-2245-34865764225591/AnsiballZ_file.py
Nov 23 09:28:28 np0005532585.localdomain sudo[221375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:28 np0005532585.localdomain python3.9[221377]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:28 np0005532585.localdomain sudo[221375]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:28 np0005532585.localdomain sudo[221485]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urrdghihzkcdeiwlbtsbdnwdlgvjqkln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890108.51332-2245-175261441096859/AnsiballZ_file.py
Nov 23 09:28:28 np0005532585.localdomain sudo[221485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:28 np0005532585.localdomain python3.9[221487]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:29 np0005532585.localdomain sudo[221485]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:29 np0005532585.localdomain sudo[221595]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zquwttnxhmwzfqbopiypxaemwcbpqvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890109.1181748-2245-271885811030030/AnsiballZ_file.py
Nov 23 09:28:29 np0005532585.localdomain sudo[221595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:29 np0005532585.localdomain python3.9[221597]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:29 np0005532585.localdomain sudo[221595]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3102 DF PROTO=TCP SPT=47166 DPT=9101 SEQ=769331823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE80DA0000000001030307) 
Nov 23 09:28:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37647 DF PROTO=TCP SPT=59014 DPT=9105 SEQ=901764965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE818B0000000001030307) 
Nov 23 09:28:29 np0005532585.localdomain sudo[221705]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obcivvxntruxaafhorgsxooykwktiuui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890109.7373316-2245-36766927849332/AnsiballZ_file.py
Nov 23 09:28:29 np0005532585.localdomain sudo[221705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:30 np0005532585.localdomain python3.9[221707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:30 np0005532585.localdomain sudo[221705]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:30 np0005532585.localdomain sudo[221815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elysrorfvvyfgxpbwenlphzqcvuamikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890110.358816-2245-95945332794542/AnsiballZ_file.py
Nov 23 09:28:30 np0005532585.localdomain sudo[221815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:30 np0005532585.localdomain python3.9[221817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:30 np0005532585.localdomain sudo[221815]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:31 np0005532585.localdomain sudo[221925]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-liklnabwpboxrebrrbfvypfazdnqyink ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890110.9432266-2245-172294484768714/AnsiballZ_file.py
Nov 23 09:28:31 np0005532585.localdomain sudo[221925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:31 np0005532585.localdomain python3.9[221927]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:31 np0005532585.localdomain sudo[221925]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:31 np0005532585.localdomain sudo[222035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esosrwbvnqelcimafmfxtzvatuojhgcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890111.53091-2245-188311629379398/AnsiballZ_file.py
Nov 23 09:28:31 np0005532585.localdomain sudo[222035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:32 np0005532585.localdomain python3.9[222037]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:32 np0005532585.localdomain sudo[222035]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:32 np0005532585.localdomain sudo[222145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyrcnyrsqqgpelifokfaarokdeuwbifb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890112.1188893-2245-214969449881660/AnsiballZ_file.py
Nov 23 09:28:32 np0005532585.localdomain sudo[222145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:32 np0005532585.localdomain python3.9[222147]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:32 np0005532585.localdomain sudo[222145]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3104 DF PROTO=TCP SPT=47166 DPT=9101 SEQ=769331823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE8CE00000000001030307) 
Nov 23 09:28:33 np0005532585.localdomain sudo[222255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tijseftinpeftjuxvpbbsivekmjnjkqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890112.9260218-2417-60799925951201/AnsiballZ_file.py
Nov 23 09:28:33 np0005532585.localdomain sudo[222255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:33 np0005532585.localdomain python3.9[222257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:33 np0005532585.localdomain sudo[222255]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:33 np0005532585.localdomain sudo[222365]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdcaeaofktrvgeuxtiuajgfutkrmifoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890113.4924998-2417-56123444580029/AnsiballZ_file.py
Nov 23 09:28:33 np0005532585.localdomain sudo[222365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:33 np0005532585.localdomain python3.9[222367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:33 np0005532585.localdomain sudo[222365]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:34 np0005532585.localdomain sudo[222475]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqutlnrglrghyymfiyyjthykggdjkgym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890114.074287-2417-215512892352228/AnsiballZ_file.py
Nov 23 09:28:34 np0005532585.localdomain sudo[222475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:34 np0005532585.localdomain python3.9[222477]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:34 np0005532585.localdomain sudo[222475]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:34 np0005532585.localdomain sudo[222585]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvwjasvpomzpvsqutngdooufosxoynst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890114.6427689-2417-198538752668657/AnsiballZ_file.py
Nov 23 09:28:34 np0005532585.localdomain sudo[222585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:35 np0005532585.localdomain python3.9[222587]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:35 np0005532585.localdomain sudo[222585]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:35 np0005532585.localdomain sudo[222695]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmicobfyrnjruqofmjpynqfmyayzvvps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890115.2261367-2417-251915118085950/AnsiballZ_file.py
Nov 23 09:28:35 np0005532585.localdomain sudo[222695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:35 np0005532585.localdomain python3.9[222697]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:35 np0005532585.localdomain sudo[222695]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:36 np0005532585.localdomain sudo[222805]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaschfvkknrqmzpuhqpjzsflryxbdkyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890115.798658-2417-23860285481252/AnsiballZ_file.py
Nov 23 09:28:36 np0005532585.localdomain sudo[222805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:36 np0005532585.localdomain python3.9[222807]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:36 np0005532585.localdomain sudo[222805]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33820 DF PROTO=TCP SPT=50194 DPT=9100 SEQ=1007322871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE9A200000000001030307) 
Nov 23 09:28:36 np0005532585.localdomain sudo[222915]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtkowqnscanknfhoyxzgfvwtxugegzrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890116.3604228-2417-149579997182791/AnsiballZ_file.py
Nov 23 09:28:36 np0005532585.localdomain sudo[222915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:36 np0005532585.localdomain python3.9[222917]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:36 np0005532585.localdomain sudo[222915]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:37 np0005532585.localdomain sudo[223025]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fupxxquqkvcpnxwwzapeasxsbnweqalp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890117.254157-2417-68714278980721/AnsiballZ_file.py
Nov 23 09:28:37 np0005532585.localdomain sudo[223025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:37 np0005532585.localdomain python3.9[223027]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:28:37 np0005532585.localdomain sudo[223025]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20598 DF PROTO=TCP SPT=45392 DPT=9100 SEQ=4141493920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEA5E00000000001030307) 
Nov 23 09:28:40 np0005532585.localdomain sudo[223135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evlhifyzwapxwxqabefgivzaqaugicrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890119.8188593-2590-210387044588909/AnsiballZ_command.py
Nov 23 09:28:40 np0005532585.localdomain sudo[223135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:40 np0005532585.localdomain python3.9[223137]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:40 np0005532585.localdomain sudo[223135]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:41 np0005532585.localdomain python3.9[223247]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:28:41 np0005532585.localdomain sudo[223355]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfkcqdjrakftfusdnasdsbaxsvotbqgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890121.498111-2644-125356782402705/AnsiballZ_systemd_service.py
Nov 23 09:28:41 np0005532585.localdomain sudo[223355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19232 DF PROTO=TCP SPT=43124 DPT=9882 SEQ=1417323512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEB0070000000001030307) 
Nov 23 09:28:42 np0005532585.localdomain python3.9[223357]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:28:42 np0005532585.localdomain systemd-rc-local-generator[223378]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:28:42 np0005532585.localdomain systemd-sysv-generator[223385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:28:42 np0005532585.localdomain sudo[223355]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:28:42 np0005532585.localdomain sudo[223393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:28:42 np0005532585.localdomain sudo[223393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:28:42 np0005532585.localdomain sudo[223393]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: tmp-crun.6kHDsX.mount: Deactivated successfully.
Nov 23 09:28:42 np0005532585.localdomain podman[223410]: 2025-11-23 09:28:42.504335812 +0000 UTC m=+0.092688307 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:28:42 np0005532585.localdomain sudo[223435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:28:42 np0005532585.localdomain sudo[223435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:28:42 np0005532585.localdomain podman[223410]: 2025-11-23 09:28:42.547330112 +0000 UTC m=+0.135682657 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:28:42 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:28:43 np0005532585.localdomain sudo[223580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztsmadtsmydfmryzvcoawwbndwukvrvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890122.6992083-2668-89137491571552/AnsiballZ_command.py
Nov 23 09:28:43 np0005532585.localdomain sudo[223580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:43 np0005532585.localdomain sudo[223435]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:43 np0005532585.localdomain python3.9[223582]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:43 np0005532585.localdomain sudo[223580]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:43 np0005532585.localdomain sudo[223705]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyeyweyzatjivgwgaxevczhpywpxblcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890123.4049203-2668-258863590965623/AnsiballZ_command.py
Nov 23 09:28:43 np0005532585.localdomain sudo[223705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:43 np0005532585.localdomain python3.9[223707]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:43 np0005532585.localdomain sudo[223708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:28:43 np0005532585.localdomain sudo[223708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:28:43 np0005532585.localdomain sudo[223708]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:43 np0005532585.localdomain sudo[223705]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:44 np0005532585.localdomain sudo[223834]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xspeheftzldwkehglgjaxjbuadjhrfjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890123.9648492-2668-237114025439373/AnsiballZ_command.py
Nov 23 09:28:44 np0005532585.localdomain sudo[223834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:44 np0005532585.localdomain python3.9[223836]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:44 np0005532585.localdomain sudo[223834]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:44 np0005532585.localdomain sudo[223945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdbhrkceitzvbrekiotavcwwdjmrqxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890124.5012152-2668-217480736982235/AnsiballZ_command.py
Nov 23 09:28:44 np0005532585.localdomain sudo[223945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:44 np0005532585.localdomain python3.9[223947]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3106 DF PROTO=TCP SPT=47166 DPT=9101 SEQ=769331823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEBC200000000001030307) 
Nov 23 09:28:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:28:46 np0005532585.localdomain sudo[223945]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:46 np0005532585.localdomain podman[223949]: 2025-11-23 09:28:46.033121225 +0000 UTC m=+0.087739279 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:28:46 np0005532585.localdomain podman[223949]: 2025-11-23 09:28:46.063023403 +0000 UTC m=+0.117641477 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 09:28:46 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:28:46 np0005532585.localdomain sudo[224074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgvraouwjtxokujkxqdqmmvptiwusnes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890126.1107423-2668-242954457547711/AnsiballZ_command.py
Nov 23 09:28:46 np0005532585.localdomain sudo[224074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:46 np0005532585.localdomain python3.9[224076]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:46 np0005532585.localdomain sudo[224074]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:46 np0005532585.localdomain sudo[224185]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkpheusbuhfftogdssrkweljlydkeffy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890126.7275965-2668-71087483918564/AnsiballZ_command.py
Nov 23 09:28:46 np0005532585.localdomain sudo[224185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:47 np0005532585.localdomain python3.9[224187]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:47 np0005532585.localdomain sudo[224185]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:47 np0005532585.localdomain sudo[224296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjwjhbjuwjvddwkuipltclovndyfcqna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890127.3199415-2668-32661609480240/AnsiballZ_command.py
Nov 23 09:28:47 np0005532585.localdomain sudo[224296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:48 np0005532585.localdomain python3.9[224298]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:48 np0005532585.localdomain sudo[224296]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16867 DF PROTO=TCP SPT=51904 DPT=9102 SEQ=718674353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEC8200000000001030307) 
Nov 23 09:28:48 np0005532585.localdomain sudo[224407]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdoyoedggxgnrrdrvrwuhqgjbgcryjsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890128.1769927-2668-127343439093599/AnsiballZ_command.py
Nov 23 09:28:48 np0005532585.localdomain sudo[224407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:48 np0005532585.localdomain python3.9[224409]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:28:48 np0005532585.localdomain sudo[224407]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40351 DF PROTO=TCP SPT=40044 DPT=9102 SEQ=4073162484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEDD200000000001030307) 
Nov 23 09:28:53 np0005532585.localdomain sudo[224518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdgqggsvcsqzeklfmxukocqxfcwtfbbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890132.5218115-2875-228329191301927/AnsiballZ_file.py
Nov 23 09:28:53 np0005532585.localdomain sudo[224518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:53 np0005532585.localdomain python3.9[224520]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:53 np0005532585.localdomain sudo[224518]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:54 np0005532585.localdomain sudo[224628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orqzvobabwthpirjlfwotflziykvhiwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890133.9477623-2875-216333330561940/AnsiballZ_file.py
Nov 23 09:28:54 np0005532585.localdomain sudo[224628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:54 np0005532585.localdomain python3.9[224630]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:54 np0005532585.localdomain sudo[224628]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:54 np0005532585.localdomain sudo[224738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrwrryhzlgehbayixzldxtznkohiqbyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890134.5308416-2875-227250069894798/AnsiballZ_file.py
Nov 23 09:28:54 np0005532585.localdomain sudo[224738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:54 np0005532585.localdomain python3.9[224740]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:54 np0005532585.localdomain sudo[224738]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:55 np0005532585.localdomain sudo[224848]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoescwypibwovfzzkdzzdrgubgkjqrkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890135.2128685-2941-188689128198869/AnsiballZ_file.py
Nov 23 09:28:55 np0005532585.localdomain sudo[224848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:55 np0005532585.localdomain python3.9[224850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:55 np0005532585.localdomain sudo[224848]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:56 np0005532585.localdomain sudo[224958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grdcquuqbftzavkwyfiizbwnznjsyjce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890135.8266451-2941-144093630767658/AnsiballZ_file.py
Nov 23 09:28:56 np0005532585.localdomain sudo[224958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:28:56 np0005532585.localdomain systemd[1]: tmp-crun.RYZ1rj.mount: Deactivated successfully.
Nov 23 09:28:56 np0005532585.localdomain podman[224961]: 2025-11-23 09:28:56.222210299 +0000 UTC m=+0.099078972 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:28:56 np0005532585.localdomain podman[224961]: 2025-11-23 09:28:56.233390459 +0000 UTC m=+0.110259112 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 23 09:28:56 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:28:56 np0005532585.localdomain python3.9[224960]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:56 np0005532585.localdomain sudo[224958]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:56 np0005532585.localdomain sudo[225085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pewzrlrsafazvolbmutjgjvxgyssupnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890136.4535027-2941-112348904306024/AnsiballZ_file.py
Nov 23 09:28:56 np0005532585.localdomain sudo[225085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:56 np0005532585.localdomain python3.9[225087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:56 np0005532585.localdomain sudo[225085]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19236 DF PROTO=TCP SPT=43124 DPT=9882 SEQ=1417323512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEEC200000000001030307) 
Nov 23 09:28:57 np0005532585.localdomain sudo[225195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqbzystfzujurmpulqdovnrpkseqmrdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890137.0677993-2941-109117298883822/AnsiballZ_file.py
Nov 23 09:28:57 np0005532585.localdomain sudo[225195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:57 np0005532585.localdomain python3.9[225197]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:57 np0005532585.localdomain sudo[225195]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:57 np0005532585.localdomain sudo[225305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zubcfyludstmgttizibfprqbqmnmlfmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890137.6788683-2941-3317016384113/AnsiballZ_file.py
Nov 23 09:28:57 np0005532585.localdomain sudo[225305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:58 np0005532585.localdomain python3.9[225307]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:58 np0005532585.localdomain sudo[225305]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:58 np0005532585.localdomain sudo[225415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzfksqdrxehtrmuzunmtjdooevcmpetq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890138.2724733-2941-101495308668829/AnsiballZ_file.py
Nov 23 09:28:58 np0005532585.localdomain sudo[225415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:58 np0005532585.localdomain python3.9[225417]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:58 np0005532585.localdomain sudo[225415]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:59 np0005532585.localdomain sudo[225525]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qagyligedfvshozfjdqfgbesbfrsolux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890139.1224248-2941-32486697412543/AnsiballZ_file.py
Nov 23 09:28:59 np0005532585.localdomain sudo[225525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:28:59 np0005532585.localdomain python3.9[225527]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:28:59 np0005532585.localdomain sudo[225525]: pam_unix(sudo:session): session closed for user root
Nov 23 09:28:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38985 DF PROTO=TCP SPT=50238 DPT=9101 SEQ=1719451716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEF60A0000000001030307) 
Nov 23 09:28:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49821 DF PROTO=TCP SPT=34514 DPT=9105 SEQ=2692827760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEF6BB0000000001030307) 
Nov 23 09:29:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38987 DF PROTO=TCP SPT=50238 DPT=9101 SEQ=1719451716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF02200000000001030307) 
Nov 23 09:29:05 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43976 DF PROTO=TCP SPT=50680 DPT=9100 SEQ=2399587717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF0E200000000001030307) 
Nov 23 09:29:07 np0005532585.localdomain sudo[225635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obqmqilgzfxwvumyuqqmwsqpyptbjgkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890146.5199358-3266-263789911456466/AnsiballZ_getent.py
Nov 23 09:29:07 np0005532585.localdomain sudo[225635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:07 np0005532585.localdomain python3.9[225637]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 23 09:29:07 np0005532585.localdomain sudo[225635]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:07 np0005532585.localdomain sudo[225746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgbbhvolmeoounaydrxsktecyvqiorhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890147.4348385-3290-182070202873743/AnsiballZ_group.py
Nov 23 09:29:07 np0005532585.localdomain sudo[225746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:08 np0005532585.localdomain python3.9[225748]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 09:29:08 np0005532585.localdomain groupadd[225749]: group added to /etc/group: name=nova, GID=42436
Nov 23 09:29:08 np0005532585.localdomain groupadd[225749]: group added to /etc/gshadow: name=nova
Nov 23 09:29:08 np0005532585.localdomain groupadd[225749]: new group: name=nova, GID=42436
Nov 23 09:29:08 np0005532585.localdomain sudo[225746]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40928 DF PROTO=TCP SPT=47280 DPT=9100 SEQ=2376654661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF1AE00000000001030307) 
Nov 23 09:29:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:29:09.239 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:29:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:29:09.239 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:29:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:29:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:29:09 np0005532585.localdomain sudo[225862]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrmtjbknhzhwsmqkxyqlsdxvtjexizno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890149.3977077-3314-243630045536676/AnsiballZ_user.py
Nov 23 09:29:09 np0005532585.localdomain sudo[225862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:10 np0005532585.localdomain python3.9[225864]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 09:29:10 np0005532585.localdomain useradd[225866]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Nov 23 09:29:10 np0005532585.localdomain useradd[225866]: add 'nova' to group 'libvirt'
Nov 23 09:29:10 np0005532585.localdomain useradd[225866]: add 'nova' to shadow group 'libvirt'
Nov 23 09:29:10 np0005532585.localdomain sudo[225862]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:11 np0005532585.localdomain sshd[225890]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:29:11 np0005532585.localdomain sshd[225890]: Accepted publickey for zuul from 192.168.122.30 port 37962 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:29:11 np0005532585.localdomain systemd-logind[761]: New session 54 of user zuul.
Nov 23 09:29:11 np0005532585.localdomain systemd[1]: Started Session 54 of User zuul.
Nov 23 09:29:11 np0005532585.localdomain sshd[225890]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:29:11 np0005532585.localdomain sshd[225893]: Received disconnect from 192.168.122.30 port 37962:11: disconnected by user
Nov 23 09:29:11 np0005532585.localdomain sshd[225893]: Disconnected from user zuul 192.168.122.30 port 37962
Nov 23 09:29:11 np0005532585.localdomain sshd[225890]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:29:11 np0005532585.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Nov 23 09:29:11 np0005532585.localdomain systemd-logind[761]: Session 54 logged out. Waiting for processes to exit.
Nov 23 09:29:11 np0005532585.localdomain systemd-logind[761]: Removed session 54.
Nov 23 09:29:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49350 DF PROTO=TCP SPT=36458 DPT=9882 SEQ=3389845660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF25360000000001030307) 
Nov 23 09:29:12 np0005532585.localdomain python3.9[226001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:12 np0005532585.localdomain python3.9[226087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890151.75823-3389-149144960786998/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:29:13 np0005532585.localdomain podman[226124]: 2025-11-23 09:29:13.043121017 +0000 UTC m=+0.098440831 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 23 09:29:13 np0005532585.localdomain podman[226124]: 2025-11-23 09:29:13.125669444 +0000 UTC m=+0.180989308 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 23 09:29:13 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:29:13 np0005532585.localdomain python3.9[226220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:13 np0005532585.localdomain python3.9[226275]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:14 np0005532585.localdomain python3.9[226383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:14 np0005532585.localdomain python3.9[226469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890153.9351525-3389-205843061096348/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49352 DF PROTO=TCP SPT=36458 DPT=9882 SEQ=3389845660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF31210000000001030307) 
Nov 23 09:29:15 np0005532585.localdomain python3.9[226577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:15 np0005532585.localdomain python3.9[226663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890154.9653418-3389-60953597794694/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=c48862f04c3bb6bb101bc9efe68e434d3f83ed7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:29:17 np0005532585.localdomain python3.9[226771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:17 np0005532585.localdomain systemd[1]: tmp-crun.doBBTs.mount: Deactivated successfully.
Nov 23 09:29:17 np0005532585.localdomain podman[226772]: 2025-11-23 09:29:17.028359478 +0000 UTC m=+0.083400114 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:29:17 np0005532585.localdomain podman[226772]: 2025-11-23 09:29:17.063246668 +0000 UTC m=+0.118287304 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:29:17 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:29:17 np0005532585.localdomain python3.9[226874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890156.596046-3389-236023126748895/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:18 np0005532585.localdomain python3.9[226982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40353 DF PROTO=TCP SPT=40044 DPT=9102 SEQ=4073162484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF3E200000000001030307) 
Nov 23 09:29:18 np0005532585.localdomain python3.9[227068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890157.6420708-3389-241080197897321/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:20 np0005532585.localdomain sudo[227176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvnmrqslijlriqizoogwzjfqcfwflozq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890160.1247013-3638-266409752350614/AnsiballZ_file.py
Nov 23 09:29:20 np0005532585.localdomain sudo[227176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:20 np0005532585.localdomain python3.9[227178]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:29:20 np0005532585.localdomain sudo[227176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:21 np0005532585.localdomain sudo[227286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klrjcknytciqoissdfndiiuxshrwqlzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890160.8716428-3663-65666422241532/AnsiballZ_copy.py
Nov 23 09:29:21 np0005532585.localdomain sudo[227286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:21 np0005532585.localdomain python3.9[227288]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:29:21 np0005532585.localdomain sudo[227286]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:21 np0005532585.localdomain sudo[227396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymvizsvgvjjqqpurogcdwtwqlkeadfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890161.585993-3687-127360553163005/AnsiballZ_stat.py
Nov 23 09:29:21 np0005532585.localdomain sudo[227396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:22 np0005532585.localdomain python3.9[227398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:22 np0005532585.localdomain sudo[227396]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:22 np0005532585.localdomain sudo[227508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avvdjethjwxbstptjqzqzzavtvszqemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890162.4174402-3713-124881250312912/AnsiballZ_file.py
Nov 23 09:29:22 np0005532585.localdomain sudo[227508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:22 np0005532585.localdomain python3.9[227510]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:29:22 np0005532585.localdomain sudo[227508]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16594 DF PROTO=TCP SPT=60978 DPT=9102 SEQ=430266586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF52610000000001030307) 
Nov 23 09:29:23 np0005532585.localdomain python3.9[227618]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:24 np0005532585.localdomain python3.9[227728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:24 np0005532585.localdomain python3.9[227814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890163.9385962-3765-74460258997354/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:25 np0005532585.localdomain python3.9[227922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:29:26 np0005532585.localdomain python3.9[228008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890165.1541164-3810-251059714091482/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:29:26 np0005532585.localdomain sudo[228116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypjjiwseyrlmuffueghhxbyzzoxyinot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890166.5945866-3860-159401138221332/AnsiballZ_container_config_data.py
Nov 23 09:29:26 np0005532585.localdomain sudo[228116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:29:26 np0005532585.localdomain systemd[1]: tmp-crun.nNphTj.mount: Deactivated successfully.
Nov 23 09:29:26 np0005532585.localdomain podman[228119]: 2025-11-23 09:29:26.981468123 +0000 UTC m=+0.098671727 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:29:26 np0005532585.localdomain podman[228119]: 2025-11-23 09:29:26.991327453 +0000 UTC m=+0.108531097 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:29:27 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:29:27 np0005532585.localdomain python3.9[228118]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 23 09:29:27 np0005532585.localdomain sudo[228116]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49354 DF PROTO=TCP SPT=36458 DPT=9882 SEQ=3389845660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF62200000000001030307) 
Nov 23 09:29:27 np0005532585.localdomain sudo[228246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltvukjanfiicnxyhwwtfhrblrqgrfemx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890167.354749-3887-11294778229910/AnsiballZ_container_config_hash.py
Nov 23 09:29:27 np0005532585.localdomain sudo[228246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:27 np0005532585.localdomain python3.9[228248]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:29:27 np0005532585.localdomain sudo[228246]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:28 np0005532585.localdomain sudo[228356]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eijvlfhkseaayrhbkyrbchlrywgbhazn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890168.2677326-3917-167721328359084/AnsiballZ_edpm_container_manage.py
Nov 23 09:29:28 np0005532585.localdomain sudo[228356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:28 np0005532585.localdomain python3[228358]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:29:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16982 DF PROTO=TCP SPT=47048 DPT=9101 SEQ=15123556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF6B3A0000000001030307) 
Nov 23 09:29:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40984 DF PROTO=TCP SPT=38618 DPT=9105 SEQ=695840909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF6BEB0000000001030307) 
Nov 23 09:29:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16984 DF PROTO=TCP SPT=47048 DPT=9101 SEQ=15123556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF77610000000001030307) 
Nov 23 09:29:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20601 DF PROTO=TCP SPT=45392 DPT=9100 SEQ=4141493920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF84200000000001030307) 
Nov 23 09:29:39 np0005532585.localdomain podman[228372]: 2025-11-23 09:29:28.9287112 +0000 UTC m=+0.036686795 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 23 09:29:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31310 DF PROTO=TCP SPT=51714 DPT=9100 SEQ=3062298622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF90200000000001030307) 
Nov 23 09:29:39 np0005532585.localdomain podman[228435]: 
Nov 23 09:29:39 np0005532585.localdomain podman[228435]: 2025-11-23 09:29:39.322349078 +0000 UTC m=+0.073234107 container create 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.vendor=CentOS)
Nov 23 09:29:39 np0005532585.localdomain podman[228435]: 2025-11-23 09:29:39.281137865 +0000 UTC m=+0.032022944 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 23 09:29:39 np0005532585.localdomain python3[228358]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 23 09:29:39 np0005532585.localdomain sudo[228356]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:40 np0005532585.localdomain sudo[228580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wikmxdrhchxpgdpodonzjtxxcduwlusm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890180.3607817-3941-44573076392563/AnsiballZ_stat.py
Nov 23 09:29:40 np0005532585.localdomain sudo[228580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:40 np0005532585.localdomain python3.9[228582]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:40 np0005532585.localdomain sudo[228580]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:41 np0005532585.localdomain sudo[228692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlwjhnldlmtiafzdtmlabojorttysgki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890181.505685-3977-193818964765562/AnsiballZ_container_config_data.py
Nov 23 09:29:41 np0005532585.localdomain sudo[228692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9376 DF PROTO=TCP SPT=52692 DPT=9882 SEQ=3079844263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF9A7C0000000001030307) 
Nov 23 09:29:42 np0005532585.localdomain python3.9[228694]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 23 09:29:42 np0005532585.localdomain sudo[228692]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:42 np0005532585.localdomain sudo[228802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crxmpkjfttywzrwovptqanhcyhdrlstp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890182.3278975-4004-69094479170720/AnsiballZ_container_config_hash.py
Nov 23 09:29:42 np0005532585.localdomain sudo[228802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:42 np0005532585.localdomain python3.9[228804]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:29:42 np0005532585.localdomain sudo[228802]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:43 np0005532585.localdomain sudo[228912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clsfjjsdcpqjggvwdcganybxqqikfxtk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890183.2355437-4034-219008261243840/AnsiballZ_edpm_container_manage.py
Nov 23 09:29:43 np0005532585.localdomain sudo[228912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:29:43 np0005532585.localdomain podman[228915]: 2025-11-23 09:29:43.636698085 +0000 UTC m=+0.090518801 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:29:43 np0005532585.localdomain podman[228915]: 2025-11-23 09:29:43.706806795 +0000 UTC m=+0.160627461 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 23 09:29:43 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:29:43 np0005532585.localdomain python3[228914]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:29:44 np0005532585.localdomain sudo[228970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:29:44 np0005532585.localdomain sudo[228970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:29:44 np0005532585.localdomain sudo[228970]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:44 np0005532585.localdomain python3[228914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",
                                                                    "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:33:31.011385583Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211770748,
                                                                    "VirtualSize": 1211770748,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",
                                                                              "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",
                                                                              "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:19.349843192Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:59.347040136Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:02.744397841Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:22.134493628Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:23.375712978Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:39.628890315Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:54.615675342Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:59.80799783Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:31:53.511902352Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:30.270978852Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:30.703308349Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:31.008952634Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:31.009001445Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:37.508054621Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 23 09:29:44 np0005532585.localdomain sudo[229005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:29:44 np0005532585.localdomain sudo[229005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:29:44 np0005532585.localdomain podman[229021]: 2025-11-23 09:29:44.17644785 +0000 UTC m=+0.089623474 container remove e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 23 09:29:44 np0005532585.localdomain python3[228914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Nov 23 09:29:44 np0005532585.localdomain podman[229037]: 
Nov 23 09:29:44 np0005532585.localdomain podman[229037]: 2025-11-23 09:29:44.28444477 +0000 UTC m=+0.086201439 container create 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:29:44 np0005532585.localdomain podman[229037]: 2025-11-23 09:29:44.245290101 +0000 UTC m=+0.047046800 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 23 09:29:44 np0005532585.localdomain python3[228914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 23 09:29:44 np0005532585.localdomain sudo[228912]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:44 np0005532585.localdomain sudo[229005]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:44 np0005532585.localdomain sudo[229213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyadkrywddsjltbgnhiwliwbfonquglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890184.681992-4058-266103641254993/AnsiballZ_stat.py
Nov 23 09:29:44 np0005532585.localdomain sudo[229213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:45 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9378 DF PROTO=TCP SPT=52692 DPT=9882 SEQ=3079844263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFA6A00000000001030307) 
Nov 23 09:29:45 np0005532585.localdomain python3.9[229215]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:45 np0005532585.localdomain sudo[229213]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:45 np0005532585.localdomain sudo[229235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:29:45 np0005532585.localdomain sudo[229235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:29:45 np0005532585.localdomain sudo[229235]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:45 np0005532585.localdomain sudo[229343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrfjzptlsmefyalgwvrqdbyewctprzpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890185.4577746-4085-97785276089872/AnsiballZ_file.py
Nov 23 09:29:45 np0005532585.localdomain sudo[229343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:45 np0005532585.localdomain python3.9[229345]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:29:45 np0005532585.localdomain sudo[229343]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:46 np0005532585.localdomain sudo[229452]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrwgytuesxobukjfrrejonpubbtkxqiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890186.3920426-4085-269267773013087/AnsiballZ_copy.py
Nov 23 09:29:46 np0005532585.localdomain sudo[229452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:46 np0005532585.localdomain python3.9[229454]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890186.3920426-4085-269267773013087/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:29:46 np0005532585.localdomain sudo[229452]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:47 np0005532585.localdomain sudo[229507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbhfkjcjpywzqkafyugrnnajsuctwnnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890186.3920426-4085-269267773013087/AnsiballZ_systemd.py
Nov 23 09:29:47 np0005532585.localdomain sudo[229507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:29:47 np0005532585.localdomain podman[229510]: 2025-11-23 09:29:47.293794669 +0000 UTC m=+0.085881710 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 23 09:29:47 np0005532585.localdomain podman[229510]: 2025-11-23 09:29:47.327286936 +0000 UTC m=+0.119373957 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:29:47 np0005532585.localdomain python3.9[229509]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:29:47 np0005532585.localdomain systemd-sysv-generator[229551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:29:47 np0005532585.localdomain systemd-rc-local-generator[229547]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:47 np0005532585.localdomain sudo[229507]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16596 DF PROTO=TCP SPT=60978 DPT=9102 SEQ=430266586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFB2200000000001030307) 
Nov 23 09:29:48 np0005532585.localdomain sudo[229670]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aublcfascsencokrskozmksxuelcegxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890186.3920426-4085-269267773013087/AnsiballZ_systemd.py
Nov 23 09:29:48 np0005532585.localdomain sudo[229670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:49 np0005532585.localdomain python3.9[229672]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:29:49 np0005532585.localdomain systemd-rc-local-generator[229698]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:29:49 np0005532585.localdomain systemd-sysv-generator[229703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: Starting nova_compute container...
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:29:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 09:29:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 23 09:29:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 09:29:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 09:29:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 09:29:49 np0005532585.localdomain podman[229713]: 2025-11-23 09:29:49.544245946 +0000 UTC m=+0.123088010 container init 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible)
Nov 23 09:29:49 np0005532585.localdomain podman[229713]: 2025-11-23 09:29:49.555489887 +0000 UTC m=+0.134331941 container start 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:29:49 np0005532585.localdomain podman[229713]: nova_compute
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + sudo -E kolla_set_configs
Nov 23 09:29:49 np0005532585.localdomain systemd[1]: Started nova_compute container.
Nov 23 09:29:49 np0005532585.localdomain sudo[229670]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Validating config file
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying service configuration files
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Deleting /etc/ceph
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Creating directory /etc/ceph
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Writing out command to execute
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: ++ cat /run_command
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + CMD=nova-compute
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + ARGS=
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + sudo kolla_copy_cacerts
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + [[ ! -n '' ]]
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + . kolla_extend_start
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: Running command: 'nova-compute'
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + echo 'Running command: '\''nova-compute'\'''
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + umask 0022
Nov 23 09:29:49 np0005532585.localdomain nova_compute[229727]: + exec nova-compute
Nov 23 09:29:50 np0005532585.localdomain python3.9[229847]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.344 229731 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.344 229731 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.344 229731 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.345 229731 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.459 229731 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.479 229731 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.480 229731 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 23 09:29:51 np0005532585.localdomain python3.9[229957]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:51 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:51.898 229731 INFO nova.virt.driver [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.023 229731 INFO nova.compute.provider_config [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.076 229731 WARNING nova.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.076 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.077 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.077 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console_host                   = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.089 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.089 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.089 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.159 229731 WARNING oslo_config.cfg [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: and ``live_migration_inbound_addr`` respectively.
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: ).  Its value may be silently ignored in the future.
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_secret_uuid        = 46550e70-79cb-5f55-bf6d-1204b97e083b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.219 229731 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.229 229731 INFO nova.virt.node [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.229 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.230 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.230 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.230 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.240 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f46cf8f8970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.242 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f46cf8f8970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.243 229731 INFO nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Connection event '1' reason 'None'
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.253 229731 DEBUG nova.virt.libvirt.volume.mount [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.264 229731 INFO nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host capabilities <capabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <host>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <uuid>43895caf-e6c2-47af-84a5-6194e901da5c</uuid>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <arch>x86_64</arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model>EPYC-Rome-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <vendor>AMD</vendor>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <microcode version='16777317'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <signature family='23' model='49' stepping='0'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='x2apic'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='tsc-deadline'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='osxsave'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='hypervisor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='tsc_adjust'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='spec-ctrl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='stibp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='arch-capabilities'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='cmp_legacy'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='topoext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='virt-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='lbrv'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='tsc-scale'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='vmcb-clean'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='pause-filter'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='pfthreshold'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='svme-addr-chk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='rdctl-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='skip-l1dfl-vmentry'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='mds-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature name='pschange-mc-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <pages unit='KiB' size='4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <pages unit='KiB' size='2048'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <pages unit='KiB' size='1048576'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <power_management>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <suspend_mem/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <suspend_disk/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <suspend_hybrid/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </power_management>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <iommu support='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <migration_features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <live/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <uri_transports>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <uri_transport>tcp</uri_transport>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <uri_transport>rdma</uri_transport>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </uri_transports>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </migration_features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <topology>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <cells num='1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <cell id='0'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           <memory unit='KiB'>16116612</memory>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           <pages unit='KiB' size='2048'>0</pages>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           <distances>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <sibling id='0' value='10'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           </distances>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           <cpus num='8'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:           </cpus>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         </cell>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </cells>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </topology>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <cache>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </cache>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <secmodel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model>selinux</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <doi>0</doi>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </secmodel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <secmodel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model>dac</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <doi>0</doi>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </secmodel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </host>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <guest>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <os_type>hvm</os_type>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <arch name='i686'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <wordsize>32</wordsize>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <domain type='qemu'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <domain type='kvm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <pae/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <nonpae/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <acpi default='on' toggle='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <apic default='on' toggle='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <cpuselection/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <deviceboot/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <disksnapshot default='on' toggle='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <externalSnapshot/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </guest>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <guest>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <os_type>hvm</os_type>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <arch name='x86_64'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <wordsize>64</wordsize>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <domain type='qemu'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <domain type='kvm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <acpi default='on' toggle='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <apic default='on' toggle='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <cpuselection/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <deviceboot/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <disksnapshot default='on' toggle='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <externalSnapshot/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </guest>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: </capabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.270 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.291 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: <domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <domain>kvm</domain>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <arch>i686</arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <vcpu max='240'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <iothreads supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <os supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='firmware'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <loader supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>rom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pflash</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='readonly'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>yes</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='secure'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </loader>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </os>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='maximum' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='maximumMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-model' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <vendor>AMD</vendor>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='x2apic'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='stibp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='succor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lbrv'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='custom' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Dhyana-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-128'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-256'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-512'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <memoryBacking supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='sourceType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>anonymous</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>memfd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </memoryBacking>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <disk supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='diskDevice'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>disk</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cdrom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>floppy</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>lun</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ide</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>fdc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>sata</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </disk>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <graphics supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vnc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egl-headless</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </graphics>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <video supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='modelType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vga</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cirrus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>none</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>bochs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ramfb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </video>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hostdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='mode'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>subsystem</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='startupPolicy'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>mandatory</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>requisite</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>optional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='subsysType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pci</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='capsType'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='pciBackend'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hostdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <rng supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>random</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </rng>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <filesystem supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='driverType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>path</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>handle</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtiofs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </filesystem>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <tpm supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-tis</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-crb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emulator</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>external</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendVersion'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>2.0</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </tpm>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <redirdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </redirdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <channel supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </channel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <crypto supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </crypto>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <interface supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>passt</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </interface>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <panic supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>isa</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>hyperv</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </panic>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <console supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>null</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dev</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pipe</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stdio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>udp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tcp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu-vdagent</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </console>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <gic supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <vmcoreinfo supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <genid supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backingStoreInput supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backup supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <async-teardown supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <ps2 supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sev supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sgx supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hyperv supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='features'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>relaxed</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vapic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>spinlocks</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vpindex</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>runtime</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>synic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stimer</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reset</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vendor_id</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>frequencies</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reenlightenment</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tlbflush</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ipi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>avic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emsr_bitmap</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>xmm_input</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <spinlocks>4095</spinlocks>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <stimer_direct>on</stimer_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hyperv>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <launchSecurity supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='sectype'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tdx</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </launchSecurity>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: </domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.295 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: <domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <domain>kvm</domain>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <arch>i686</arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <vcpu max='1024'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <iothreads supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <os supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='firmware'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <loader supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>rom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pflash</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='readonly'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>yes</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='secure'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </loader>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </os>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='maximum' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='maximumMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-model' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <vendor>AMD</vendor>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='x2apic'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='stibp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='succor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lbrv'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='custom' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain python3.9[230067]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Dhyana-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-128'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-256'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-512'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <memoryBacking supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='sourceType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>anonymous</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>memfd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </memoryBacking>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <disk supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='diskDevice'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>disk</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cdrom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>floppy</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>lun</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>fdc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>sata</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </disk>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <graphics supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vnc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egl-headless</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </graphics>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <video supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='modelType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vga</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cirrus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>none</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>bochs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ramfb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </video>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hostdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='mode'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>subsystem</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='startupPolicy'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>mandatory</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>requisite</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>optional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='subsysType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pci</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='capsType'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='pciBackend'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hostdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <rng supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>random</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </rng>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <filesystem supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='driverType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>path</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>handle</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtiofs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </filesystem>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <tpm supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-tis</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-crb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emulator</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>external</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendVersion'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>2.0</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </tpm>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <redirdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </redirdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <channel supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </channel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <crypto supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </crypto>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <interface supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>passt</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </interface>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <panic supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>isa</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>hyperv</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </panic>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <console supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>null</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dev</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pipe</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stdio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>udp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tcp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu-vdagent</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </console>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <gic supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <vmcoreinfo supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <genid supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backingStoreInput supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backup supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <async-teardown supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <ps2 supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sev supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sgx supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hyperv supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='features'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>relaxed</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vapic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>spinlocks</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vpindex</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>runtime</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>synic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stimer</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reset</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vendor_id</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>frequencies</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reenlightenment</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tlbflush</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ipi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>avic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emsr_bitmap</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>xmm_input</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <spinlocks>4095</spinlocks>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <stimer_direct>on</stimer_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hyperv>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <launchSecurity supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='sectype'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tdx</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </launchSecurity>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: </domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.327 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.332 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: <domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <domain>kvm</domain>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <arch>x86_64</arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <vcpu max='240'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <iothreads supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <os supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='firmware'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <loader supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>rom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pflash</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='readonly'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>yes</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='secure'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </loader>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </os>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='maximum' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='maximumMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-model' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <vendor>AMD</vendor>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='x2apic'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='stibp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='succor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lbrv'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='custom' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Dhyana-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-128'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-256'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-512'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <memoryBacking supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='sourceType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>anonymous</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>memfd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </memoryBacking>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <disk supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='diskDevice'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>disk</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cdrom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>floppy</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>lun</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ide</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>fdc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>sata</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </disk>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <graphics supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vnc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egl-headless</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </graphics>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <video supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='modelType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vga</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cirrus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>none</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>bochs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ramfb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </video>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hostdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='mode'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>subsystem</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='startupPolicy'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>mandatory</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>requisite</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>optional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='subsysType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pci</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='capsType'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='pciBackend'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hostdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <rng supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>random</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </rng>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <filesystem supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='driverType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>path</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>handle</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtiofs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </filesystem>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <tpm supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-tis</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-crb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emulator</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>external</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendVersion'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>2.0</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </tpm>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <redirdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </redirdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <channel supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </channel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <crypto supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </crypto>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <interface supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>passt</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </interface>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <panic supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>isa</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>hyperv</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </panic>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <console supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>null</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dev</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pipe</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stdio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>udp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tcp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu-vdagent</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </console>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <gic supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <vmcoreinfo supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <genid supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backingStoreInput supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backup supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <async-teardown supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <ps2 supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sev supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sgx supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hyperv supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='features'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>relaxed</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vapic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>spinlocks</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vpindex</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>runtime</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>synic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stimer</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reset</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vendor_id</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>frequencies</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reenlightenment</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tlbflush</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ipi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>avic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emsr_bitmap</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>xmm_input</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <spinlocks>4095</spinlocks>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <stimer_direct>on</stimer_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hyperv>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <launchSecurity supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='sectype'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tdx</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </launchSecurity>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: </domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.392 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: <domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <domain>kvm</domain>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <arch>x86_64</arch>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <vcpu max='1024'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <iothreads supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <os supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='firmware'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>efi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <loader supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>rom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pflash</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='readonly'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>yes</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='secure'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>yes</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>no</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </loader>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </os>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='maximum' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='maximumMigratable'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>on</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>off</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='host-model' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <vendor>AMD</vendor>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='x2apic'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='stibp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='succor'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lbrv'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <mode name='custom' supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Broadwell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Cooperlake-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Denverton-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Dhyana-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='auto-ibrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amd-psfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='no-nested-data-bp'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='null-sel-clr-base'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='stibp-always-on'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='EPYC-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-128'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-256'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx10-512'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='prefetchiti'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Haswell-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='IvyBridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='KnightsMill-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4fmaps'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-4vnniw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512er'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512pf'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fma4'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tbm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xop'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='amx-tile'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-bf16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-fp16'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bitalg'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vbmi2'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrc'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fzrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='la57'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='taa-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='tsx-ldtrk'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xfd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='SierraForest-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ifma'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-ne-convert'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx-vnni-int8'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='bus-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cmpccxadd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fbsdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='fsrs'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ibrs-all'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mcdt-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pbrsb-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='psdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='serialize'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vaes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='vpclmulqdq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='hle'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='rtm'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512bw'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512cd'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512dq'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512f'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='avx512vl'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='invpcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pcid'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='pku'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='mpx'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v2'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v3'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='core-capability'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='split-lock-detect'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='Snowridge-v4'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='cldemote'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='erms'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='gfni'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdir64b'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='movdiri'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='xsaves'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='athlon-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='core2duo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='coreduo-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='n270-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='ss'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <blockers model='phenom-v1'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnow'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <feature name='3dnowext'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </blockers>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </mode>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </cpu>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <memoryBacking supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <enum name='sourceType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>anonymous</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <value>memfd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </memoryBacking>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <disk supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='diskDevice'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>disk</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cdrom</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>floppy</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>lun</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>fdc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>sata</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </disk>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <graphics supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vnc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egl-headless</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </graphics>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <video supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='modelType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vga</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>cirrus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>none</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>bochs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ramfb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </video>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hostdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='mode'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>subsystem</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='startupPolicy'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>mandatory</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>requisite</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>optional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='subsysType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pci</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>scsi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='capsType'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='pciBackend'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hostdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <rng supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtio-non-transitional</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>random</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>egd</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </rng>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <filesystem supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='driverType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>path</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>handle</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>virtiofs</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </filesystem>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <tpm supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-tis</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tpm-crb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emulator</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>external</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendVersion'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>2.0</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </tpm>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <redirdev supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='bus'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>usb</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </redirdev>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <channel supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </channel>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <crypto supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendModel'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>builtin</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </crypto>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <interface supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='backendType'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>default</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>passt</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </interface>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <panic supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='model'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>isa</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>hyperv</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </panic>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <console supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='type'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>null</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vc</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pty</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dev</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>file</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>pipe</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stdio</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>udp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tcp</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>unix</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>qemu-vdagent</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>dbus</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </console>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </devices>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   <features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <gic supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <vmcoreinfo supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <genid supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backingStoreInput supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <backup supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <async-teardown supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <ps2 supported='yes'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sev supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <sgx supported='no'/>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <hyperv supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='features'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>relaxed</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vapic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>spinlocks</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vpindex</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>runtime</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>synic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>stimer</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reset</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>vendor_id</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>frequencies</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>reenlightenment</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tlbflush</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>ipi</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>avic</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>emsr_bitmap</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>xmm_input</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <spinlocks>4095</spinlocks>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <stimer_direct>on</stimer_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </defaults>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </hyperv>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     <launchSecurity supported='yes'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       <enum name='sectype'>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:         <value>tdx</value>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:       </enum>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:     </launchSecurity>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:   </features>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: </domainCapabilities>
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.461 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.462 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.462 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.462 229731 INFO nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Secure Boot support detected
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.465 229731 INFO nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.465 229731 INFO nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.479 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.522 229731 INFO nova.virt.node [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.549 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Verified node dae70d62-10f4-474c-9782-8c926a3641d5 matches my host np0005532585.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.596 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.601 229731 DEBUG nova.virt.libvirt.vif [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005532585.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T08:25:43Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.602 229731 DEBUG nova.network.os_vif_util [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.603 229731 DEBUG nova.network.os_vif_util [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.604 229731 DEBUG os_vif [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.649 229731 DEBUG ovsdbapp.backend.ovs_idl [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.649 229731 DEBUG ovsdbapp.backend.ovs_idl [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.649 229731 DEBUG ovsdbapp.backend.ovs_idl [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.650 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.650 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.651 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.651 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.652 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.656 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.670 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.671 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.671 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:29:52 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:52.672 229731 INFO oslo.privsep.daemon [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpwwhf7w6z/privsep.sock']
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.285 229731 INFO oslo.privsep.daemon [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.178 230114 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.184 230114 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.188 230114 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.188 230114 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230114
Nov 23 09:29:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26889 DF PROTO=TCP SPT=60580 DPT=9102 SEQ=561049474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFC7600000000001030307) 
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.557 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.557 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.558 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.559 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.559 229731 INFO os_vif [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.560 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.564 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.565 229731 INFO nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 23 09:29:53 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:53.818 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.175 229731 INFO nova.service [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating service version for nova-compute on np0005532585.localdomain from 57 to 66
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.206 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.207 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.207 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.208 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.209 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.655 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.725 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:29:54 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:54.725 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:29:54 np0005532585.localdomain systemd[1]: Started libvirt nodedev daemon.
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.043 229731 WARNING nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.044 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12918MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.044 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.044 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.200 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.201 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.201 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.214 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.227 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.227 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.241 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.275 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: HW_CPU_X86_CLMUL,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX2,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.354 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:29:55 np0005532585.localdomain sudo[230382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xadzqzbpgelyayixznibdpnemwyyfecl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890195.187866-4265-142902926062009/AnsiballZ_podman_container.py
Nov 23 09:29:55 np0005532585.localdomain sudo[230382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.817 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.823 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.823 229731 INFO nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] kernel doesn't support AMD SEV
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.825 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.826 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.881 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updated inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.882 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating resource provider dae70d62-10f4-474c-9782-8c926a3641d5 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.882 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.936 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating resource provider dae70d62-10f4-474c-9782-8c926a3641d5 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.966 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.966 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.967 229731 DEBUG nova.service [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.993 229731 DEBUG nova.service [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 23 09:29:55 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:55.994 229731 DEBUG nova.servicegroup.drivers.db [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] DB_Driver: join new ServiceGroup member np0005532585.localdomain to the compute group, service = <Service: host=np0005532585.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 23 09:29:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9380 DF PROTO=TCP SPT=52692 DPT=9882 SEQ=3079844263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFD6210000000001030307) 
Nov 23 09:29:57 np0005532585.localdomain python3.9[230384]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 23 09:29:57 np0005532585.localdomain sudo[230382]: pam_unix(sudo:session): session closed for user root
Nov 23 09:29:57 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 121.3 (404 of 333 items), suggesting rotation.
Nov 23 09:29:57 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 09:29:57 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:29:57 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:29:57 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:57.700 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:29:58 np0005532585.localdomain systemd[1]: tmp-crun.xqvLMH.mount: Deactivated successfully.
Nov 23 09:29:58 np0005532585.localdomain podman[230428]: 2025-11-23 09:29:58.013805409 +0000 UTC m=+0.069678428 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 09:29:58 np0005532585.localdomain podman[230428]: 2025-11-23 09:29:58.021546304 +0000 UTC m=+0.077419293 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 09:29:58 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:29:58 np0005532585.localdomain sudo[230537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epbtmofboohldkyujuydpwnuecxndtek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890198.0604281-4289-124154048997235/AnsiballZ_systemd.py
Nov 23 09:29:58 np0005532585.localdomain sudo[230537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:29:58 np0005532585.localdomain python3.9[230539]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:29:58 np0005532585.localdomain systemd[1]: Stopping nova_compute container...
Nov 23 09:29:58 np0005532585.localdomain systemd[1]: tmp-crun.ki7ajJ.mount: Deactivated successfully.
Nov 23 09:29:58 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:58.811 229731 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Nov 23 09:29:58 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:29:58.862 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:29:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39229 DF PROTO=TCP SPT=41864 DPT=9101 SEQ=3850135123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFE06A0000000001030307) 
Nov 23 09:29:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3606 DF PROTO=TCP SPT=41876 DPT=9105 SEQ=1463888140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFE11B0000000001030307) 
Nov 23 09:30:00 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:30:00.108 229731 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 23 09:30:00 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:30:00.111 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:30:00 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:30:00.112 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:30:00 np0005532585.localdomain nova_compute[229727]: 2025-11-23 09:30:00.112 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:30:00 np0005532585.localdomain virtqemud[203731]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 23 09:30:00 np0005532585.localdomain virtqemud[203731]: hostname: np0005532585.localdomain
Nov 23 09:30:00 np0005532585.localdomain virtqemud[203731]: End of file while reading data: Input/output error
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Deactivated successfully.
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Consumed 4.843s CPU time.
Nov 23 09:30:00 np0005532585.localdomain podman[230543]: 2025-11-23 09:30:00.510094373 +0000 UTC m=+1.773216252 container died 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: tmp-crun.HmzNB2.mount: Deactivated successfully.
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295-userdata-shm.mount: Deactivated successfully.
Nov 23 09:30:00 np0005532585.localdomain podman[230543]: 2025-11-23 09:30:00.564371001 +0000 UTC m=+1.827492850 container cleanup 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 23 09:30:00 np0005532585.localdomain podman[230543]: nova_compute
Nov 23 09:30:00 np0005532585.localdomain podman[230581]: error opening file `/run/crun/2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295/status`: No such file or directory
Nov 23 09:30:00 np0005532585.localdomain podman[230570]: 2025-11-23 09:30:00.659056848 +0000 UTC m=+0.066521332 container cleanup 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 23 09:30:00 np0005532585.localdomain podman[230570]: nova_compute
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: Stopped nova_compute container.
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: Starting nova_compute container...
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:30:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:00 np0005532585.localdomain podman[230585]: 2025-11-23 09:30:00.806742314 +0000 UTC m=+0.118150630 container init 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=edpm)
Nov 23 09:30:00 np0005532585.localdomain podman[230585]: 2025-11-23 09:30:00.817076077 +0000 UTC m=+0.128484393 container start 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 23 09:30:00 np0005532585.localdomain podman[230585]: nova_compute
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + sudo -E kolla_set_configs
Nov 23 09:30:00 np0005532585.localdomain systemd[1]: Started nova_compute container.
Nov 23 09:30:00 np0005532585.localdomain sudo[230537]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Validating config file
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying service configuration files
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /etc/ceph
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Creating directory /etc/ceph
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Writing out command to execute
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: ++ cat /run_command
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + CMD=nova-compute
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + ARGS=
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + sudo kolla_copy_cacerts
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + [[ ! -n '' ]]
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + . kolla_extend_start
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + echo 'Running command: '\''nova-compute'\'''
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: Running command: 'nova-compute'
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + umask 0022
Nov 23 09:30:00 np0005532585.localdomain nova_compute[230600]: + exec nova-compute
Nov 23 09:30:01 np0005532585.localdomain sudo[230719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhiukuphdpkgzayfvlafenlubpmmgpoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890201.21182-4316-268542107583710/AnsiballZ_podman_container.py
Nov 23 09:30:01 np0005532585.localdomain sudo[230719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:01 np0005532585.localdomain python3.9[230721]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 23 09:30:01 np0005532585.localdomain systemd[1]: Started libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope.
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:30:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 09:30:02 np0005532585.localdomain podman[230746]: 2025-11-23 09:30:02.026920436 +0000 UTC m=+0.129822704 container init 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:30:02 np0005532585.localdomain podman[230746]: 2025-11-23 09:30:02.037693263 +0000 UTC m=+0.140595501 container start 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:30:02 np0005532585.localdomain python3.9[230721]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Applying nova statedir ownership
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d already 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/console.log
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/f8def1b80727f8e5cc38a877010a5f81bbb3086d
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-f8def1b80727f8e5cc38a877010a5f81bbb3086d
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/4143dbbec5b08621aa3c8eb364f8a7d3e97604e18b7ed41c4bab0da11ed561fd
Nov 23 09:30:02 np0005532585.localdomain nova_compute_init[230766]: INFO:nova_statedir:Nova statedir ownership complete
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: libpod-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully.
Nov 23 09:30:02 np0005532585.localdomain podman[230767]: 2025-11-23 09:30:02.105356469 +0000 UTC m=+0.050662420 container died 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Nov 23 09:30:02 np0005532585.localdomain podman[230778]: 2025-11-23 09:30:02.179322166 +0000 UTC m=+0.073820124 container cleanup 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully.
Nov 23 09:30:02 np0005532585.localdomain sudo[230719]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully.
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428-userdata-shm.mount: Deactivated successfully.
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.571 230604 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.572 230604 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.572 230604 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.572 230604 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.688 230604 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.708 230604 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:30:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:02.708 230604 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 23 09:30:02 np0005532585.localdomain sshd[208831]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Nov 23 09:30:02 np0005532585.localdomain systemd[1]: session-53.scope: Consumed 2min 12.019s CPU time.
Nov 23 09:30:02 np0005532585.localdomain systemd-logind[761]: Session 53 logged out. Waiting for processes to exit.
Nov 23 09:30:02 np0005532585.localdomain systemd-logind[761]: Removed session 53.
Nov 23 09:30:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3608 DF PROTO=TCP SPT=41876 DPT=9105 SEQ=1463888140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFED200000000001030307) 
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.081 230604 INFO nova.virt.driver [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.197 230604 INFO nova.compute.provider_config [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.203 230604 WARNING nova.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console_host                   = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.263 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.263 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.264 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.264 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.264 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.265 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.265 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.265 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.266 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.266 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.266 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.279 230604 WARNING oslo_config.cfg [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: and ``live_migration_inbound_addr`` respectively.
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: ).  Its value may be silently ignored in the future.
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_secret_uuid        = 46550e70-79cb-5f55-bf6d-1204b97e083b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.327 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.327 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.328 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.328 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.328 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.330 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.330 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.330 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.343 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.343 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.344 230604 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.357 230604 INFO nova.virt.node [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.357 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.358 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.358 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.358 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.368 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f332a298910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.370 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f332a298910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.371 230604 INFO nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Connection event '1' reason 'None'
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.376 230604 INFO nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host capabilities <capabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <host>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <uuid>43895caf-e6c2-47af-84a5-6194e901da5c</uuid>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <arch>x86_64</arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model>EPYC-Rome-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <vendor>AMD</vendor>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <microcode version='16777317'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <signature family='23' model='49' stepping='0'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='x2apic'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='tsc-deadline'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='osxsave'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='hypervisor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='tsc_adjust'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='spec-ctrl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='stibp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='arch-capabilities'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='cmp_legacy'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='topoext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='virt-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='lbrv'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='tsc-scale'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='vmcb-clean'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='pause-filter'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='pfthreshold'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='svme-addr-chk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='rdctl-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='skip-l1dfl-vmentry'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='mds-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature name='pschange-mc-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <pages unit='KiB' size='4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <pages unit='KiB' size='2048'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <pages unit='KiB' size='1048576'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <power_management>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <suspend_mem/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <suspend_disk/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <suspend_hybrid/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </power_management>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <iommu support='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <migration_features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <live/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <uri_transports>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <uri_transport>tcp</uri_transport>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <uri_transport>rdma</uri_transport>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </uri_transports>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </migration_features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <topology>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <cells num='1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <cell id='0'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           <memory unit='KiB'>16116612</memory>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           <pages unit='KiB' size='2048'>0</pages>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           <distances>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <sibling id='0' value='10'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           </distances>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           <cpus num='8'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:           </cpus>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         </cell>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </cells>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </topology>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <cache>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </cache>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <secmodel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model>selinux</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <doi>0</doi>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </secmodel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <secmodel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model>dac</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <doi>0</doi>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </secmodel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </host>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <guest>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <os_type>hvm</os_type>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <arch name='i686'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <wordsize>32</wordsize>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <domain type='qemu'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <domain type='kvm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <pae/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <nonpae/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <acpi default='on' toggle='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <apic default='on' toggle='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <cpuselection/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <deviceboot/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <disksnapshot default='on' toggle='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <externalSnapshot/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </guest>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <guest>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <os_type>hvm</os_type>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <arch name='x86_64'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <wordsize>64</wordsize>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <domain type='qemu'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <domain type='kvm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <acpi default='on' toggle='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <apic default='on' toggle='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <cpuselection/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <deviceboot/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <disksnapshot default='on' toggle='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <externalSnapshot/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </guest>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: </capabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.384 230604 DEBUG nova.virt.libvirt.volume.mount [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.385 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.390 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: <domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <domain>kvm</domain>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <arch>i686</arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <vcpu max='1024'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <iothreads supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <os supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='firmware'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <loader supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>rom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pflash</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='readonly'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>yes</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='secure'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </loader>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </os>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='maximum' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='maximumMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-model' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <vendor>AMD</vendor>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='x2apic'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='stibp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='succor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lbrv'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='custom' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Dhyana-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-128'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-256'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-512'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <memoryBacking supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='sourceType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>anonymous</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>memfd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </memoryBacking>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <disk supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='diskDevice'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>disk</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cdrom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>floppy</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>lun</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>fdc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>sata</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </disk>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <graphics supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vnc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egl-headless</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </graphics>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <video supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='modelType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vga</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cirrus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>none</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>bochs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ramfb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </video>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hostdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='mode'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>subsystem</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='startupPolicy'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>mandatory</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>requisite</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>optional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='subsysType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pci</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='capsType'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='pciBackend'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hostdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <rng supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>random</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </rng>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <filesystem supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='driverType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>path</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>handle</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtiofs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </filesystem>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <tpm supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-tis</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-crb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emulator</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>external</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendVersion'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>2.0</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </tpm>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <redirdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </redirdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <channel supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </channel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <crypto supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </crypto>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <interface supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>passt</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </interface>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <panic supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>isa</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>hyperv</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </panic>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <console supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>null</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dev</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pipe</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stdio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>udp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tcp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu-vdagent</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </console>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <gic supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <vmcoreinfo supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <genid supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backingStoreInput supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backup supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <async-teardown supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <ps2 supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sev supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sgx supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hyperv supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='features'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>relaxed</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vapic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>spinlocks</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vpindex</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>runtime</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>synic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stimer</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reset</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vendor_id</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>frequencies</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reenlightenment</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tlbflush</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ipi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>avic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emsr_bitmap</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>xmm_input</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <spinlocks>4095</spinlocks>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <stimer_direct>on</stimer_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hyperv>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <launchSecurity supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='sectype'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tdx</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </launchSecurity>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: </domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.400 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: <domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <domain>kvm</domain>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <arch>i686</arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <vcpu max='240'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <iothreads supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <os supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='firmware'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <loader supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>rom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pflash</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='readonly'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>yes</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='secure'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </loader>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </os>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='maximum' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='maximumMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-model' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <vendor>AMD</vendor>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='x2apic'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='stibp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='succor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lbrv'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='custom' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Dhyana-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-128'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-256'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-512'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <memoryBacking supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='sourceType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>anonymous</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>memfd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </memoryBacking>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <disk supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='diskDevice'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>disk</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cdrom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>floppy</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>lun</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ide</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>fdc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>sata</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </disk>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <graphics supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vnc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egl-headless</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </graphics>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <video supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='modelType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vga</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cirrus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>none</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>bochs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ramfb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </video>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hostdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='mode'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>subsystem</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='startupPolicy'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>mandatory</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>requisite</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>optional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='subsysType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pci</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='capsType'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='pciBackend'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hostdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <rng supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>random</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </rng>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <filesystem supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='driverType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>path</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>handle</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtiofs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </filesystem>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <tpm supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-tis</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-crb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emulator</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>external</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendVersion'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>2.0</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </tpm>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <redirdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </redirdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <channel supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </channel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <crypto supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </crypto>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <interface supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>passt</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </interface>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <panic supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>isa</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>hyperv</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </panic>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <console supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>null</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dev</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pipe</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stdio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>udp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tcp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu-vdagent</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </console>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <gic supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <vmcoreinfo supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <genid supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backingStoreInput supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backup supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <async-teardown supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <ps2 supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sev supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sgx supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hyperv supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='features'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>relaxed</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vapic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>spinlocks</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vpindex</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>runtime</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>synic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stimer</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reset</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vendor_id</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>frequencies</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reenlightenment</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tlbflush</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ipi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>avic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emsr_bitmap</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>xmm_input</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <spinlocks>4095</spinlocks>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <stimer_direct>on</stimer_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hyperv>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <launchSecurity supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='sectype'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tdx</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </launchSecurity>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: </domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.417 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.421 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: <domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <domain>kvm</domain>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <arch>x86_64</arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <vcpu max='1024'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <iothreads supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <os supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='firmware'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>efi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <loader supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>rom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pflash</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='readonly'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>yes</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='secure'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>yes</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </loader>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </os>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='maximum' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='maximumMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-model' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <vendor>AMD</vendor>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='x2apic'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='stibp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='succor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lbrv'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='custom' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Dhyana-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-128'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-256'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-512'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <memoryBacking supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='sourceType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>anonymous</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>memfd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </memoryBacking>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <disk supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='diskDevice'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>disk</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cdrom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>floppy</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>lun</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>fdc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>sata</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </disk>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <graphics supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vnc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egl-headless</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </graphics>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <video supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='modelType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vga</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cirrus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>none</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>bochs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ramfb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </video>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hostdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='mode'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>subsystem</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='startupPolicy'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>mandatory</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>requisite</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>optional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='subsysType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pci</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='capsType'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='pciBackend'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hostdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <rng supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>random</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </rng>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <filesystem supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='driverType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>path</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>handle</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtiofs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </filesystem>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <tpm supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-tis</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-crb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emulator</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>external</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendVersion'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>2.0</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </tpm>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <redirdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </redirdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <channel supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </channel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <crypto supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </crypto>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <interface supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>passt</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </interface>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <panic supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>isa</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>hyperv</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </panic>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <console supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>null</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dev</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pipe</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stdio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>udp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tcp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu-vdagent</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </console>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <gic supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <vmcoreinfo supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <genid supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backingStoreInput supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backup supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <async-teardown supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <ps2 supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sev supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sgx supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hyperv supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='features'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>relaxed</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vapic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>spinlocks</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vpindex</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>runtime</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>synic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stimer</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reset</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vendor_id</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>frequencies</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reenlightenment</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tlbflush</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ipi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>avic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emsr_bitmap</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>xmm_input</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <spinlocks>4095</spinlocks>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <stimer_direct>on</stimer_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hyperv>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <launchSecurity supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='sectype'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tdx</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </launchSecurity>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: </domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.467 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: <domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <domain>kvm</domain>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <arch>x86_64</arch>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <vcpu max='240'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <iothreads supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <os supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='firmware'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <loader supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>rom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pflash</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='readonly'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>yes</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='secure'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>no</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </loader>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </os>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='maximum' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='maximumMigratable'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>on</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>off</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='host-model' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <vendor>AMD</vendor>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='x2apic'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='stibp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='succor'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lbrv'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <mode name='custom' supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Broadwell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Cooperlake-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Denverton-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Dhyana-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='auto-ibrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amd-psfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='no-nested-data-bp'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='null-sel-clr-base'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='stibp-always-on'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='EPYC-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-128'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-256'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx10-512'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='prefetchiti'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Haswell-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='IvyBridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='KnightsMill-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4fmaps'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-4vnniw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512er'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512pf'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fma4'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tbm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xop'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='amx-tile'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-bf16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-fp16'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bitalg'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vbmi2'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrc'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fzrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='la57'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='taa-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='tsx-ldtrk'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xfd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='SierraForest-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ifma'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-ne-convert'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx-vnni-int8'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='bus-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cmpccxadd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fbsdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='fsrs'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ibrs-all'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mcdt-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pbrsb-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='psdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='serialize'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vaes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='vpclmulqdq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='hle'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='rtm'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512bw'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512cd'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512dq'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512f'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='avx512vl'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='invpcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pcid'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='pku'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='mpx'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v2'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v3'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='core-capability'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='split-lock-detect'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='Snowridge-v4'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='cldemote'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='erms'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='gfni'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdir64b'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='movdiri'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='xsaves'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='athlon-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='core2duo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='coreduo-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='n270-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='ss'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <blockers model='phenom-v1'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnow'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <feature name='3dnowext'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </blockers>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </mode>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </cpu>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <memoryBacking supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <enum name='sourceType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>anonymous</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <value>memfd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </memoryBacking>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <disk supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='diskDevice'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>disk</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cdrom</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>floppy</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>lun</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ide</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>fdc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>sata</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </disk>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <graphics supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vnc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egl-headless</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </graphics>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <video supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='modelType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vga</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>cirrus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>none</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>bochs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ramfb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </video>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hostdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='mode'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>subsystem</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='startupPolicy'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>mandatory</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>requisite</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>optional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='subsysType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pci</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>scsi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='capsType'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='pciBackend'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hostdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <rng supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtio-non-transitional</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>random</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>egd</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </rng>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <filesystem supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='driverType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>path</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>handle</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>virtiofs</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </filesystem>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <tpm supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-tis</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tpm-crb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emulator</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>external</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendVersion'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>2.0</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </tpm>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <redirdev supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='bus'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>usb</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </redirdev>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <channel supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </channel>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <crypto supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendModel'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>builtin</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </crypto>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <interface supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='backendType'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>default</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>passt</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </interface>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <panic supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='model'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>isa</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>hyperv</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </panic>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <console supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='type'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>null</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vc</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pty</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dev</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>file</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>pipe</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stdio</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>udp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tcp</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>unix</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>qemu-vdagent</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>dbus</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </console>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </devices>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   <features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <gic supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <vmcoreinfo supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <genid supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backingStoreInput supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <backup supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <async-teardown supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <ps2 supported='yes'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sev supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <sgx supported='no'/>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <hyperv supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='features'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>relaxed</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vapic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>spinlocks</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vpindex</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>runtime</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>synic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>stimer</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reset</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>vendor_id</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>frequencies</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>reenlightenment</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tlbflush</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>ipi</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>avic</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>emsr_bitmap</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>xmm_input</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <spinlocks>4095</spinlocks>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <stimer_direct>on</stimer_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </defaults>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </hyperv>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     <launchSecurity supported='yes'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       <enum name='sectype'>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:         <value>tdx</value>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:       </enum>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:     </launchSecurity>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:   </features>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: </domainCapabilities>
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.557 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.558 230604 INFO nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Secure Boot support detected
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.560 230604 INFO nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.561 230604 INFO nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.576 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.599 230604 INFO nova.virt.node [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.619 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Verified node dae70d62-10f4-474c-9782-8c926a3641d5 matches my host np0005532585.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.646 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.650 230604 DEBUG nova.virt.libvirt.vif [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005532585.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T08:25:43Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.651 230604 DEBUG nova.network.os_vif_util [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.652 230604 DEBUG nova.network.os_vif_util [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.653 230604 DEBUG os_vif [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.746 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.747 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.748 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.759 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.759 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.759 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.760 230604 INFO oslo.privsep.daemon [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp05yuyu3y/privsep.sock']
Nov 23 09:30:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:03.864 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.331 230604 INFO oslo.privsep.daemon [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Spawned new privsep daemon via rootwrap
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.230 230850 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.233 230850 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.235 230850 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.235 230850 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230850
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.615 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.615 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.616 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.617 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.617 230604 INFO os_vif [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.618 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.621 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.622 230604 INFO nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.708 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.709 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.709 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.710 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:30:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:04.711 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.138 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.210 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.211 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.420 230604 WARNING nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.422 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12952MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.422 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.423 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.584 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.585 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.585 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.701 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.723 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.724 230604 DEBUG nova.compute.provider_tree [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.737 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.756 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: HW_CPU_X86_F16C,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:30:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:05.793 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.246 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.250 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.250 230604 INFO nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] kernel doesn't support AMD SEV
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.251 230604 DEBUG nova.compute.provider_tree [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.251 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.272 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.294 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.294 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.294 230604 DEBUG nova.service [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.315 230604 DEBUG nova.service [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 23 09:30:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:06.316 230604 DEBUG nova.servicegroup.drivers.db [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] DB_Driver: join new ServiceGroup member np0005532585.localdomain to the compute group, service = <Service: host=np0005532585.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 23 09:30:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40931 DF PROTO=TCP SPT=47280 DPT=9100 SEQ=2376654661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFFA210000000001030307) 
Nov 23 09:30:08 np0005532585.localdomain sshd[230898]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:30:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:08.798 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:08 np0005532585.localdomain sshd[230898]: Accepted publickey for zuul from 192.168.122.30 port 37274 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:30:08 np0005532585.localdomain systemd-logind[761]: New session 55 of user zuul.
Nov 23 09:30:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:08.867 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:08 np0005532585.localdomain systemd[1]: Started Session 55 of User zuul.
Nov 23 09:30:08 np0005532585.localdomain sshd[230898]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:30:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:30:09.240 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:30:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:30:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:30:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:30:09.242 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:30:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5701 DF PROTO=TCP SPT=42040 DPT=9100 SEQ=722781209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D005600000000001030307) 
Nov 23 09:30:09 np0005532585.localdomain python3.9[231009]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:30:11 np0005532585.localdomain sudo[231121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frblknwtmzztqdrwqygxoymnsmgljywg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890211.0916133-69-153154868519018/AnsiballZ_systemd_service.py
Nov 23 09:30:11 np0005532585.localdomain sudo[231121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58460 DF PROTO=TCP SPT=53278 DPT=9882 SEQ=4002580767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D00F960000000001030307) 
Nov 23 09:30:11 np0005532585.localdomain python3.9[231123]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:30:12 np0005532585.localdomain systemd-sysv-generator[231146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:30:12 np0005532585.localdomain systemd-rc-local-generator[231143]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:12 np0005532585.localdomain sudo[231121]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:13 np0005532585.localdomain python3.9[231267]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:30:13 np0005532585.localdomain network[231284]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:30:13 np0005532585.localdomain network[231285]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:30:13 np0005532585.localdomain network[231286]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:30:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:30:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:13.833 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:13.869 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:13 np0005532585.localdomain podman[231292]: 2025-11-23 09:30:13.902552163 +0000 UTC m=+0.060109132 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:30:13 np0005532585.localdomain podman[231292]: 2025-11-23 09:30:13.932743378 +0000 UTC m=+0.090300337 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:30:14 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:30:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58462 DF PROTO=TCP SPT=53278 DPT=9882 SEQ=4002580767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D01BA10000000001030307) 
Nov 23 09:30:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:30:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26891 DF PROTO=TCP SPT=60580 DPT=9102 SEQ=561049474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D028200000000001030307) 
Nov 23 09:30:18 np0005532585.localdomain sudo[231546]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjpgdiubjnhazwblhawmsbvuposdpcuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890218.016909-126-50359814952765/AnsiballZ_systemd_service.py
Nov 23 09:30:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:30:18 np0005532585.localdomain sudo[231546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:18 np0005532585.localdomain podman[231548]: 2025-11-23 09:30:18.826817153 +0000 UTC m=+0.079753632 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 23 09:30:18 np0005532585.localdomain podman[231548]: 2025-11-23 09:30:18.863857004 +0000 UTC m=+0.116793443 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 09:30:18 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:18.867 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:18 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:18.870 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:18 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:30:19 np0005532585.localdomain python3.9[231549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:30:19 np0005532585.localdomain sudo[231546]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:19 np0005532585.localdomain sudo[231676]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzbbmpyazvqswhbikopcpwcywgjttvhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890219.3831449-156-231155226205436/AnsiballZ_file.py
Nov 23 09:30:19 np0005532585.localdomain sudo[231676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:19 np0005532585.localdomain python3.9[231678]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:19 np0005532585.localdomain sudo[231676]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:19 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Nov 23 09:30:19 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 09:30:19 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:30:20 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:30:20 np0005532585.localdomain sudo[231787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ublqzovrfpoftguilwfasdchwfvtezfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890220.183711-180-155537285043504/AnsiballZ_file.py
Nov 23 09:30:20 np0005532585.localdomain sudo[231787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:20 np0005532585.localdomain python3.9[231789]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:20 np0005532585.localdomain sudo[231787]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:21 np0005532585.localdomain sudo[231897]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roqxaohzghedmpxrsioajdgtrrbvuwrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890220.963138-207-140879812904057/AnsiballZ_command.py
Nov 23 09:30:21 np0005532585.localdomain sudo[231897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:21 np0005532585.localdomain python3.9[231899]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:30:21 np0005532585.localdomain sudo[231897]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:22 np0005532585.localdomain python3.9[232009]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:30:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42161 DF PROTO=TCP SPT=44684 DPT=9102 SEQ=3396325974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D03CA00000000001030307) 
Nov 23 09:30:23 np0005532585.localdomain sudo[232117]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pimirxwfihbyrtkcfvbsneociwkhcvmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890223.278621-261-91579424192165/AnsiballZ_systemd_service.py
Nov 23 09:30:23 np0005532585.localdomain sudo[232117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:23 np0005532585.localdomain python3.9[232119]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:30:23 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:30:23 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:23.872 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:23 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:23.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:23 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:23.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:23 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:23.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:23 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:23.904 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:23 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:23.905 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:23 np0005532585.localdomain systemd-sysv-generator[232147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:30:23 np0005532585.localdomain systemd-rc-local-generator[232143]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:24 np0005532585.localdomain sudo[232117]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:25 np0005532585.localdomain sudo[232263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yipisxrpdehbuljrzrmunrrfwsyhsikf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890224.809614-285-278232392042255/AnsiballZ_command.py
Nov 23 09:30:25 np0005532585.localdomain sudo[232263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:25 np0005532585.localdomain python3.9[232265]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:30:25 np0005532585.localdomain sudo[232263]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:26 np0005532585.localdomain sudo[232374]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upxoiemgfrtonmmqsicsahlfemsrtmdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890226.522106-312-240156946039533/AnsiballZ_file.py
Nov 23 09:30:26 np0005532585.localdomain sudo[232374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:27 np0005532585.localdomain python3.9[232376]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:30:27 np0005532585.localdomain sudo[232374]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58464 DF PROTO=TCP SPT=53278 DPT=9882 SEQ=4002580767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D04C210000000001030307) 
Nov 23 09:30:27 np0005532585.localdomain python3.9[232484]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:30:28 np0005532585.localdomain python3.9[232594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:28 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:28.906 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:28 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:28.908 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:28 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:28.908 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:28 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:28.908 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:28 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:28.938 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:28 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:28.938 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:30:28 np0005532585.localdomain python3.9[232680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890227.9986708-360-127162955015351/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a1c197ac7c699777a1adad471c9d81e692c62960 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:30:29 np0005532585.localdomain podman[232681]: 2025-11-23 09:30:29.054827944 +0000 UTC m=+0.085109082 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 09:30:29 np0005532585.localdomain podman[232681]: 2025-11-23 09:30:29.091147892 +0000 UTC m=+0.121429050 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:30:29 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:30:29 np0005532585.localdomain sudo[232808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvfsnwkffmzdolvhlehosptgyjytfqeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890229.2042897-406-197860497354741/AnsiballZ_group.py
Nov 23 09:30:29 np0005532585.localdomain sudo[232808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65406 DF PROTO=TCP SPT=59422 DPT=9101 SEQ=981976127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0559A0000000001030307) 
Nov 23 09:30:29 np0005532585.localdomain python3.9[232810]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 23 09:30:29 np0005532585.localdomain sudo[232808]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4137 DF PROTO=TCP SPT=43594 DPT=9105 SEQ=4180956252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0564C0000000001030307) 
Nov 23 09:30:30 np0005532585.localdomain sudo[232918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bquhhtqvjlvqodunxmexmwgyriosfabk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890230.2615535-438-209731628523257/AnsiballZ_getent.py
Nov 23 09:30:30 np0005532585.localdomain sudo[232918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:30 np0005532585.localdomain python3.9[232920]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 23 09:30:30 np0005532585.localdomain sudo[232918]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:31 np0005532585.localdomain sudo[233029]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vucvjdxonrlirxibsklpjwmksltfwrbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890231.0522761-462-235841314742775/AnsiballZ_group.py
Nov 23 09:30:31 np0005532585.localdomain sudo[233029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:31 np0005532585.localdomain python3.9[233031]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 09:30:31 np0005532585.localdomain groupadd[233032]: group added to /etc/group: name=ceilometer, GID=42405
Nov 23 09:30:31 np0005532585.localdomain groupadd[233032]: group added to /etc/gshadow: name=ceilometer
Nov 23 09:30:31 np0005532585.localdomain groupadd[233032]: new group: name=ceilometer, GID=42405
Nov 23 09:30:31 np0005532585.localdomain sudo[233029]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:32 np0005532585.localdomain sudo[233145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvowxtchzybnorcxlmxgkdivrdwijqln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890231.8303897-486-179941664226189/AnsiballZ_user.py
Nov 23 09:30:32 np0005532585.localdomain sudo[233145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:32 np0005532585.localdomain python3.9[233147]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 09:30:32 np0005532585.localdomain useradd[233149]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Nov 23 09:30:32 np0005532585.localdomain useradd[233149]: add 'ceilometer' to group 'libvirt'
Nov 23 09:30:32 np0005532585.localdomain useradd[233149]: add 'ceilometer' to shadow group 'libvirt'
Nov 23 09:30:32 np0005532585.localdomain sudo[233145]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65408 DF PROTO=TCP SPT=59422 DPT=9101 SEQ=981976127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D061A10000000001030307) 
Nov 23 09:30:33 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:33.940 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:33 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:33.942 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:33 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:33.942 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:33 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:33.942 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:33 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:33.977 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:33 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:33.978 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:34 np0005532585.localdomain python3.9[233263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:34 np0005532585.localdomain python3.9[233349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763890233.6412175-564-264951459368103/.source.conf _original_basename=ceilometer.conf follow=False checksum=950edd520595720a58ffe786d84e54d033109e91 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:35 np0005532585.localdomain python3.9[233457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:35 np0005532585.localdomain python3.9[233543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763890234.9572551-564-163618447786307/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31313 DF PROTO=TCP SPT=51714 DPT=9100 SEQ=3062298622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D06E200000000001030307) 
Nov 23 09:30:36 np0005532585.localdomain python3.9[233651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:36 np0005532585.localdomain python3.9[233737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763890236.0000565-564-114090329836794/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:38 np0005532585.localdomain python3.9[233845]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:30:38 np0005532585.localdomain python3.9[233953]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:30:38 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:38.979 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:38 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:38.982 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:38 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:38.983 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:38 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:38.983 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:39.037 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:39.038 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30841 DF PROTO=TCP SPT=52784 DPT=9100 SEQ=1354231002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D07AA10000000001030307) 
Nov 23 09:30:40 np0005532585.localdomain python3.9[234061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:41 np0005532585.localdomain python3.9[234147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890240.1224487-741-223280229895433/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:41 np0005532585.localdomain python3.9[234255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7668 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D084C70000000001030307) 
Nov 23 09:30:42 np0005532585.localdomain python3.9[234310]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:42 np0005532585.localdomain python3.9[234418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:43 np0005532585.localdomain python3.9[234504]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890242.2063427-741-17228611610544/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:43 np0005532585.localdomain python3.9[234612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:44.038 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:44.040 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:44.040 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:44.041 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:44.058 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:44.058 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:44 np0005532585.localdomain python3.9[234698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890243.2827628-741-199279989865214/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:44 np0005532585.localdomain python3.9[234806]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:30:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7670 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D090E00000000001030307) 
Nov 23 09:30:45 np0005532585.localdomain podman[234823]: 2025-11-23 09:30:45.01550288 +0000 UTC m=+0.070616555 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:30:45 np0005532585.localdomain podman[234823]: 2025-11-23 09:30:45.079183393 +0000 UTC m=+0.134297048 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:30:45 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:30:45 np0005532585.localdomain python3.9[234917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890244.4157846-741-226513149588775/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:45 np0005532585.localdomain sudo[234951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:30:45 np0005532585.localdomain sudo[234951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:30:45 np0005532585.localdomain sudo[234951]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:45 np0005532585.localdomain sudo[234995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:30:45 np0005532585.localdomain sudo[234995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:30:45 np0005532585.localdomain python3.9[235061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:46 np0005532585.localdomain sudo[234995]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:46 np0005532585.localdomain python3.9[235170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890245.5103056-741-104431102843691/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:46 np0005532585.localdomain sudo[235288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:30:46 np0005532585.localdomain sudo[235288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:30:46 np0005532585.localdomain sudo[235288]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:46 np0005532585.localdomain python3.9[235287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:47 np0005532585.localdomain python3.9[235391]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890246.5791655-741-163880738108669/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:48 np0005532585.localdomain python3.9[235499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:30:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7671 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0A0A00000000001030307) 
Nov 23 09:30:49 np0005532585.localdomain podman[235586]: 2025-11-23 09:30:49.017158344 +0000 UTC m=+0.075152048 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:30:49 np0005532585.localdomain podman[235586]: 2025-11-23 09:30:49.04801641 +0000 UTC m=+0.106010094 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 09:30:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:49.059 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:49.061 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:49.062 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:49.062 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:49 np0005532585.localdomain python3.9[235585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890247.550197-741-98228040881482/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:49 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:30:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:49.088 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:49.089 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:49 np0005532585.localdomain python3.9[235711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:50.319 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:30:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:50.344 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 23 09:30:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:50.345 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:30:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:50.345 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:30:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:50.346 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:30:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:50.432 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:30:50 np0005532585.localdomain python3.9[235797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890249.1735647-741-121942688886372/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:51 np0005532585.localdomain python3.9[235905]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:51 np0005532585.localdomain python3.9[235991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890250.7956047-741-49126990267938/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:52 np0005532585.localdomain python3.9[236099]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:52 np0005532585.localdomain python3.9[236185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890251.8765295-741-134149799912637/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:30:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59342 DF PROTO=TCP SPT=41054 DPT=9102 SEQ=3168736183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0B1E10000000001030307) 
Nov 23 09:30:53 np0005532585.localdomain sudo[236293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onosfivosejjonuaiptjcpkadekztsvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890253.3046417-1206-35209868381035/AnsiballZ_file.py
Nov 23 09:30:53 np0005532585.localdomain sudo[236293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:53 np0005532585.localdomain python3.9[236295]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:30:53 np0005532585.localdomain sudo[236293]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:54.090 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:54.092 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:54.093 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:54.093 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:54.122 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:54.123 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:54 np0005532585.localdomain sudo[236403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwjdsphvnkjqirjckzyozeresubtymmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890253.9496763-1230-28546056367333/AnsiballZ_systemd_service.py
Nov 23 09:30:54 np0005532585.localdomain sudo[236403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:54 np0005532585.localdomain python3.9[236405]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:30:54 np0005532585.localdomain systemd-sysv-generator[236437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:30:54 np0005532585.localdomain systemd-rc-local-generator[236431]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:30:54 np0005532585.localdomain systemd[1]: Listening on Podman API Socket.
Nov 23 09:30:54 np0005532585.localdomain sudo[236403]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:55 np0005532585.localdomain sudo[236553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zocgghcliozuqwzczzmzbwajdoqimpmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/AnsiballZ_stat.py
Nov 23 09:30:55 np0005532585.localdomain sudo[236553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:55 np0005532585.localdomain python3.9[236555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:55 np0005532585.localdomain sudo[236553]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:56 np0005532585.localdomain sudo[236641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utxgkqijlovlddbhwqofadhvoqnrjrxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/AnsiballZ_copy.py
Nov 23 09:30:56 np0005532585.localdomain sudo[236641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:56 np0005532585.localdomain python3.9[236643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:30:56 np0005532585.localdomain sudo[236641]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:56 np0005532585.localdomain sudo[236696]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeuvzhttdtnwkgxkrzsxkubnwjjrkokn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/AnsiballZ_stat.py
Nov 23 09:30:56 np0005532585.localdomain sudo[236696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:56 np0005532585.localdomain python3.9[236698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:30:56 np0005532585.localdomain sudo[236696]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:56 np0005532585.localdomain sudo[236784]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jesaajyscacitxsyheruguwnsdtnpnny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/AnsiballZ_copy.py
Nov 23 09:30:56 np0005532585.localdomain sudo[236784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7672 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0C0200000000001030307) 
Nov 23 09:30:57 np0005532585.localdomain python3.9[236786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:30:57 np0005532585.localdomain sudo[236784]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:57 np0005532585.localdomain sudo[236894]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnzwznaddaqausughxjgbmoqbwycswey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890257.5543487-1341-108768723095439/AnsiballZ_container_config_data.py
Nov 23 09:30:57 np0005532585.localdomain sudo[236894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:58 np0005532585.localdomain python3.9[236896]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 23 09:30:58 np0005532585.localdomain sudo[236894]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:58 np0005532585.localdomain sudo[237004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymhyplilqbjneepbhajwkzctaibzbiyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890258.428233-1368-102453300387238/AnsiballZ_container_config_hash.py
Nov 23 09:30:58 np0005532585.localdomain sudo[237004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:30:59 np0005532585.localdomain python3.9[237006]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:30:59 np0005532585.localdomain sudo[237004]: pam_unix(sudo:session): session closed for user root
Nov 23 09:30:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:59.123 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:59.125 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:30:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:59.125 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:30:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:59.125 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:59.164 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:30:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:30:59.164 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:30:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34932 DF PROTO=TCP SPT=53614 DPT=9101 SEQ=2178214657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0CACA0000000001030307) 
Nov 23 09:30:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:30:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37116 DF PROTO=TCP SPT=51694 DPT=9105 SEQ=2741287519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0CB7B0000000001030307) 
Nov 23 09:31:00 np0005532585.localdomain systemd[1]: tmp-crun.Q8NkQD.mount: Deactivated successfully.
Nov 23 09:31:00 np0005532585.localdomain podman[237095]: 2025-11-23 09:31:00.035425374 +0000 UTC m=+0.085091281 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:31:00 np0005532585.localdomain podman[237095]: 2025-11-23 09:31:00.043634424 +0000 UTC m=+0.093300321 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:31:00 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:31:00 np0005532585.localdomain sudo[237132]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtneeraewtgansdsvlbktiizwbvfolmd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890259.4466164-1398-37862516941580/AnsiballZ_edpm_container_manage.py
Nov 23 09:31:00 np0005532585.localdomain sudo[237132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:01 np0005532585.localdomain python3[237134]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:31:01 np0005532585.localdomain python3[237134]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5",
                                                                    "Digest": "sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:23:50.144134741Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 505196287,
                                                                    "VirtualSize": 505196287,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",
                                                                              "sha256:4ff7b15b3989ce3486d1ee120e82ba5b4acb5e4ad1d931e92c8d8e0851a32a6a",
                                                                              "sha256:847ae301d478780c04ade872e138a0bd4b67a423f03bd51e3a177105d1684cb3"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:19.349843192Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:59.347040136Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:02.744397841Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:15:50.605214148Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:15:52.846340556Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:16:39.654532036Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:16:42.640001814Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:23:07.624548534Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:23:50.142272318Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:23:52.827921842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 23 09:31:01 np0005532585.localdomain podman[237185]: 2025-11-23 09:31:01.377054291 +0000 UTC m=+0.081485808 container remove 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 09:31:01 np0005532585.localdomain python3[237134]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Nov 23 09:31:01 np0005532585.localdomain podman[237200]: 
Nov 23 09:31:01 np0005532585.localdomain podman[237200]: 2025-11-23 09:31:01.476334281 +0000 UTC m=+0.080980742 container create db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 23 09:31:01 np0005532585.localdomain podman[237200]: 2025-11-23 09:31:01.436407778 +0000 UTC m=+0.041054289 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 23 09:31:01 np0005532585.localdomain python3[237134]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 23 09:31:01 np0005532585.localdomain sudo[237132]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:02 np0005532585.localdomain sudo[237345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tazwbjrhomtojlfzcihpalqaorrueofm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890261.9168901-1422-163835293812894/AnsiballZ_stat.py
Nov 23 09:31:02 np0005532585.localdomain sudo[237345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:02 np0005532585.localdomain python3.9[237347]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:31:02 np0005532585.localdomain sudo[237345]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:02.803 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:02.804 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:02.804 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:31:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:02.805 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:31:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34934 DF PROTO=TCP SPT=53614 DPT=9101 SEQ=2178214657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0D6E00000000001030307) 
Nov 23 09:31:03 np0005532585.localdomain sudo[237457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvcvxchcymmqxsgbnoikborfixqpabkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890263.1493146-1449-21310736924219/AnsiballZ_file.py
Nov 23 09:31:03 np0005532585.localdomain sudo[237457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:03 np0005532585.localdomain python3.9[237459]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:31:03 np0005532585.localdomain sudo[237457]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:04 np0005532585.localdomain sudo[237566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzlsxklynpoiddjgiefkuesjbgutkbct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890263.6718848-1449-268053809274375/AnsiballZ_copy.py
Nov 23 09:31:04 np0005532585.localdomain sudo[237566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.165 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.167 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.167 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.168 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.205 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.206 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:04 np0005532585.localdomain python3.9[237568]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890263.6718848-1449-268053809274375/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:31:04 np0005532585.localdomain sudo[237566]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.583 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.583 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.584 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:31:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:04.584 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:31:04 np0005532585.localdomain sudo[237621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chwgxpgzrmxerxtmkzkmdjrylwnbxvsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890263.6718848-1449-268053809274375/AnsiballZ_systemd.py
Nov 23 09:31:04 np0005532585.localdomain sudo[237621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:05 np0005532585.localdomain python3.9[237623]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:31:05 np0005532585.localdomain systemd-sysv-generator[237647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:31:05 np0005532585.localdomain systemd-rc-local-generator[237644]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:05 np0005532585.localdomain sudo[237621]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.414 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.436 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.436 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.437 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.437 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.437 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.438 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.438 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.438 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.439 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.439 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.455 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.455 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.456 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.456 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.457 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:31:05 np0005532585.localdomain sudo[237732]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czucqxivkxzcdutsjvyoltkwjypwnhxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890263.6718848-1449-268053809274375/AnsiballZ_systemd.py
Nov 23 09:31:05 np0005532585.localdomain sudo[237732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.908 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:31:05 np0005532585.localdomain python3.9[237734]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.967 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:31:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:05.967 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.188 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.190 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12918MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.191 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.191 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.256 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.257 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.257 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:31:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5704 DF PROTO=TCP SPT=42040 DPT=9100 SEQ=722781209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0E4200000000001030307) 
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.300 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.757 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.765 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.779 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.783 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:31:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:06.783 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:31:07 np0005532585.localdomain systemd-rc-local-generator[237784]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:31:07 np0005532585.localdomain systemd-sysv-generator[237792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:31:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 23 09:31:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:31:07 np0005532585.localdomain podman[237799]: 2025-11-23 09:31:07.515093676 +0000 UTC m=+0.134545216 container init db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2)
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + sudo -E kolla_set_configs
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: sudo: unable to send audit message: Operation not permitted
Nov 23 09:31:07 np0005532585.localdomain sudo[237821]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 09:31:07 np0005532585.localdomain sudo[237821]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:31:07 np0005532585.localdomain sudo[237821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:31:07 np0005532585.localdomain podman[237799]: 2025-11-23 09:31:07.560832403 +0000 UTC m=+0.180283943 container start db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:31:07 np0005532585.localdomain podman[237799]: ceilometer_agent_compute
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Validating config file
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Copying service configuration files
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: INFO:__main__:Writing out command to execute
Nov 23 09:31:07 np0005532585.localdomain sudo[237821]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: ++ cat /run_command
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + ARGS=
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + sudo kolla_copy_cacerts
Nov 23 09:31:07 np0005532585.localdomain sudo[237732]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: sudo: unable to send audit message: Operation not permitted
Nov 23 09:31:07 np0005532585.localdomain sudo[237835]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 23 09:31:07 np0005532585.localdomain sudo[237835]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:31:07 np0005532585.localdomain sudo[237835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 23 09:31:07 np0005532585.localdomain sudo[237835]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + [[ ! -n '' ]]
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + . kolla_extend_start
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + umask 0022
Nov 23 09:31:07 np0005532585.localdomain ceilometer_agent_compute[237815]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 23 09:31:07 np0005532585.localdomain podman[237824]: 2025-11-23 09:31:07.646165631 +0000 UTC m=+0.081002702 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm)
Nov 23 09:31:07 np0005532585.localdomain podman[237824]: 2025-11-23 09:31:07.678280557 +0000 UTC m=+0.113117648 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:31:07 np0005532585.localdomain podman[237824]: unhealthy
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:31:07 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:31:08 np0005532585.localdomain sudo[237952]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vniuhczizwqyafjfbsrobndhhpinphim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890267.8233185-1521-160655086214836/AnsiballZ_systemd.py
Nov 23 09:31:08 np0005532585.localdomain sudo[237952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.340 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.340 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.340 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.353 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.353 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.369 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.370 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.371 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 23 09:31:08 np0005532585.localdomain python3.9[237954]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:31:08 np0005532585.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.476 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 23 09:31:08 np0005532585.localdomain systemd[1]: tmp-crun.jsnnYj.mount: Deactivated successfully.
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.527 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.544 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.574 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.582 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.628 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.628 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.629 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.908 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ac70b06cd618b02a69e86ac9618a72b930eb6965a99ca4a3b2aa53408954f371" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.985 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:08 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-cb876c5f-4670-4d11-adf5-6d047f5427a1 x-openstack-request-id: req-cb876c5f-4670-4d11-adf5-6d047f5427a1 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.985 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.986 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-cb876c5f-4670-4d11-adf5-6d047f5427a1 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 23 09:31:08 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.988 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ac70b06cd618b02a69e86ac9618a72b930eb6965a99ca4a3b2aa53408954f371" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.017 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-cdd5758b-0163-4591-9b16-c98144903c4f x-openstack-request-id: req-cdd5758b-0163-4591-9b16-c98144903c4f _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.018 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.018 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 used request id req-cdd5758b-0163-4591-9b16-c98144903c4f request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.019 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.020 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.045 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 50900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4e9a21-ef42-4d47-b853-1cd87ec3777f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50900000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:09.020415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2498e990-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.222336239, 'message_signature': 'b9cdb5db665f49bbbf44ac8c3837df11419845f5f901373de6beca57b483a6d1'}]}, 'timestamp': '2025-11-23 09:31:09.046249', '_unique_id': '578470e99f1f4f34a29ddd6e0e3300ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.060 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 355032bc-9946-4f6d-817c-2bfc8694d41d / tapd3912d14-a3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.061 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
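[Annotation, not part of the log] The "No delta meter predecessor" DEBUG line followed by "volume: 0" reflects how a delta-type meter behaves on its first observation: with no stored previous reading for the (instance, vNIC) key, the difference cannot be computed and 0 is reported. The sketch below illustrates that bookkeeping generically; it is a hypothetical simplification, not ceilometer's actual inspector code, and `delta_sample` and `_prev` are invented names.

```python
# Hypothetical sketch of delta-meter bookkeeping: remember the last
# cumulative counter per key and emit the difference on each poll.
_prev = {}

def delta_sample(key, cumulative):
    prev = _prev.get(key)
    _prev[key] = cumulative
    if prev is None:
        # First poll: no predecessor, so the delta is reported as 0,
        # matching the "No delta meter predecessor ... volume: 0" lines.
        return 0
    # Clamp negative deltas (e.g. counter reset after a reboot) to 0.
    return max(cumulative - prev, 0)

key = ("355032bc-9946-4f6d-817c-2bfc8694d41d", "tapd3912d14-a3")
first = delta_sample(key, 1000)   # 0: no predecessor yet
second = delta_sample(key, 1500)  # 500: difference from previous reading
```

On subsequent polling intervals the stored predecessor makes the delta meaningful, which is why only the first sample for a fresh interface is zero.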
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '218535e6-5649-4cb2-99fd-6a30185a1edc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.057324', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '249b51bc-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'c1b1d27da66b313221b48ad408261b6d8802d0331edd22b8f299c1b9233fd649'}]}, 'timestamp': '2025-11-23 09:31:09.061857', '_unique_id': '756471c9fa964d2bb5e73717f4132aac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging 
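[Annotation, not part of the log] Every traceback in this burst bottoms out in `ConnectionRefusedError: [Errno 111]` from `self.sock.connect(sa)`: the TCP connection to the RabbitMQ endpoint is being actively refused, i.e. nothing is listening on that host/port (broker down, or wrong `transport_url`). The snippet below reproduces the same errno by connecting to a local port with no listener; the port-probing approach is an illustrative assumption, not anything from the log.

```python
import errno
import socket

# Obtain a TCP port with no listener: bind an ephemeral port to learn
# its number, then close the socket so nothing is listening there.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
free_port = probe.getsockname()[1]
probe.close()

# Connecting to the closed port fails the same way the AMQP transport
# does in the traceback above: ECONNREFUSED ([Errno 111] on Linux).
try:
    socket.create_connection(("127.0.0.1", free_port), timeout=1)
    result = None
except ConnectionRefusedError as exc:
    result = exc.errno

refused = (result == errno.ECONNREFUSED)
```

A refused connection is distinct from a timeout: refusal means the host answered with a TCP RST, so the network path is fine and the broker process (or its listener) is what is missing.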
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.078 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.078 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7682290-17ad-4c80-a6b4-4c2d85e8d209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.064495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '249de6a2-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': 'd2810731127dae29db35cb6cd5268b14bf0e1a57e91049328073afb8296da247'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.064495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '249dfbd8-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': '169a5f6cc346ce09b3e3684e85b760cc6c52d84641c7d2f35059e0136a08aafa'}]}, 'timestamp': '2025-11-23 09:31:09.079269', '_unique_id': '823944c1465e42e48d36f337b4fe754c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.081 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.082 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96b60b8b-c2a2-47b4-9a35-79e94512b088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.081816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '249e7374-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': 'af11f5d88a60c144b4b3e126b0617dcd940010d924676453e1d48e45661624ba'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.081816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '249e8670-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': 'fd5204d1da74ebef498d0bd3ddf5c28895eb4c06b49c49e69a8735715690e020'}]}, 'timestamp': '2025-11-23 09:31:09.082804', '_unique_id': 'ea688908b5764be495b6746b97f76d00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.085 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.085 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.086 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7b7ff94-075e-44b7-81f8-9c7ee099289c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.086242', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '249f1ff4-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'e978ec3fde2c7473ce6afc3bf44b82c9380b7b53b70bdced8bd4023980afdd58'}]}, 'timestamp': '2025-11-23 09:31:09.086770', '_unique_id': '4a5b544a72ae4a70a634c59a5a8d20db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.089 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f099576-e657-4a7f-bc6f-fef248c18d2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.089210', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '249f9376-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'cccce40001e07a8b87b4851f247f32461a000ce126b6e7002dd0db4c80e4422b'}]}, 'timestamp': '2025-11-23 09:31:09.089730', '_unique_id': 'ebd891b88f5a44ed883a051ea7824aca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.092 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5cd1ec1-127a-48cc-995a-628769d1348e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.092113', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a0040a-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '4aafd6b3db2673a6885b1264d171064f2aa404c495d56ffa72e6f1476a3d7ccf'}]}, 'timestamp': '2025-11-23 09:31:09.092634', '_unique_id': 'ee013968de36464783f05f72c6756ce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.133 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.134 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fc483f3-4fc9-4396-9daf-c415176d404e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.095301', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a65b8e-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '44243623ff2b7fb6295f2bbf5c0387e7f27118dc8e511b1500593088e88320d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.095301', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a67128-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'f15878028debb9cd33d1822ba21d79ac11bc1c384dd20e438f01f2e1270317f4'}]}, 'timestamp': '2025-11-23 09:31:09.134715', '_unique_id': 'b3a858d364ce489491d7e5a16d9bd06f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.138 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9c8f44e-41a2-45f0-a4d1-c9787b71e9fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.138049', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a70840-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'e9223232e5e615820ee279bcd95f0c6c4652f39cfe17c6e90148e19888df66e7'}]}, 'timestamp': '2025-11-23 09:31:09.138626', '_unique_id': '5de347a5e6fc412780f2654ceaa95ec7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.141 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.141 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9f6080d-3a0d-4e4a-9144-d40b32422e51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.141061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a77e10-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'e5e595e3b7d158103008740632aa91d35e8d589aba66fd892de3d8da650edc18'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.141061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a7927e-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'ea98e7057c89352e3164089fa15030cb319218c8035e576bf7bb7f24705a5584'}]}, 'timestamp': '2025-11-23 09:31:09.142121', '_unique_id': 'b51153059efb44bc828ddd425552f819'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.144 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.144 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7128677a-fd39-4e49-a988-2263210eb894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:09.144516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '24a80376-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.222336239, 'message_signature': '64b494babb94227c551e9d341841b3e6a7b158b6fc225082b93c3cf07bf5ccea'}]}, 'timestamp': '2025-11-23 09:31:09.145031', '_unique_id': '43631d88dd6746e1ab13c6f0b3ea589e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.147 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.147 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88ec5748-917d-4240-a69f-aa9362ccf5e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.147295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a8720c-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '729efd81428729aa8b80f8f5a8cf56b21d9e2fa532b714493778a7eae2fbbf0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.147295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a88526-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '78ff174cd10d1db703c64fe8ac1373e63be2c1ff63a898b7dd10a7f40b258e49'}]}, 'timestamp': '2025-11-23 09:31:09.148319', '_unique_id': '2eb1e3b71942448b971a934cefb12068'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.150 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f627099-2422-4a4f-b893-5c787bfd9098', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.150638', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a8f236-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '6fc93ac0abf2bc804ea5689ddde04fc73b67610013f7b7d3039215c95e6dfae6'}]}, 'timestamp': '2025-11-23 09:31:09.151150', '_unique_id': 'bf951a26d5094dc99af65441dad65b0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.153 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.153 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2441eeff-50c1-460b-ba50-e1cc25088034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.153449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a95fb4-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '825a821d8a907ae9f21acce873de2d69527e9fdb1c513808f621c05f3a04cca5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.153449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a973e6-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '8f3afac8594924b3687b23d5b5b2a76ba2bf8fa0869741dc23b0063614b65f2c'}]}, 'timestamp': '2025-11-23 09:31:09.154471', '_unique_id': 'e983098db8f44b459d2dcd66edad8eb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.157 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e64fbdd7-200f-4df4-8164-d4c1e83b7f84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.157048', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a9edbc-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '7e00dacb34b2158af772e47a59011159c7374c6b162ec35491d69f37c94929c1'}]}, 'timestamp': '2025-11-23 09:31:09.157594', '_unique_id': '8ad31f193bb841f0a30b0a890caa7829'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.159 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd22bef6d-c85f-48b1-a91d-41c7af80a2c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.159934', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24aa5db0-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'a98524b19f26d413b691c287d47958bca7349d673b27552d9aedbe85de31e55c'}]}, 'timestamp': '2025-11-23 09:31:09.160427', '_unique_id': '281edaf4c0d841e8b5f02266b96a084f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.162 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.163 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8525dc5a-575c-4f54-b413-bd9d336a0c81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.162739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24aacc1e-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '730be7c8da68a290580906bbb9e398e66a4948acf251f589cd20368385f493c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.162739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24aadd80-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'c8b3750a5bebf6c1b00b3ff7542763844f0012871f41b88d2c38efdd571eea75'}]}, 'timestamp': '2025-11-23 09:31:09.163693', '_unique_id': '2fc222ad4ccc42e686bca401bb79f883'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.167 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9115f1a2-1bcf-49f9-9444-38a2886c8e0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.166989', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24ab7196-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '4f660c1915cd3ca021983258fc0fef6cc7875fbfd9228384b15652898a132d53'}]}, 'timestamp': '2025-11-23 09:31:09.167492', '_unique_id': 'f017e5c44f6647e2ac0bea0d31c593a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.170 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eaf0d7a-0ca0-4a72-848f-28a6d66bade8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.170039', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24abe838-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '8cd7b07b24d579f6ff5075f4b4aa661833e9ab0b81b2e1cc22af1cd0a28632be'}]}, 'timestamp': '2025-11-23 09:31:09.170525', '_unique_id': '2cebef03a1e6489ca64b1c586197157e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.172 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.172 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bef5078-486a-400d-aca7-3917f1f7b2b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.172582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24ac47ce-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '78eb6a15a7fe52aea356ceeff0c612fa8a82881d104a91eb4892ccb03c879f3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.172582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24ac52fa-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '44bfcc62cabef0026b8c59c4e48574554a0267e8f7a989529886e893283557f5'}]}, 'timestamp': '2025-11-23 09:31:09.173150', '_unique_id': '6fe1a339b59343c8bc639af994fa0b2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3523519-57f5-46d5-ad83-89dc9e42a83f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.175416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24acb754-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': '5be757c87304cac83bc448ce3db90441f0a4632f651b3e36cab6eebcb493064a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.175416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24acc208-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': '3b6afa3ff0c4e6367fe54d0e2879ea2b58118aa5255374144807c436b51846ef'}]}, 'timestamp': '2025-11-23 09:31:09.176011', '_unique_id': '6cb7a21b1884453bafed89f2120574b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:09 np0005532585.localdomain virtqemud[203731]: End of file while reading data: Input/output error
Nov 23 09:31:09 np0005532585.localdomain virtqemud[203731]: End of file while reading data: Input/output error
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.190 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 23 09:31:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:09.207 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:09.209 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:09.210 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:09.210 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63975 DF PROTO=TCP SPT=51130 DPT=9100 SEQ=477593843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0EFA00000000001030307) 
Nov 23 09:31:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:31:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:31:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:31:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:31:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:31:09.243 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:31:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:09.248 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:09.249 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: libpod-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope: Deactivated successfully.
Nov 23 09:31:09 np0005532585.localdomain podman[237961]: 2025-11-23 09:31:09.334793721 +0000 UTC m=+0.897766511 container died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: libpod-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope: Consumed 1.303s CPU time.
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.timer: Deactivated successfully.
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743-merged.mount: Deactivated successfully.
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44-userdata-shm.mount: Deactivated successfully.
Nov 23 09:31:09 np0005532585.localdomain podman[237961]: 2025-11-23 09:31:09.452420411 +0000 UTC m=+1.015393211 container cleanup db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 23 09:31:09 np0005532585.localdomain podman[237961]: ceilometer_agent_compute
Nov 23 09:31:09 np0005532585.localdomain podman[237992]: 2025-11-23 09:31:09.597849229 +0000 UTC m=+0.108684188 container cleanup db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 23 09:31:09 np0005532585.localdomain podman[237992]: ceilometer_agent_compute
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:31:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 23 09:31:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:31:09 np0005532585.localdomain podman[238004]: 2025-11-23 09:31:09.758374235 +0000 UTC m=+0.134158563 container init db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + sudo -E kolla_set_configs
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: sudo: unable to send audit message: Operation not permitted
Nov 23 09:31:09 np0005532585.localdomain sudo[238024]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 23 09:31:09 np0005532585.localdomain sudo[238024]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:31:09 np0005532585.localdomain sudo[238024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 23 09:31:09 np0005532585.localdomain podman[238004]: 2025-11-23 09:31:09.799521147 +0000 UTC m=+0.175305485 container start db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 09:31:09 np0005532585.localdomain podman[238004]: ceilometer_agent_compute
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 23 09:31:09 np0005532585.localdomain sudo[237952]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Validating config file
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Copying service configuration files
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: INFO:__main__:Writing out command to execute
Nov 23 09:31:09 np0005532585.localdomain sudo[238024]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: ++ cat /run_command
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + ARGS=
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + sudo kolla_copy_cacerts
Nov 23 09:31:09 np0005532585.localdomain sudo[238050]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 23 09:31:09 np0005532585.localdomain sudo[238050]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 23 09:31:09 np0005532585.localdomain sudo[238050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: sudo: unable to send audit message: Operation not permitted
Nov 23 09:31:09 np0005532585.localdomain sudo[238050]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:09 np0005532585.localdomain podman[238026]: 2025-11-23 09:31:09.885590718 +0000 UTC m=+0.082184610 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + [[ ! -n '' ]]
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + . kolla_extend_start
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + umask 0022
Nov 23 09:31:09 np0005532585.localdomain ceilometer_agent_compute[238018]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 23 09:31:09 np0005532585.localdomain podman[238026]: 2025-11-23 09:31:09.915115092 +0000 UTC m=+0.111709014 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:31:09 np0005532585.localdomain podman[238026]: unhealthy
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:31:09 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.624 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.625 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.626 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.643 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.790 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 23 09:31:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.798 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.139 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}4943fe3d2d0f6bc86f045500bb55c0a9758cf8f280687dae1712f3a359331ec1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.190 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:11 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f6956fff-fab4-465a-abff-146d12a09e04 x-openstack-request-id: req-f6956fff-fab4-465a-abff-146d12a09e04 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.190 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.190 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-f6956fff-fab4-465a-abff-146d12a09e04 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.192 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}4943fe3d2d0f6bc86f045500bb55c0a9758cf8f280687dae1712f3a359331ec1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.205 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:11 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-278583a1-3626-4d3a-aac2-3600da13fd29 x-openstack-request-id: req-278583a1-3626-4d3a-aac2-3600da13fd29 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.205 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.205 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 used request id req-278583a1-3626-4d3a-aac2-3600da13fd29 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.207 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.241 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.242 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d752762-388a-4476-9fe6-6a513a7d6d7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.207942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25e81bfe-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '792906acc776b15dea53ed642914e5c0b50eb02c4fe38c66bfb2e45884c30fff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.207942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25e836e8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'ee1902992f5adc0102d1f06a03db6048390b1a05a5ba9fe2ced02116ebba5990'}]}, 'timestamp': '2025-11-23 09:31:11.243486', '_unique_id': '868f428b2a8b4b8a82eb4f06a639b112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.269 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.269 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd42c8a70-a277-46f6-8199-ab7df061edf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.256174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25ec364e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '19ef297c3c5fae48f6031d62226604b290e60d01193dbacd1453c67d0c5e6d97'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.256174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25ec49e0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': 'a6b05d07da2de1ad04ffacc00239b25d2f5b7198ea83cae09ef205abedddd8d2'}]}, 'timestamp': '2025-11-23 09:31:11.270157', '_unique_id': '4cd8393cee904d09b410cd68725160c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.272 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.273 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f57af3f8-1f00-484c-839e-ef4f5b3169fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.272841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25ecc5d2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'cb0433bb4b666e924724de00b4508eb294a34f28cebdd9edbd65aee0c77ce451'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.272841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25ecd5f4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '87c42c7e59b4ff8092491a035fe3e1c449015a85fe3525e5e02dde5281871d04'}]}, 'timestamp': '2025-11-23 09:31:11.273730', '_unique_id': '6f4cdd4c79964b07b6024b80698cd5f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.279 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 355032bc-9946-4f6d-817c-2bfc8694d41d / tapd3912d14-a3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.279 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b9b19d5-a20c-4805-9a73-c9928b9b39f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.276138', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25edc536-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'ec6bb8b43f732d6cb4ccade4178446970367fc55e27b28735a7d798c6929c7ce'}]}, 'timestamp': '2025-11-23 09:31:11.279926', '_unique_id': '9293d911f8794825a2a84628e196043f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.282 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.282 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96968f39-306e-41eb-a9d0-3b66b2f61027', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.282255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25ee33f4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '2284765210958643908a6580dbf48a3015e3f84258935f7828b3f08857bbc9e1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.282255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25ee45b0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '24113c3485b26f4b2fdbd94a6b5d364a94357e6612e63fd94181043721da0d26'}]}, 'timestamp': '2025-11-23 09:31:11.283183', '_unique_id': 'd703ccc50ffc48e2bb561fc135e355bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.285 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.285 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.286 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.286 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df11bfd-d474-419b-a9b1-5510637f7112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.286736', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25eee4a2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': '841f864d5672964a0da071ef3c6d00a9f3067cd16cd65d56c3e63eeb29f1a1ba'}]}, 'timestamp': '2025-11-23 09:31:11.287278', '_unique_id': '5ad1ed05ee814ef7994c14a801412048'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.289 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88e0df29-e9c3-4313-b948-3c3984fac4ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.289545', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25ef50cc-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'aabb3e636f1cec66aecf5e0d62d74e8bb77758ed027ab349b92869ec678f7014'}]}, 'timestamp': '2025-11-23 09:31:11.290051', '_unique_id': '3ed680710dae4338901e945e0f489403'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6caa6b8-a8fe-43e2-8f87-2192c62470ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.292777', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25efd038-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': '5668a565b3d5124a3b584e004b9065f2a6a2bd0ec3fdaf59995a28f63871390c'}]}, 'timestamp': '2025-11-23 09:31:11.293277', '_unique_id': 'bcd1cc1dcaf046f59f739318efb80a15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.296 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd448daf8-297a-450c-b4ad-c59165fc5d6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.295997', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f04cfc-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'bf396501856c65a621c0adffba0246641a96684dadc9d24c2e10651372157763'}]}, 'timestamp': '2025-11-23 09:31:11.296472', '_unique_id': '87aecc968c46486993381819e76a03ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.298 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.317 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 50920000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daed470a-8efc-41f2-b4f1-7f44b09ca862', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50920000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:11.298737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '25f38ce6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.494408789, 'message_signature': 'd56263cba6d3b88705bb463b6c1a6c324b6f7d14ef0914a95ff0d3d603474753'}]}, 'timestamp': '2025-11-23 09:31:11.317774', '_unique_id': '0d36c66607be43ee8c2a286f48baea17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.320 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.320 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7979a024-fb01-4deb-a40b-a19568c79610', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.320379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f4057c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '78c5a77f933751756f4eedf5fda598b95364869382a9e2064ef0c57959f0540a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.320379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f4176a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'e74580ff965d57ac3c9be3443336bcfb76cf92c596aa679787ea6e7d7daa1e8f'}]}, 'timestamp': '2025-11-23 09:31:11.321276', '_unique_id': 'ea390a3a7cfc49d4932ba0000f829b3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.323 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c49c4d1-5bad-4fe8-abe0-8148c6010cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.323535', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f480a6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'bf1fb8fab02ad2cfd213b9424f49b93b996248c29918dc9807b61dca3d78fbcf'}]}, 'timestamp': '2025-11-23 09:31:11.324034', '_unique_id': 'b04abe0dec9145159ef9d1ca756a50ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.326 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.326 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b200d2f-599f-49be-9cf8-8f49aa6f4bdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.326183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f4e7b2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '2c2b094e932177539f3278789f83d16b667fe1cda4a6e6803fbcdbe165df4627'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.326183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f4f7ca-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'd0f04e11fe1dbb73762c61e5e4391b8fdfa093f5ce2d885fe6556d7098b752f7'}]}, 'timestamp': '2025-11-23 09:31:11.327068', '_unique_id': 'b62d86c537c84c0ab9fb14ffb9aaa461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.330 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72586825-1252-425a-b056-68b381651c56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.329803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f57678-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '9d50ec48ce5ebb9620cb9dd84a7583b83bcdb2aad993c851bbcf5f86cf5f0046'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.329803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f58834-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '5e347d7a21698e2acfd5efa1b1dcfdcbe5038df26b2b4323ec9e59226e4a4a6a'}]}, 'timestamp': '2025-11-23 09:31:11.330718', '_unique_id': '26a7e8b199394657b900005e0586b0fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.333 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d18161d-6d89-41ea-9392-48e8d56eca28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.333013', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f5f2f6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'f5d53cc6d7455c0ac0c5e975470e442d0b93648bbbc20f304632d47aa1d9aa68'}]}, 'timestamp': '2025-11-23 09:31:11.333484', '_unique_id': 'e1f034a9ebd0431aaed53fee141b84a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.335 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.335 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.336 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af0cb322-a430-4fc5-a2fd-387640978536', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.335667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f65bba-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '2bd7ff7d1eb07d10a217937a53717a956a01bbe5cbcb895f72848f944013bbf1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.335667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f66c36-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': 'ec4b3b886ed880ce5b0eab1f31c6bc1673f1e856e36733abb26021c1a469daa7'}]}, 'timestamp': '2025-11-23 09:31:11.336556', '_unique_id': '574b6669eb6a424189be3d03fe557455'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.338 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '558f20e1-51fe-417d-8ed9-5dfaad043dc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.338750', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f6d61c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'a65aba3040e9165b3672b36f6ba86c871c81cb507e0641948bc70d25c984692c'}]}, 'timestamp': '2025-11-23 09:31:11.339411', '_unique_id': '7253ab9b47c0435b858666a5a9ee281a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.341 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.342 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.342 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73d6358e-456d-4e08-b2a0-51f4e6015ea0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.341992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f753b2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'dbe98f266ead22e858af9be52c6e06abfec6167e331da65eea9adfd39fbf905d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.341992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f76442-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '8ff575bc49a5bb613c762618603fb4a78976ecfc60e7a4ca20294a49afabac3a'}]}, 'timestamp': '2025-11-23 09:31:11.342931', '_unique_id': '2a6b8c74152d4e928a1da46d7cbf6ed6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.345 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.345 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3bc89d1-c744-4cc9-8ca9-b6a81caa77f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.345239', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f7d04e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'a16befeed15889d875ee29b95f498b82a34d16c82a444230427d9f3a4398f434'}]}, 'timestamp': '2025-11-23 09:31:11.345699', '_unique_id': '53f3a97f857f498592ae1801360c8427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.347 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.347 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b84d08e-40cb-4947-b668-35a3ed83a6b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:11.347641', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '25f82a9e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.494408789, 'message_signature': 'a21386dfdc3ff2a74c3c057c06f5fcc15efe4c17663a3d2229f0b9940436f637'}]}, 'timestamp': '2025-11-23 09:31:11.347932', '_unique_id': '3383384e18834abe90462bef2b74adbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.349 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1772a36d-f81e-4a40-b1ed-ff604ffb6e87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.349468', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f8721a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': '720d7a3f90fd0c296b88e3010b0ee23f07ff6b9c556f9fc1306b08fbe5f4b722'}]}, 'timestamp': '2025-11-23 09:31:11.349753', '_unique_id': 'bb0187783ce8424fa3e90eef2299e7b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:31:11 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:31:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24354 DF PROTO=TCP SPT=50984 DPT=9882 SEQ=2932324281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0F9F60000000001030307) 
Nov 23 09:31:12 np0005532585.localdomain sudo[238163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmsnpenrkedfcdvrgukvagpecindvuuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890271.752434-1545-15539360041088/AnsiballZ_stat.py
Nov 23 09:31:12 np0005532585.localdomain sudo[238163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:12 np0005532585.localdomain python3.9[238165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:31:12 np0005532585.localdomain sudo[238163]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:12 np0005532585.localdomain sudo[238251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyliimpnkekfjwqotqfcmkvxlvplyczr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890271.752434-1545-15539360041088/AnsiballZ_copy.py
Nov 23 09:31:12 np0005532585.localdomain sudo[238251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:12 np0005532585.localdomain python3.9[238253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890271.752434-1545-15539360041088/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:31:12 np0005532585.localdomain sudo[238251]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:13 np0005532585.localdomain sudo[238361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkmcjxiiqzaixgczgiaaqgpvodhpvgai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890273.7074265-1596-132368464801602/AnsiballZ_container_config_data.py
Nov 23 09:31:13 np0005532585.localdomain sudo[238361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:14 np0005532585.localdomain python3.9[238363]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 23 09:31:14 np0005532585.localdomain sudo[238361]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:14.248 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:14 np0005532585.localdomain sudo[238471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isuwvjilgiwioomhcenbmnpapoimxmuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890274.4120874-1623-143917514061553/AnsiballZ_container_config_hash.py
Nov 23 09:31:14 np0005532585.localdomain sudo[238471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:14 np0005532585.localdomain python3.9[238473]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:31:14 np0005532585.localdomain sudo[238471]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24356 DF PROTO=TCP SPT=50984 DPT=9882 SEQ=2932324281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D105E00000000001030307) 
Nov 23 09:31:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:31:16 np0005532585.localdomain podman[238491]: 2025-11-23 09:31:16.052960667 +0000 UTC m=+0.107341663 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 23 09:31:16 np0005532585.localdomain podman[238491]: 2025-11-23 09:31:16.09453645 +0000 UTC m=+0.148917396 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:31:16 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:31:16 np0005532585.localdomain sudo[238606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utlzatcuttdvioiimdezagxyhzckmogh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890276.017918-1653-223741670087934/AnsiballZ_edpm_container_manage.py
Nov 23 09:31:16 np0005532585.localdomain sudo[238606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:16 np0005532585.localdomain python3[238608]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:31:16 np0005532585.localdomain podman[238643]: 
Nov 23 09:31:16 np0005532585.localdomain podman[238643]: 2025-11-23 09:31:16.837665148 +0000 UTC m=+0.078589733 container create a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:31:16 np0005532585.localdomain podman[238643]: 2025-11-23 09:31:16.797767158 +0000 UTC m=+0.038691773 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 23 09:31:16 np0005532585.localdomain python3[238608]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 23 09:31:17 np0005532585.localdomain sudo[238606]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59344 DF PROTO=TCP SPT=41054 DPT=9102 SEQ=3168736183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D112200000000001030307) 
Nov 23 09:31:18 np0005532585.localdomain sudo[238790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvvmgqxlnwccwhayowaitesbsqiaohjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890278.2789128-1677-40996591290040/AnsiballZ_stat.py
Nov 23 09:31:18 np0005532585.localdomain sudo[238790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:18 np0005532585.localdomain python3.9[238792]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:31:18 np0005532585.localdomain sudo[238790]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:19.253 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:19.254 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:19.255 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:19.255 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:19.283 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:19.284 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:19 np0005532585.localdomain sudo[238902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnhahmgnnzezekgpnnjgegweenxtsimh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890279.1302075-1704-237981075462449/AnsiballZ_file.py
Nov 23 09:31:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:31:19 np0005532585.localdomain sudo[238902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:19 np0005532585.localdomain systemd[1]: tmp-crun.4TdPZz.mount: Deactivated successfully.
Nov 23 09:31:19 np0005532585.localdomain podman[238904]: 2025-11-23 09:31:19.524343999 +0000 UTC m=+0.097027160 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 23 09:31:19 np0005532585.localdomain podman[238904]: 2025-11-23 09:31:19.562773483 +0000 UTC m=+0.135456624 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 23 09:31:19 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:31:19 np0005532585.localdomain python3.9[238905]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:31:19 np0005532585.localdomain sudo[238902]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:20 np0005532585.localdomain sudo[239029]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gheeajgwlmqjhibwcmnoxnmdezxmphxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890279.6884835-1704-280128566221415/AnsiballZ_copy.py
Nov 23 09:31:20 np0005532585.localdomain sudo[239029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:20 np0005532585.localdomain python3.9[239031]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890279.6884835-1704-280128566221415/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:31:20 np0005532585.localdomain sudo[239029]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:20 np0005532585.localdomain sudo[239084]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqbqvjwsaxfvaddcgqlpskffyirbtckn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890279.6884835-1704-280128566221415/AnsiballZ_systemd.py
Nov 23 09:31:20 np0005532585.localdomain sudo[239084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:20 np0005532585.localdomain python3.9[239086]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:31:20 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:31:20 np0005532585.localdomain systemd-rc-local-generator[239107]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:31:20 np0005532585.localdomain systemd-sysv-generator[239113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:21 np0005532585.localdomain sudo[239084]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:21 np0005532585.localdomain sudo[239174]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-selqwzlcyakpwqenodlodhzkmyaayyln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890279.6884835-1704-280128566221415/AnsiballZ_systemd.py
Nov 23 09:31:21 np0005532585.localdomain sudo[239174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:21 np0005532585.localdomain python3.9[239176]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:31:21 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:31:21 np0005532585.localdomain systemd-sysv-generator[239205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:31:21 np0005532585.localdomain systemd-rc-local-generator[239202]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: Starting node_exporter container...
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:31:22 np0005532585.localdomain podman[239216]: 2025-11-23 09:31:22.355258309 +0000 UTC m=+0.132545594 container init a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.373Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.373Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.373Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=arp
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=bcache
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=bonding
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=cpu
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=edac
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=filefd
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=netclass
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=netdev
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=netstat
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=nfs
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=nvme
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=softnet
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=systemd
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=xfs
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=zfs
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 23 09:31:22 np0005532585.localdomain node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:31:22 np0005532585.localdomain podman[239216]: 2025-11-23 09:31:22.383475623 +0000 UTC m=+0.160762898 container start a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:31:22 np0005532585.localdomain podman[239216]: node_exporter
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: Started node_exporter container.
Nov 23 09:31:22 np0005532585.localdomain sudo[239174]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:22 np0005532585.localdomain podman[239239]: 2025-11-23 09:31:22.475459154 +0000 UTC m=+0.085391746 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:31:22 np0005532585.localdomain podman[239239]: 2025-11-23 09:31:22.489223945 +0000 UTC m=+0.099156537 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:31:22 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:31:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36411 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2246745688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D127200000000001030307) 
Nov 23 09:31:23 np0005532585.localdomain sudo[239369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckcqzruvazqlnwvrkzvcxxawbdxfgcqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890283.6300128-1776-133710959973528/AnsiballZ_systemd.py
Nov 23 09:31:23 np0005532585.localdomain sudo[239369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:24 np0005532585.localdomain python3.9[239371]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Stopping node_exporter container...
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: tmp-crun.SUbz43.mount: Deactivated successfully.
Nov 23 09:31:24 np0005532585.localdomain podman[239375]: 2025-11-23 09:31:24.283250223 +0000 UTC m=+0.058388961 container died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: libpod-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.scope: Deactivated successfully.
Nov 23 09:31:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:24.285 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:24.287 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:24.288 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:24.288 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:24.314 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:24.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.timer: Deactivated successfully.
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9-userdata-shm.mount: Deactivated successfully.
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-550e0a847b911bdcbe7aa7c90ef7335ef7fd8eecced200d056453fa3475177c6-merged.mount: Deactivated successfully.
Nov 23 09:31:24 np0005532585.localdomain podman[239375]: 2025-11-23 09:31:24.416771726 +0000 UTC m=+0.191910434 container cleanup a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:31:24 np0005532585.localdomain podman[239375]: node_exporter
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 23 09:31:24 np0005532585.localdomain podman[239402]: 2025-11-23 09:31:24.513507815 +0000 UTC m=+0.066044589 container cleanup a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:31:24 np0005532585.localdomain podman[239402]: node_exporter
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Stopped node_exporter container.
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Starting node_exporter container...
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:31:24 np0005532585.localdomain podman[239412]: 2025-11-23 09:31:24.67712506 +0000 UTC m=+0.131660405 container init a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.691Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.691Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.691Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.692Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.693Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.693Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 23 09:31:24 np0005532585.localdomain rsyslogd[760]: imjournal from <localhost:node_exporter>: begin to drop messages due to rate-limiting
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.694Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:117 level=info collector=arp
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:117 level=info collector=bcache
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:117 level=info collector=bonding
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.695Z caller=node_exporter.go:117 level=info collector=cpu
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=edac
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=filefd
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.696Z caller=node_exporter.go:117 level=info collector=netclass
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=netdev
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=netstat
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=nfs
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=nvme
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=softnet
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=systemd
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=xfs
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.697Z caller=node_exporter.go:117 level=info collector=zfs
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.698Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 23 09:31:24 np0005532585.localdomain node_exporter[239427]: ts=2025-11-23T09:31:24.699Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:31:24 np0005532585.localdomain podman[239412]: 2025-11-23 09:31:24.71764201 +0000 UTC m=+0.172177355 container start a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:31:24 np0005532585.localdomain podman[239412]: node_exporter
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: Started node_exporter container.
Nov 23 09:31:24 np0005532585.localdomain sudo[239369]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:24 np0005532585.localdomain podman[239436]: 2025-11-23 09:31:24.808192326 +0000 UTC m=+0.087037157 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:31:24 np0005532585.localdomain podman[239436]: 2025-11-23 09:31:24.819464629 +0000 UTC m=+0.098309510 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:31:24 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:31:26 np0005532585.localdomain sudo[239566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbhxdideeimqkkmrdadimbdzkibqrnpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890285.9363298-1800-137308862473395/AnsiballZ_stat.py
Nov 23 09:31:26 np0005532585.localdomain sudo[239566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:26 np0005532585.localdomain python3.9[239568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:31:26 np0005532585.localdomain sudo[239566]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:27 np0005532585.localdomain sudo[239654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmrlqgkbinwwwfnafjbbystphomtcmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890285.9363298-1800-137308862473395/AnsiballZ_copy.py
Nov 23 09:31:27 np0005532585.localdomain sudo[239654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24358 DF PROTO=TCP SPT=50984 DPT=9882 SEQ=2932324281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D136200000000001030307) 
Nov 23 09:31:27 np0005532585.localdomain python3.9[239656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890285.9363298-1800-137308862473395/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:31:27 np0005532585.localdomain sudo[239654]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:28 np0005532585.localdomain sudo[239764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiacrrblnriboyekuboerinahfpnuuzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890287.7975168-1851-207871263997255/AnsiballZ_container_config_data.py
Nov 23 09:31:28 np0005532585.localdomain sudo[239764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:28 np0005532585.localdomain python3.9[239766]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 23 09:31:28 np0005532585.localdomain sudo[239764]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:29.316 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:29.317 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:29.318 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:29.318 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:29.350 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:29.350 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:29 np0005532585.localdomain sudo[239874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqrzbexozggbgalnmtfikycylhqdkhgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890289.0906768-1878-241081202457798/AnsiballZ_container_config_hash.py
Nov 23 09:31:29 np0005532585.localdomain sudo[239874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:29 np0005532585.localdomain python3.9[239876]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:31:29 np0005532585.localdomain sudo[239874]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=722 DF PROTO=TCP SPT=43404 DPT=9101 SEQ=3570143108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D13FFA0000000001030307) 
Nov 23 09:31:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15378 DF PROTO=TCP SPT=48220 DPT=9105 SEQ=1339596618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D140AB0000000001030307) 
Nov 23 09:31:30 np0005532585.localdomain sudo[239984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fehicclqczsrpxmqvpihdonvebxhknnb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890289.949643-1908-4142070621977/AnsiballZ_edpm_container_manage.py
Nov 23 09:31:30 np0005532585.localdomain sudo[239984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:31:30 np0005532585.localdomain podman[239987]: 2025-11-23 09:31:30.331644061 +0000 UTC m=+0.082981031 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:31:30 np0005532585.localdomain podman[239987]: 2025-11-23 09:31:30.342415818 +0000 UTC m=+0.093752748 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:31:30 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:31:30 np0005532585.localdomain python3[239986]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:31:32 np0005532585.localdomain podman[240018]: 2025-11-23 09:31:30.707125973 +0000 UTC m=+0.094692548 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 23 09:31:32 np0005532585.localdomain podman[240087]: 
Nov 23 09:31:32 np0005532585.localdomain podman[240087]: 2025-11-23 09:31:32.632026721 +0000 UTC m=+0.079867762 container create 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Nov 23 09:31:32 np0005532585.localdomain podman[240087]: 2025-11-23 09:31:32.5952661 +0000 UTC m=+0.043107131 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 23 09:31:32 np0005532585.localdomain python3[239986]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 23 09:31:32 np0005532585.localdomain sudo[239984]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=724 DF PROTO=TCP SPT=43404 DPT=9101 SEQ=3570143108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D14C210000000001030307) 
Nov 23 09:31:33 np0005532585.localdomain sudo[240233]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmdnwpkacppamyrimudehmueafbpffbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890293.0801477-1933-162381488820677/AnsiballZ_stat.py
Nov 23 09:31:33 np0005532585.localdomain sudo[240233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:33 np0005532585.localdomain python3.9[240235]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:31:33 np0005532585.localdomain sudo[240233]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:34 np0005532585.localdomain sudo[240345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjmecjsyshavltbcvlzsbeirlwvdissd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890293.8554335-1959-234921848596327/AnsiballZ_file.py
Nov 23 09:31:34 np0005532585.localdomain sudo[240345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:34 np0005532585.localdomain python3.9[240347]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:31:34 np0005532585.localdomain sudo[240345]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:34.351 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:34.353 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:34.353 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:34.353 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:34.394 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:34.395 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:34 np0005532585.localdomain sudo[240454]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rizskkutwskiqmwocvseftrbqsvxwfgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890294.4225845-1959-213076799167240/AnsiballZ_copy.py
Nov 23 09:31:34 np0005532585.localdomain sudo[240454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:35 np0005532585.localdomain python3.9[240456]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890294.4225845-1959-213076799167240/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:31:35 np0005532585.localdomain sudo[240454]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:35 np0005532585.localdomain sudo[240509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdyylwgpeavpsxpbmrkuhiqkvydyirwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890294.4225845-1959-213076799167240/AnsiballZ_systemd.py
Nov 23 09:31:35 np0005532585.localdomain sudo[240509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:35 np0005532585.localdomain python3.9[240511]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:31:35 np0005532585.localdomain systemd-rc-local-generator[240539]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:31:35 np0005532585.localdomain systemd-sysv-generator[240543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:35 np0005532585.localdomain sudo[240509]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:35 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30844 DF PROTO=TCP SPT=52784 DPT=9100 SEQ=1354231002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D158200000000001030307) 
Nov 23 09:31:36 np0005532585.localdomain sudo[240599]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atyaudvqumfxrwdiwlvvixottdvzqxuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890294.4225845-1959-213076799167240/AnsiballZ_systemd.py
Nov 23 09:31:36 np0005532585.localdomain sudo[240599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:36 np0005532585.localdomain python3.9[240601]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:31:36 np0005532585.localdomain systemd-rc-local-generator[240626]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:31:36 np0005532585.localdomain systemd-sysv-generator[240631]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:31:36 np0005532585.localdomain systemd[1]: Starting podman_exporter container...
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:31:37 np0005532585.localdomain podman[240642]: 2025-11-23 09:31:37.130849928 +0000 UTC m=+0.179459513 container init 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: tmp-crun.Atmz1O.mount: Deactivated successfully.
Nov 23 09:31:37 np0005532585.localdomain podman_exporter[240656]: ts=2025-11-23T09:31:37.154Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 23 09:31:37 np0005532585.localdomain podman_exporter[240656]: ts=2025-11-23T09:31:37.154Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 23 09:31:37 np0005532585.localdomain podman_exporter[240656]: ts=2025-11-23T09:31:37.154Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 23 09:31:37 np0005532585.localdomain podman_exporter[240656]: ts=2025-11-23T09:31:37.154Z caller=handler.go:105 level=info collector=container
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:31:37 np0005532585.localdomain podman[240642]: 2025-11-23 09:31:37.169137857 +0000 UTC m=+0.217747442 container start 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:31:37 np0005532585.localdomain podman[240642]: podman_exporter
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: Starting Podman API Service...
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: Started Podman API Service.
Nov 23 09:31:37 np0005532585.localdomain systemd[1]: Started podman_exporter container.
Nov 23 09:31:37 np0005532585.localdomain sudo[240599]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:37Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:37Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:37Z" level=info msg="Setting parallel job count to 25"
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:37Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:37Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:31:37 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 23 09:31:37 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:37Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:31:37 np0005532585.localdomain podman[240667]: 2025-11-23 09:31:37.264119753 +0000 UTC m=+0.088616577 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:31:37 np0005532585.localdomain podman[240667]: 2025-11-23 09:31:37.347520525 +0000 UTC m=+0.172017319 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:37 np0005532585.localdomain podman[240667]: unhealthy
Nov 23 09:31:37 np0005532585.localdomain sudo[240812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoolommamajuaasxjfpvfrmeggzungim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890297.6702917-2031-271959718826363/AnsiballZ_systemd.py
Nov 23 09:31:37 np0005532585.localdomain sudo[240812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:31:38 np0005532585.localdomain python3.9[240814]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: Stopping podman_exporter container...
Nov 23 09:31:38 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:31:37 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1"
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: libpod-1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.scope: Deactivated successfully.
Nov 23 09:31:38 np0005532585.localdomain podman[240818]: 2025-11-23 09:31:38.400269453 +0000 UTC m=+0.073778322 container died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.timer: Deactivated successfully.
Nov 23 09:31:38 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: tmp-crun.Wiuk2A.mount: Deactivated successfully.
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-812143adf77fd5b73b00cfd1538061cd024cce3a256f4d841aa84ac73b980222-merged.mount: Deactivated successfully.
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e-userdata-shm.mount: Deactivated successfully.
Nov 23 09:31:39 np0005532585.localdomain podman[240818]: 2025-11-23 09:31:39.03223931 +0000 UTC m=+0.705748129 container cleanup 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:39 np0005532585.localdomain podman[240818]: podman_exporter
Nov 23 09:31:39 np0005532585.localdomain podman[240832]: 2025-11-23 09:31:39.043301586 +0000 UTC m=+0.637628715 container cleanup 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:31:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27977 DF PROTO=TCP SPT=48968 DPT=9100 SEQ=1905920573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D164E10000000001030307) 
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:31:39 np0005532585.localdomain systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 23 09:31:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:39.396 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:39.398 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:39.398 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:39.398 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:39.432 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:39.432 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:39 np0005532585.localdomain podman[240846]: 2025-11-23 09:31:39.450562903 +0000 UTC m=+0.094413918 container cleanup 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:31:39 np0005532585.localdomain podman[240846]: podman_exporter
Nov 23 09:31:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:31:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:31:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3-merged.mount: Deactivated successfully.
Nov 23 09:31:40 np0005532585.localdomain systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 23 09:31:40 np0005532585.localdomain systemd[1]: Stopped podman_exporter container.
Nov 23 09:31:40 np0005532585.localdomain systemd[1]: Starting podman_exporter container...
Nov 23 09:31:40 np0005532585.localdomain podman[240857]: 2025-11-23 09:31:40.129198562 +0000 UTC m=+0.091995643 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:31:40 np0005532585.localdomain podman[240857]: 2025-11-23 09:31:40.162437913 +0000 UTC m=+0.125235044 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:31:40 np0005532585.localdomain podman[240857]: unhealthy
Nov 23 09:31:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7088 DF PROTO=TCP SPT=50040 DPT=9882 SEQ=168501068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D16F270000000001030307) 
Nov 23 09:31:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:31:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:31:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:31:42 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:31:42 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:31:42 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:31:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:31:42 np0005532585.localdomain podman[240869]: 2025-11-23 09:31:42.219008685 +0000 UTC m=+2.131746068 container init 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:42 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:31:42.237Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 23 09:31:42 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:31:42.237Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 23 09:31:42 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:31:42 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 23 09:31:42 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:42Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:31:42 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:31:42.237Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 23 09:31:42 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:31:42.237Z caller=handler.go:105 level=info collector=container
Nov 23 09:31:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:31:42 np0005532585.localdomain podman[240869]: 2025-11-23 09:31:42.256276664 +0000 UTC m=+2.169014027 container start 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:42 np0005532585.localdomain podman[240869]: podman_exporter
Nov 23 09:31:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:31:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:31:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:31:44 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:31:44 np0005532585.localdomain systemd[1]: Started podman_exporter container.
Nov 23 09:31:44 np0005532585.localdomain podman[240899]: 2025-11-23 09:31:44.323237171 +0000 UTC m=+2.060961641 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:44 np0005532585.localdomain sudo[240812]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:44 np0005532585.localdomain podman[240899]: 2025-11-23 09:31:44.354749969 +0000 UTC m=+2.092474459 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:31:44 np0005532585.localdomain podman[240899]: unhealthy
Nov 23 09:31:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:44.434 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:44.436 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:44.436 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:44.436 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:44.475 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:44.475 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:44 np0005532585.localdomain sudo[241028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkhxvzgqfgrejrluvdlcnubqgdclyiqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890304.5192266-2055-152537022365494/AnsiballZ_stat.py
Nov 23 09:31:44 np0005532585.localdomain sudo[241028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7090 DF PROTO=TCP SPT=50040 DPT=9882 SEQ=168501068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D17B200000000001030307) 
Nov 23 09:31:44 np0005532585.localdomain python3.9[241030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:31:44 np0005532585.localdomain sudo[241028]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:31:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:31:45 np0005532585.localdomain sudo[241116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlgmtauasuvpyzgezuuwscmlitlratbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890304.5192266-2055-152537022365494/AnsiballZ_copy.py
Nov 23 09:31:45 np0005532585.localdomain sudo[241116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:31:45 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:31:45 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:31:45 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:31:45 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:31:45 np0005532585.localdomain python3.9[241118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890304.5192266-2055-152537022365494/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:31:45 np0005532585.localdomain sudo[241116]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:45 np0005532585.localdomain podman[240668]: time="2025-11-23T09:31:45Z" level=error msg="Getting root fs size for \"019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Nov 23 09:31:45 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:31:45 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:31:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:31:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:31:46 np0005532585.localdomain podman[241195]: 2025-11-23 09:31:46.284786108 +0000 UTC m=+0.086812441 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:31:46 np0005532585.localdomain sudo[241238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xegelbkordebjayhqehyzbotiumwgkua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890306.033101-2106-204734704202995/AnsiballZ_container_config_data.py
Nov 23 09:31:46 np0005532585.localdomain sudo[241238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:31:46 np0005532585.localdomain podman[241195]: 2025-11-23 09:31:46.34299241 +0000 UTC m=+0.145018793 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:31:46 np0005532585.localdomain python3.9[241251]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 23 09:31:46 np0005532585.localdomain sudo[241238]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:47 np0005532585.localdomain sudo[241359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubjiysgdgnbvdpregiljfarflplrtist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890306.7838674-2133-12749259775353/AnsiballZ_container_config_hash.py
Nov 23 09:31:47 np0005532585.localdomain sudo[241359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:47 np0005532585.localdomain sudo[241361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:31:47 np0005532585.localdomain sudo[241361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:31:47 np0005532585.localdomain sudo[241361]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:47 np0005532585.localdomain sudo[241380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:31:47 np0005532585.localdomain sudo[241380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:31:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:31:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3-merged.mount: Deactivated successfully.
Nov 23 09:31:47 np0005532585.localdomain python3.9[241375]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:31:47 np0005532585.localdomain sudo[241359]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:48 np0005532585.localdomain sudo[241518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqxdzewpqyunjvygvwdywofhpyqxuale ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890307.7380922-2163-57499906587359/AnsiballZ_edpm_container_manage.py
Nov 23 09:31:48 np0005532585.localdomain sudo[241518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9cd49e7d070c0c00c09eb4d4067ba3757eea8902a6a55fed079db1124e7c8aad-merged.mount: Deactivated successfully.
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9cd49e7d070c0c00c09eb4d4067ba3757eea8902a6a55fed079db1124e7c8aad-merged.mount: Deactivated successfully.
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:31:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36413 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2246745688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D188200000000001030307) 
Nov 23 09:31:48 np0005532585.localdomain python3[241520]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 23 09:31:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 23 09:31:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:49.477 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:49.479 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:49.479 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:49.479 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:49.506 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:49.507 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:31:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:31:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:31:50 np0005532585.localdomain podman[241534]: 2025-11-23 09:31:50.302092611 +0000 UTC m=+0.360777363 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:31:50 np0005532585.localdomain podman[241534]: 2025-11-23 09:31:50.340510093 +0000 UTC m=+0.399194895 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:31:50 np0005532585.localdomain sudo[241380]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:51 np0005532585.localdomain sudo[241582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:31:51 np0005532585.localdomain sudo[241582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:31:51 np0005532585.localdomain sudo[241582]: pam_unix(sudo:session): session closed for user root
Nov 23 09:31:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:31:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:31:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:31:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:31:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:31:52 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:31:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:31:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:31:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:31:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:31:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63876 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3280752926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D19C210000000001030307) 
Nov 23 09:31:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:31:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:54.508 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:54.509 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:31:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:54.510 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:31:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:54.510 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:54.546 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:54.546 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:31:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:31:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:31:55 np0005532585.localdomain podman[241601]: 2025-11-23 09:31:55.023882422 +0000 UTC m=+0.076596341 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:31:55 np0005532585.localdomain podman[241601]: 2025-11-23 09:31:55.062159971 +0000 UTC m=+0.114873890 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:31:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:31:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:31:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:31:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:31:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9cd49e7d070c0c00c09eb4d4067ba3757eea8902a6a55fed079db1124e7c8aad-merged.mount: Deactivated successfully.
Nov 23 09:31:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9cd49e7d070c0c00c09eb4d4067ba3757eea8902a6a55fed079db1124e7c8aad-merged.mount: Deactivated successfully.
Nov 23 09:31:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7092 DF PROTO=TCP SPT=50040 DPT=9882 SEQ=168501068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1AC200000000001030307) 
Nov 23 09:31:57 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:31:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:31:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 23 09:31:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 23 09:31:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:31:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:31:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:31:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:31:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:31:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:31:59.546 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:31:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 23 09:31:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21946 DF PROTO=TCP SPT=36448 DPT=9101 SEQ=1243181516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1B5290000000001030307) 
Nov 23 09:31:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28778 DF PROTO=TCP SPT=45074 DPT=9105 SEQ=623182031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1B5DB0000000001030307) 
Nov 23 09:32:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:32:00 np0005532585.localdomain podman[241636]: 2025-11-23 09:32:00.498229807 +0000 UTC m=+0.083556618 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 23 09:32:00 np0005532585.localdomain podman[241636]: 2025-11-23 09:32:00.512212935 +0000 UTC m=+0.097539766 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:32:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:00 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:32:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:32:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:32:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:02 np0005532585.localdomain podman[241548]: 2025-11-23 09:31:52.178878702 +0000 UTC m=+1.881680016 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 23 09:32:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28780 DF PROTO=TCP SPT=45074 DPT=9105 SEQ=623182031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1C1E00000000001030307) 
Nov 23 09:32:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483-merged.mount: Deactivated successfully.
Nov 23 09:32:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483-merged.mount: Deactivated successfully.
Nov 23 09:32:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:04.549 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:04.550 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:04.550 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:04.551 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:04.584 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:04.584 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:32:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:32:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63978 DF PROTO=TCP SPT=51130 DPT=9100 SEQ=477593843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1CE210000000001030307) 
Nov 23 09:32:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:06.692 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:06.693 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:06.715 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:06.715 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:32:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:06.716 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:32:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:07.590 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:32:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:07.590 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:32:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:07.591 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:32:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:07.591 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:32:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:08 np0005532585.localdomain podman[241701]: 2025-11-23 09:32:07.086456715 +0000 UTC m=+0.035524985 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 23 09:32:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:32:09.242 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:32:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:32:09.243 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:32:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:32:09.244 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:32:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17073 DF PROTO=TCP SPT=36106 DPT=9100 SEQ=1207958549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1DA200000000001030307) 
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.586 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.587 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.588 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.588 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.622 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.622 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.783 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.799 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.799 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.799 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.800 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.800 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.800 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.801 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.801 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.801 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.802 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.817 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.818 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.818 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.818 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:32:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:09.819 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:32:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.281 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.341 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.342 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.535 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.537 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12614MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.537 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.537 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:32:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.654 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.654 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.655 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:32:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:10.709 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:32:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:10 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:10 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:11.160 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:32:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:11.164 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:32:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:11.175 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:32:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:11.177 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:32:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:11.177 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:32:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36062 DF PROTO=TCP SPT=50226 DPT=9882 SEQ=4004665552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1E4560000000001030307) 
Nov 23 09:32:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:32:13 np0005532585.localdomain podman[241757]: 2025-11-23 09:32:13.038848785 +0000 UTC m=+0.089451523 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:32:13 np0005532585.localdomain podman[241757]: 2025-11-23 09:32:13.071311062 +0000 UTC m=+0.121913810 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 09:32:13 np0005532585.localdomain podman[241757]: unhealthy
Nov 23 09:32:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483-merged.mount: Deactivated successfully.
Nov 23 09:32:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f-merged.mount: Deactivated successfully.
Nov 23 09:32:13 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:13 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:13 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:32:13 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.623 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.624 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.624 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.624 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.624 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.625 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:14.626 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36064 DF PROTO=TCP SPT=50226 DPT=9882 SEQ=4004665552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1F0600000000001030307) 
Nov 23 09:32:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:32:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:16 np0005532585.localdomain podman[240668]: time="2025-11-23T09:32:16Z" level=error msg="Getting root fs size for \"108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08\": getting diffsize of layer \"3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": unmounting layer 3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34: replacing mount point \"/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged\": device or resource busy"
Nov 23 09:32:16 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:16 np0005532585.localdomain podman[241777]: 2025-11-23 09:32:16.04424999 +0000 UTC m=+0.097673200 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:32:16 np0005532585.localdomain podman[241777]: 2025-11-23 09:32:16.051792426 +0000 UTC m=+0.105215616 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:32:16 np0005532585.localdomain podman[241777]: unhealthy
Nov 23 09:32:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63878 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3280752926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D1FC200000000001030307) 
Nov 23 09:32:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f-merged.mount: Deactivated successfully.
Nov 23 09:32:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:32:18 np0005532585.localdomain podman[241701]: 
Nov 23 09:32:18 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:32:18 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:32:18 np0005532585.localdomain podman[241701]: 2025-11-23 09:32:18.752490458 +0000 UTC m=+11.701558668 container create ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 23 09:32:18 np0005532585.localdomain podman[241799]: 2025-11-23 09:32:18.779728405 +0000 UTC m=+0.136164972 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 23 09:32:18 np0005532585.localdomain podman[241799]: 2025-11-23 09:32:18.862406245 +0000 UTC m=+0.218842872 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:32:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:19.626 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:19.630 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:20 np0005532585.localdomain python3[241520]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 23 09:32:20 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:32:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:32:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:22 np0005532585.localdomain sudo[241518]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:22 np0005532585.localdomain podman[241846]: 2025-11-23 09:32:22.752467965 +0000 UTC m=+0.426424790 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:32:22 np0005532585.localdomain podman[241846]: 2025-11-23 09:32:22.788337933 +0000 UTC m=+0.462294768 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 23 09:32:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1373 DF PROTO=TCP SPT=46600 DPT=9102 SEQ=296611411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D211610000000001030307) 
Nov 23 09:32:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:24.666 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:24.667 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:24.667 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:24.667 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:24.668 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:24.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11-merged.mount: Deactivated successfully.
Nov 23 09:32:25 np0005532585.localdomain sudo[241970]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpjtzifxnvhovtgzeldfbznvjbpbleop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890344.8183534-2187-203790942089082/AnsiballZ_stat.py
Nov 23 09:32:25 np0005532585.localdomain sudo[241970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11-merged.mount: Deactivated successfully.
Nov 23 09:32:25 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:32:25 np0005532585.localdomain python3.9[241972]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:32:25 np0005532585.localdomain sudo[241970]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:25 np0005532585.localdomain sudo[242082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlwphvmqefofymctksnmxphlnefboakz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890345.5683746-2214-52286759486733/AnsiballZ_file.py
Nov 23 09:32:25 np0005532585.localdomain sudo[242082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:26 np0005532585.localdomain python3.9[242084]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:32:26 np0005532585.localdomain sudo[242082]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:26 np0005532585.localdomain sudo[242191]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcaeshxlddfzwnambcbfourlkycuenol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890346.0857327-2214-23808154433438/AnsiballZ_copy.py
Nov 23 09:32:26 np0005532585.localdomain sudo[242191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:26 np0005532585.localdomain python3.9[242193]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890346.0857327-2214-23808154433438/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:32:26 np0005532585.localdomain sudo[242191]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:26 np0005532585.localdomain sudo[242246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhyawkzhsrbqmnznfzepqnvqexuaeldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890346.0857327-2214-23808154433438/AnsiballZ_systemd.py
Nov 23 09:32:26 np0005532585.localdomain sudo[242246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36066 DF PROTO=TCP SPT=50226 DPT=9882 SEQ=4004665552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D220210000000001030307) 
Nov 23 09:32:27 np0005532585.localdomain python3.9[242248]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:32:27 np0005532585.localdomain systemd-rc-local-generator[242269]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:32:27 np0005532585.localdomain systemd-sysv-generator[242276]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:32:27 np0005532585.localdomain sudo[242246]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:27 np0005532585.localdomain podman[242285]: 2025-11-23 09:32:27.730306189 +0000 UTC m=+0.077818688 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:32:27 np0005532585.localdomain podman[242285]: 2025-11-23 09:32:27.769290235 +0000 UTC m=+0.116802704 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:32:27 np0005532585.localdomain sudo[242361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzwzaqknbrpspmnocaqqptovsddzylkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890346.0857327-2214-23808154433438/AnsiballZ_systemd.py
Nov 23 09:32:27 np0005532585.localdomain sudo[242361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:28 np0005532585.localdomain python3.9[242363]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:32:28 np0005532585.localdomain systemd-rc-local-generator[242388]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:32:28 np0005532585.localdomain systemd-sysv-generator[242396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:32:28 np0005532585.localdomain systemd[1]: Starting openstack_network_exporter container...
Nov 23 09:32:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:29 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:32:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:29.671 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:29.672 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:29.672 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:29.673 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:29.673 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:29.677 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14712 DF PROTO=TCP SPT=35598 DPT=9101 SEQ=1262218756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D22A5A0000000001030307) 
Nov 23 09:32:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28566 DF PROTO=TCP SPT=41530 DPT=9105 SEQ=382369090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D22B0B0000000001030307) 
Nov 23 09:32:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:30 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:30 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:30 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:32:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:32:30 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8033d9a10fae36560bce74d30fda0f9245d578587a49936e30f7c1bd819e3d5c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 09:32:30 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8033d9a10fae36560bce74d30fda0f9245d578587a49936e30f7c1bd819e3d5c/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 23 09:32:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:32:30 np0005532585.localdomain podman[242404]: 2025-11-23 09:32:30.798955561 +0000 UTC m=+2.183788339 container init ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *bridge.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *coverage.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *datapath.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *iface.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *memory.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *ovnnorthd.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *ovn.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *ovsdbserver.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *pmd_perf.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *pmd_rxq.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: INFO    09:32:30 main.go:48: registering *vswitch.Collector
Nov 23 09:32:30 np0005532585.localdomain openstack_network_exporter[242417]: NOTICE  09:32:30 main.go:82: listening on http://:9105/metrics
Nov 23 09:32:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:32:30 np0005532585.localdomain podman[242420]: 2025-11-23 09:32:30.869033975 +0000 UTC m=+0.094234755 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:32:30 np0005532585.localdomain podman[242404]: 2025-11-23 09:32:30.883238101 +0000 UTC m=+2.268070849 container start ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:32:30 np0005532585.localdomain podman[242404]: openstack_network_exporter
Nov 23 09:32:30 np0005532585.localdomain podman[242420]: 2025-11-23 09:32:30.902004471 +0000 UTC m=+0.127205271 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Nov 23 09:32:30 np0005532585.localdomain podman[242440]: 2025-11-23 09:32:30.913756461 +0000 UTC m=+0.069685282 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350)
Nov 23 09:32:30 np0005532585.localdomain podman[242440]: 2025-11-23 09:32:30.972338993 +0000 UTC m=+0.128267794 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Nov 23 09:32:31 np0005532585.localdomain podman[240668]: time="2025-11-23T09:32:31Z" level=error msg="Getting root fs size for \"11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 23 09:32:31 np0005532585.localdomain systemd[1]: Started openstack_network_exporter container.
Nov 23 09:32:31 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:32:31 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:32:31 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:31 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:31 np0005532585.localdomain sudo[242361]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14714 DF PROTO=TCP SPT=35598 DPT=9101 SEQ=1262218756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D236610000000001030307) 
Nov 23 09:32:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11-merged.mount: Deactivated successfully.
Nov 23 09:32:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:34 np0005532585.localdomain sudo[242576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytjzrcygmqoieomkfyuyngipgckeejwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890353.7666361-2286-262726770993493/AnsiballZ_systemd.py
Nov 23 09:32:34 np0005532585.localdomain sudo[242576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:34 np0005532585.localdomain python3.9[242578]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:32:34 np0005532585.localdomain systemd[1]: Stopping openstack_network_exporter container...
Nov 23 09:32:34 np0005532585.localdomain systemd[1]: libpod-ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.scope: Deactivated successfully.
Nov 23 09:32:34 np0005532585.localdomain podman[242582]: 2025-11-23 09:32:34.486698199 +0000 UTC m=+0.066530663 container died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 09:32:34 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.timer: Deactivated successfully.
Nov 23 09:32:34 np0005532585.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:32:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ff643707b40d1f4d028ddd9677b38d4c09285d4aaee2a2759e81caea717feb00-merged.mount: Deactivated successfully.
Nov 23 09:32:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9-userdata-shm.mount: Deactivated successfully.
Nov 23 09:32:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:34.678 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:34.679 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:34.680 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:34.680 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:34.702 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:34.702 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:35 np0005532585.localdomain sshd[242608]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:32:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27980 DF PROTO=TCP SPT=48968 DPT=9100 SEQ=1905920573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D244200000000001030307) 
Nov 23 09:32:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:36 np0005532585.localdomain podman[240668]: time="2025-11-23T09:32:36Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged: invalid argument"
Nov 23 09:32:36 np0005532585.localdomain podman[240668]: time="2025-11-23T09:32:36Z" level=error msg="Getting root fs size for \"1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f\": getting diffsize of layer \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\" and its parent \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\": creating overlay mount to /var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/NFUIYHX665CHWW36YXSJIBWYTL:/var/lib/containers/storage/overlay/l/JFEBLJKSZVKBBKIZ32WYGWWNT6:/var/lib/containers/storage/overlay/l/5TMMTLYMJES7T4R72OOYVFFCPF,upperdir=/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/diff,workdir=/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/work,nodev,metacopy=on\": no such file or directory"
Nov 23 09:32:36 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:36 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:37 np0005532585.localdomain sshd[242608]: Invalid user ubuntu from 117.5.148.56 port 59468
Nov 23 09:32:37 np0005532585.localdomain sshd[242608]: Received disconnect from 117.5.148.56 port 59468:11:  [preauth]
Nov 23 09:32:37 np0005532585.localdomain sshd[242608]: Disconnected from invalid user ubuntu 117.5.148.56 port 59468 [preauth]
Nov 23 09:32:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11041 DF PROTO=TCP SPT=46110 DPT=9100 SEQ=3886817616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D24F600000000001030307) 
Nov 23 09:32:39 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1cc40e2e89ec91462a29529e884582c68bbf460125466873b1904744d0a186a6-merged.mount: Deactivated successfully.
Nov 23 09:32:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 23 09:32:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Nov 23 09:32:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:39.703 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:39.705 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:39.705 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:39.706 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:39.736 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:39.737 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 23 09:32:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:32:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:32:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:32:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43158 DF PROTO=TCP SPT=60666 DPT=9882 SEQ=3454281802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D259860000000001030307) 
Nov 23 09:32:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:32:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:32:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:32:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:32:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ff643707b40d1f4d028ddd9677b38d4c09285d4aaee2a2759e81caea717feb00-merged.mount: Deactivated successfully.
Nov 23 09:32:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:32:43 np0005532585.localdomain podman[242610]: 2025-11-23 09:32:43.724236096 +0000 UTC m=+0.090711094 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:32:43 np0005532585.localdomain podman[242610]: 2025-11-23 09:32:43.755427347 +0000 UTC m=+0.121902375 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:32:43 np0005532585.localdomain podman[242610]: unhealthy
Nov 23 09:32:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:32:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:32:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:44.738 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:44 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:44.739 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43160 DF PROTO=TCP SPT=60666 DPT=9882 SEQ=3454281802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D265A10000000001030307) 
Nov 23 09:32:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8033d9a10fae36560bce74d30fda0f9245d578587a49936e30f7c1bd819e3d5c-merged.mount: Deactivated successfully.
Nov 23 09:32:46 np0005532585.localdomain podman[242582]: 2025-11-23 09:32:46.158781029 +0000 UTC m=+11.738613453 container cleanup ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:32:46 np0005532585.localdomain podman[242582]: openstack_network_exporter
Nov 23 09:32:46 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:32:46 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:32:46 np0005532585.localdomain systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 23 09:32:46 np0005532585.localdomain podman[242628]: 2025-11-23 09:32:46.26568291 +0000 UTC m=+0.077345593 container cleanup ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Nov 23 09:32:46 np0005532585.localdomain podman[242628]: openstack_network_exporter
Nov 23 09:32:46 np0005532585.localdomain sshd[242640]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:32:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1375 DF PROTO=TCP SPT=46600 DPT=9102 SEQ=296611411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D272200000000001030307) 
Nov 23 09:32:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:32:48 np0005532585.localdomain systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 23 09:32:48 np0005532585.localdomain systemd[1]: Stopped openstack_network_exporter container.
Nov 23 09:32:48 np0005532585.localdomain systemd[1]: Starting openstack_network_exporter container...
Nov 23 09:32:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:32:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:32:49 np0005532585.localdomain sshd[242640]: Invalid user blank from 182.79.86.102 port 38830
Nov 23 09:32:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:32:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:32:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8033d9a10fae36560bce74d30fda0f9245d578587a49936e30f7c1bd819e3d5c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 09:32:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8033d9a10fae36560bce74d30fda0f9245d578587a49936e30f7c1bd819e3d5c/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 23 09:32:49 np0005532585.localdomain podman[242655]: 2025-11-23 09:32:49.466031412 +0000 UTC m=+0.562070025 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:32:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:32:49 np0005532585.localdomain podman[242642]: 2025-11-23 09:32:49.477604906 +0000 UTC m=+1.156136064 container init ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *bridge.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *coverage.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *datapath.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *iface.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *memory.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *ovnnorthd.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *ovn.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *ovsdbserver.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *pmd_perf.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *pmd_rxq.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: INFO    09:32:49 main.go:48: registering *vswitch.Collector
Nov 23 09:32:49 np0005532585.localdomain openstack_network_exporter[242668]: NOTICE  09:32:49 main.go:82: listening on http://:9105/metrics
Nov 23 09:32:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:32:49 np0005532585.localdomain podman[242642]: 2025-11-23 09:32:49.514613359 +0000 UTC m=+1.193144517 container start ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 09:32:49 np0005532585.localdomain podman[242642]: openstack_network_exporter
Nov 23 09:32:49 np0005532585.localdomain podman[242655]: 2025-11-23 09:32:49.515142167 +0000 UTC m=+0.611180740 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:32:49 np0005532585.localdomain podman[242655]: unhealthy
Nov 23 09:32:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:49.741 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:49.743 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:49.743 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:49.743 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:49.772 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:49 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:49.772 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:50 np0005532585.localdomain sshd[242640]: Connection closed by invalid user blank 182.79.86.102 port 38830 [preauth]
Nov 23 09:32:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:32:51 np0005532585.localdomain sudo[242714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:32:51 np0005532585.localdomain sudo[242714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:32:51 np0005532585.localdomain sudo[242714]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:51 np0005532585.localdomain sudo[242732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:32:51 np0005532585.localdomain sudo[242732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:32:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:32:51 np0005532585.localdomain systemd[1]: Started openstack_network_exporter container.
Nov 23 09:32:51 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:32:51 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:32:51 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:51 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:51 np0005532585.localdomain podman[242689]: 2025-11-23 09:32:51.626188837 +0000 UTC m=+2.111155774 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:32:51 np0005532585.localdomain sudo[242576]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:51 np0005532585.localdomain podman[242701]: 2025-11-23 09:32:51.661796017 +0000 UTC m=+0.717203833 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:32:51 np0005532585.localdomain podman[242689]: 2025-11-23 09:32:51.667615669 +0000 UTC m=+2.152582586 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Nov 23 09:32:51 np0005532585.localdomain podman[242701]: 2025-11-23 09:32:51.720303496 +0000 UTC m=+0.775711332 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 09:32:52 np0005532585.localdomain sudo[242892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-garpuerwaincfucwvgyqnekzwwalrvmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890371.8207984-2310-65606264113128/AnsiballZ_find.py
Nov 23 09:32:52 np0005532585.localdomain sudo[242892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:32:52 np0005532585.localdomain python3.9[242894]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:32:52 np0005532585.localdomain sudo[242892]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:52 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:32:52 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:32:52 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:52 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:32:52 np0005532585.localdomain systemd[1]: tmp-crun.wpvYSI.mount: Deactivated successfully.
Nov 23 09:32:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:32:52 np0005532585.localdomain sudo[242732]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:32:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40588 DF PROTO=TCP SPT=48436 DPT=9102 SEQ=1509193721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D286A00000000001030307) 
Nov 23 09:32:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:32:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:32:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-102a6fceace8ab36236a8b21c9b9829f6b100c83ca8f9b2c07c3282080c29222-merged.mount: Deactivated successfully.
Nov 23 09:32:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:54.775 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:54.778 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:54.779 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:32:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:54.779 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:54.814 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:54 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:54.815 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:32:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:32:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:32:55 np0005532585.localdomain sudo[242931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:32:55 np0005532585.localdomain sudo[242931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:32:55 np0005532585.localdomain sudo[242931]: pam_unix(sudo:session): session closed for user root
Nov 23 09:32:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:32:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:32:55 np0005532585.localdomain podman[242949]: 2025-11-23 09:32:55.309981631 +0000 UTC m=+0.089130794 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:32:55 np0005532585.localdomain podman[242949]: 2025-11-23 09:32:55.341230773 +0000 UTC m=+0.120379926 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 23 09:32:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:32:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1cc40e2e89ec91462a29529e884582c68bbf460125466873b1904744d0a186a6-merged.mount: Deactivated successfully.
Nov 23 09:32:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:32:56 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:32:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 23 09:32:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Nov 23 09:32:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Nov 23 09:32:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43162 DF PROTO=TCP SPT=60666 DPT=9882 SEQ=3454281802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D296200000000001030307) 
Nov 23 09:32:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1e604deea57dbda554a168861cff1238f93b8c6c69c863c43aed37d9d99c5fed-merged.mount: Deactivated successfully.
Nov 23 09:32:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 23 09:32:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully.
Nov 23 09:32:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:32:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully.
Nov 23 09:32:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully.
Nov 23 09:32:58 np0005532585.localdomain sshd[242966]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:32:58 np0005532585.localdomain sshd[242966]: error: kex_exchange_identification: Connection closed by remote host
Nov 23 09:32:58 np0005532585.localdomain sshd[242966]: Connection closed by 178.163.66.191 port 41026
Nov 23 09:32:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:32:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:32:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:32:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:32:59 np0005532585.localdomain sshd[242967]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:32:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:32:59 np0005532585.localdomain systemd[1]: tmp-crun.ZXnUpd.mount: Deactivated successfully.
Nov 23 09:32:59 np0005532585.localdomain podman[242969]: 2025-11-23 09:32:59.678149055 +0000 UTC m=+0.099486069 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:32:59 np0005532585.localdomain podman[242969]: 2025-11-23 09:32:59.711566565 +0000 UTC m=+0.132903609 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:32:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=603 DF PROTO=TCP SPT=42020 DPT=9101 SEQ=3842055596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D29F8A0000000001030307) 
Nov 23 09:32:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:59.816 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:32:59 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:32:59.819 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:32:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48515 DF PROTO=TCP SPT=59268 DPT=9105 SEQ=1751528930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2A03B0000000001030307) 
Nov 23 09:33:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:33:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:33:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:33:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:33:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:02 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:33:02 np0005532585.localdomain podman[242990]: 2025-11-23 09:33:02.699625953 +0000 UTC m=+0.750014385 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 09:33:02 np0005532585.localdomain podman[242990]: 2025-11-23 09:33:02.741331504 +0000 UTC m=+0.791719886 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Nov 23 09:33:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=605 DF PROTO=TCP SPT=42020 DPT=9101 SEQ=3842055596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2ABA00000000001030307) 
Nov 23 09:33:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:33:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:33:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:33:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:04 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:33:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:04.820 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:04.821 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:04.821 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:04.821 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:04.847 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:04.847 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:33:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:33:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:33:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17076 DF PROTO=TCP SPT=36106 DPT=9100 SEQ=1207958549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2B8200000000001030307) 
Nov 23 09:33:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:33:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41730 DF PROTO=TCP SPT=55048 DPT=9100 SEQ=2471495946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2C4600000000001030307) 
Nov 23 09:33:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:33:09.244 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:33:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:33:09.244 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:33:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:33:09.246 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:33:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89-merged.mount: Deactivated successfully.
Nov 23 09:33:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-102a6fceace8ab36236a8b21c9b9829f6b100c83ca8f9b2c07c3282080c29222-merged.mount: Deactivated successfully.
Nov 23 09:33:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:09.848 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:09.850 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:09.851 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:09.851 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:09.886 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:09.887 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:33:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:33:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.801 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.802 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.807 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6722af28-fa91-4d83-a272-74b827aad8fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.802921', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d2c573c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': 'c8194a80e2bc6d85bf2b01b03c0e00bf70325f46d8bc1d5d77b00fdcabb410e6'}]}, 'timestamp': '2025-11-23 09:33:10.808335', '_unique_id': '684db5a70fb14d9fad00f2c7ac4ff50e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.810 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.812 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.825 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.825 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d9173b3-45f7-4cde-aeaf-c7b6bf049374', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.812209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d2f0c16-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.98986707, 'message_signature': 'b4398561bf2371053c7efc28453c016f3c9f2607fc4e3f5ed41819ea22c14345'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.812209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d2f23ae-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.98986707, 'message_signature': 'be9203a8ee705852a86a0474d7b781cc0694b7cb0bbad21ba307788a9d06067f'}]}, 'timestamp': '2025-11-23 09:33:10.826522', '_unique_id': 'b8673ff2434f420eaad7b672ca859d8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.827 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.829 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd58ef5a8-f484-4e1f-82bd-dc3c2b64c6a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.829252', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d2fa1b2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': 'db5facda5d067bfea39153ec54658095bc3e53231694fb030ad226c31eb645e4'}]}, 'timestamp': '2025-11-23 09:33:10.829801', '_unique_id': '5faa4a588d2547a088bf943e0cabfd1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.830 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.832 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ded4911-795c-48c6-8e09-23fb3b92d63f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.832207', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d301886-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': '51739387cf084efd7e6e2e749c387a28ebc24790ee9a1992b9e004770902ad34'}]}, 'timestamp': '2025-11-23 09:33:10.832814', '_unique_id': '70824eaf7f9f4e97a983f5f2f44a4b89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.833 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.835 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.853 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3cafdd8-ca10-4c3d-b738-d47e9bbab360', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:33:10.835248', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6d335f50-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.03119555, 'message_signature': '9802f9e5c76c821f4b5a831f29a996cc25126a6e76d5bcac9c80ab497541c474'}]}, 'timestamp': '2025-11-23 09:33:10.854295', '_unique_id': '6d9be9c4a4374b3ab323ed9d06cbd247'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.856 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.896 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a411451-bb5a-4341-a33a-4cbd7979cbc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.856709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d39d038-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '4920a89170d16708d62ee51bd31c22a56617f64c1016a900f9cd3732f0853821'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.856709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d39e280-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '94438f6b9230883436001952926b8ebb616ff85eeaf28483f04f0666e1cc0c33'}]}, 'timestamp': '2025-11-23 09:33:10.896947', '_unique_id': '1f629a6eb73648618d4fa9e897b6b1b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.899 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c41bb5b9-4d38-4f07-b1e3-ff89a857af77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.899738', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d3a6368-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': 'edbc5a955e06c0fb66c857fc720203bf5b4ac0ef7f1c7443a8558a9aefd04827'}]}, 'timestamp': '2025-11-23 09:33:10.900240', '_unique_id': '732b6421fe134153812dbbf66373fe23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35628dd0-2b6b-4a90-ba79-680194dd0b87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.902563', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d3ad1f4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': '1ebbfc404f09c2facc63fd514c49f774b8518736bf58af7a99104bd278c13e80'}]}, 'timestamp': '2025-11-23 09:33:10.903177', '_unique_id': '002397e479d44e22a225301a102daff5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94f1d39f-bbe9-4f3a-b161-e7d44a00dc04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.905602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3b4724-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '389ed01823ee8802f5bb11b03a71ca1ab4ee26ad8bee439920c5325b16eb35e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.905602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3b58ea-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '0c8675bc0583cf40aef8c3cf51abb4e88178cd67af7f4b38a545a3be27baf069'}]}, 'timestamp': '2025-11-23 09:33:10.906496', '_unique_id': '5646189b69764c36a3e046e7bff90152'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '512da8e6-204d-4d0c-9052-db20bc71c4b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.909095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3bd248-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.98986707, 'message_signature': '2473c85e9b75ac185daa1c00a305f719c5ecb4717d03e5859812272c06d2953f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.909095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3be42c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.98986707, 'message_signature': 'd0718ef578ea498964c87d6c5f739e23f1f7a5bd40fa5546d07261a359de2fa9'}]}, 'timestamp': '2025-11-23 09:33:10.910092', '_unique_id': 'ed33f30b09ef4fa395d33e426e0df6b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7e74bb3-a769-454a-8781-1d05621ce344', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.912472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3c53c6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '27b1b22bfdf97688de5a19a4a1ab339780e351d14d26963af28f7b6891a56b2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.912472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3c658c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '464145d90ba295fabc0b9ff52cdac92753fb296b120ab01d77ed9bd6aa750801'}]}, 'timestamp': '2025-11-23 09:33:10.913375', '_unique_id': 'ea8b107c6a9944618f846f3d2285fb8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6aa30da6-2993-43e7-a817-cd51230e1265', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.915690', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3cd382-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': 'b417896902741660761cde4d2ecb187535931aa0d958d27cd8141272ded09cb8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.915690', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3ce430-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '5f36042674b98b69be9d05de9aeb9d9227b33eb6d0f9d0304da6064dcf402ccc'}]}, 'timestamp': '2025-11-23 09:33:10.916613', '_unique_id': '1f6ccc167f6c4c709c2c2b4ab1fb34a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 51880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89e09ffc-ec8c-4e65-a04d-ba6177b93523', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51880000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:33:10.918879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6d3d5032-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.03119555, 'message_signature': 'e2323d2828d41b6c16667f1ec64d3f67f99c461ae72921a6c14366c86d68680e'}]}, 'timestamp': '2025-11-23 09:33:10.919465', '_unique_id': '79b0b21655954f888ede464fd5561ab8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2d6f698-c558-4016-baea-9860a15b30cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.921719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3dbe3c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '38468a8d859c96487ff4a838872843675224fedcdab719e4a49546a02be56a89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.921719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3dcf8a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': 'd96749c008ac84ce2745fb7b63812ea7ae3f0121db63ec403c914f3e18d11184'}]}, 'timestamp': '2025-11-23 09:33:10.922650', '_unique_id': 'a9d684dca6bc430b8def8b96509a488a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b922d006-71f9-4f83-8c1e-f5bbb4fe822c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.925167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3e43de-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.98986707, 'message_signature': '4434f77996159f80526eb263e13c278eec92396e0c37ba1d1840b2c31ad525c9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.925167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3e57e8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.98986707, 'message_signature': 'b89aa41a3eaea08ff3efb39470ca1cab92da2066f985d1745ded73a61db0e792'}]}, 'timestamp': '2025-11-23 09:33:10.926151', '_unique_id': '5d70b2922e464f94ae2fb63b3387f6f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c77c31e7-7a89-4f88-9abb-01b9de328e4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.928585', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d3ec9a8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': 'dea7b1899bc707dce2a78a318d02c2bf90be10e19d4e1c40fa03ac037c6d074e'}]}, 'timestamp': '2025-11-23 09:33:10.929116', '_unique_id': '052f84f0407a489a9bd4d245b388dc4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34038bc9-9ee0-4015-9f48-3adc4e76d4b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.930961', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d3f2254-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': '28f196e87313fe011adc3a79840f9af3577f735d37179df96c2ebabcb9d8b021'}]}, 'timestamp': '2025-11-23 09:33:10.931253', '_unique_id': 'db29bf1fad734f859eabdf1eb10d98d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '842a2884-9b2d-4d7f-830b-41e28508ae49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.932622', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d3f635e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': 'a97e158e596353176f28dac870b0e22740a7c0bba39681a1dccdff1097213a39'}]}, 'timestamp': '2025-11-23 09:33:10.932942', '_unique_id': '4873746dbe284e70aa6a2532b484aa3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.934 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e125d7ad-7fbf-4eb5-98f7-b5e1de1d9583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:33:10.934529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6d3fadd2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': '1a8412aa6da46b5c32b426dfdbe69cef319d1b0c077b449260f9869929384b95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:33:10.934529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6d3fb8fe-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10334.034341019, 'message_signature': 'b0fc4fc2457ffc20f2c4f7bf3ac7cce8f6fb737f715e3a758ce7ba3604b26621'}]}, 'timestamp': '2025-11-23 09:33:10.935093', '_unique_id': '3bcdb952cfbb439fbb6864db9ad64df4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.936 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e244aac6-e1df-4860-8ee1-03f62a4aaa3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.936451', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d3ff904-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': '7931f8522b5e9182fa819131278f3366a8056035650b4b2edde67bc6b6d13eee'}]}, 'timestamp': '2025-11-23 09:33:10.936752', '_unique_id': 'f8a9057dda77431d85b72537438da8aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.938 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfc95981-6ee4-4d28-a77d-b2cc1a5cf81a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:33:10.938129', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '6d403a72-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10333.980584328, 'message_signature': '054882bb14ca23ac83425339f7f6029193868169465cac15067f9249c1ce24ef'}]}, 'timestamp': '2025-11-23 09:33:10.938428', '_unique_id': 'b6d039b134114fba9611279f1854ce4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:33:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:33:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:33:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:33:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.179 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.179 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.179 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.179 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:33:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.616 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.617 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.617 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:33:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:11.617 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:33:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13079 DF PROTO=TCP SPT=48100 DPT=9882 SEQ=1763731378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2CEB70000000001030307) 
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.001 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.015 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.015 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.015 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.016 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.016 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.017 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.017 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.018 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.018 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.018 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.035 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.035 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.036 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.036 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.037 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:33:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.502 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:33:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully.
Nov 23 09:33:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully.
Nov 23 09:33:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.567 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.567 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.806 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.808 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12445MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.808 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.808 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.862 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.864 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.864 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:33:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:12.912 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:33:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully.
Nov 23 09:33:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:13.363 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:33:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:13.369 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:33:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:13.383 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:33:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:13.385 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:33:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:13.386 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:33:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:33:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:33:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully.
Nov 23 09:33:14 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:14 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:14 np0005532585.localdomain sshd[242967]: Invalid user a from 178.163.66.191 port 41040
Nov 23 09:33:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:14.886 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13081 DF PROTO=TCP SPT=48100 DPT=9882 SEQ=1763731378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2DAA00000000001030307) 
Nov 23 09:33:15 np0005532585.localdomain sshd[242967]: Connection closed by invalid user a 178.163.66.191 port 41040 [preauth]
Nov 23 09:33:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:33:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:33:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d5e4e29a67d68eb763c810cf5dda69eb1a37e523f562c6e80552f33c1fd3c8b-merged.mount: Deactivated successfully.
Nov 23 09:33:16 np0005532585.localdomain podman[243052]: 2025-11-23 09:33:16.305727427 +0000 UTC m=+0.073045708 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:33:16 np0005532585.localdomain podman[243052]: 2025-11-23 09:33:16.335160582 +0000 UTC m=+0.102478873 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 23 09:33:16 np0005532585.localdomain podman[243052]: unhealthy
Nov 23 09:33:17 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:33:17 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:33:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40590 DF PROTO=TCP SPT=48436 DPT=9102 SEQ=1509193721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2E6210000000001030307) 
Nov 23 09:33:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:19.889 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:19.891 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:19.892 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:19.892 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:19.914 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:19 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:19.915 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87-merged.mount: Deactivated successfully.
Nov 23 09:33:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:33:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:33:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:33:22 np0005532585.localdomain podman[243071]: 2025-11-23 09:33:22.007593877 +0000 UTC m=+0.068546686 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:33:22 np0005532585.localdomain podman[243071]: 2025-11-23 09:33:22.048133192 +0000 UTC m=+0.109086001 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:33:22 np0005532585.localdomain podman[243071]: unhealthy
Nov 23 09:33:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:33:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01-merged.mount: Deactivated successfully.
Nov 23 09:33:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:33:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:33:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50279 DF PROTO=TCP SPT=60588 DPT=9102 SEQ=788175603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D2FBE10000000001030307) 
Nov 23 09:33:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89-merged.mount: Deactivated successfully.
Nov 23 09:33:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:33:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 23 09:33:24 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:33:24 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:33:24 np0005532585.localdomain podman[243094]: 2025-11-23 09:33:24.226684489 +0000 UTC m=+1.288755980 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 23 09:33:24 np0005532585.localdomain podman[243095]: 2025-11-23 09:33:24.305245633 +0000 UTC m=+1.365104997 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Nov 23 09:33:24 np0005532585.localdomain podman[243094]: 2025-11-23 09:33:24.313320848 +0000 UTC m=+1.375392379 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 23 09:33:24 np0005532585.localdomain podman[243095]: 2025-11-23 09:33:24.367750657 +0000 UTC m=+1.427610111 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 23 09:33:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:24.915 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:24.916 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:24.916 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:24.916 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:24.917 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:24 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:24.919 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:33:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:33:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:33:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:33:26 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:33:26 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:33:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:33:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:33:27 np0005532585.localdomain systemd[1]: tmp-crun.tO2xq0.mount: Deactivated successfully.
Nov 23 09:33:27 np0005532585.localdomain podman[243138]: 2025-11-23 09:33:27.038771174 +0000 UTC m=+0.091396285 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 09:33:27 np0005532585.localdomain podman[243138]: 2025-11-23 09:33:27.069048156 +0000 UTC m=+0.121673217 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:33:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13083 DF PROTO=TCP SPT=48100 DPT=9882 SEQ=1763731378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D30A210000000001030307) 
Nov 23 09:33:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:27 np0005532585.localdomain podman[240668]: time="2025-11-23T09:33:27Z" level=error msg="Getting root fs size for \"42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Nov 23 09:33:27 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:33:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:33:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43336 DF PROTO=TCP SPT=56222 DPT=9101 SEQ=2085061718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D314BA0000000001030307) 
Nov 23 09:33:29 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:29 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:29 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:29.921 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:29.923 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:29.923 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:29.923 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:29.924 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:29 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:29.926 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 23 09:33:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0-merged.mount: Deactivated successfully.
Nov 23 09:33:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59136 DF PROTO=TCP SPT=58018 DPT=9105 SEQ=2581158530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3156B0000000001030307) 
Nov 23 09:33:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:33:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d5e4e29a67d68eb763c810cf5dda69eb1a37e523f562c6e80552f33c1fd3c8b-merged.mount: Deactivated successfully.
Nov 23 09:33:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:33:32 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:33:32 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:32 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:33:32 np0005532585.localdomain podman[243156]: 2025-11-23 09:33:32.887211165 +0000 UTC m=+0.086784585 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:33:32 np0005532585.localdomain podman[243156]: 2025-11-23 09:33:32.899400176 +0000 UTC m=+0.098973586 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:33:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43338 DF PROTO=TCP SPT=56222 DPT=9101 SEQ=2085061718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D320E00000000001030307) 
Nov 23 09:33:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:33:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:34 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:33:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:34.927 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:34.929 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:34.930 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:34.930 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:34.970 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:34 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:34.971 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:34 np0005532585.localdomain podman[243177]: 2025-11-23 09:33:34.978016047 +0000 UTC m=+0.332387186 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:33:35 np0005532585.localdomain podman[243177]: 2025-11-23 09:33:35.012729625 +0000 UTC m=+0.367100764 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 23 09:33:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 23 09:33:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87-merged.mount: Deactivated successfully.
Nov 23 09:33:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87-merged.mount: Deactivated successfully.
Nov 23 09:33:36 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:33:36 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11044 DF PROTO=TCP SPT=46110 DPT=9100 SEQ=3886817616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D32E200000000001030307) 
Nov 23 09:33:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:36 np0005532585.localdomain podman[240668]: time="2025-11-23T09:33:36Z" level=error msg="Getting root fs size for \"4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 23 09:33:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:33:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:33:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01-merged.mount: Deactivated successfully.
Nov 23 09:33:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01-merged.mount: Deactivated successfully.
Nov 23 09:33:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64421 DF PROTO=TCP SPT=57258 DPT=9100 SEQ=949974710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D339A00000000001030307) 
Nov 23 09:33:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:33:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617-merged.mount: Deactivated successfully.
Nov 23 09:33:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:39.972 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:39.974 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:39.974 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:39 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:39.974 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:40.006 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:40.007 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:33:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:33:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 23 09:33:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39272 DF PROTO=TCP SPT=41112 DPT=9882 SEQ=2995713946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D343E60000000001030307) 
Nov 23 09:33:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 23 09:33:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:33:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b-merged.mount: Deactivated successfully.
Nov 23 09:33:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:33:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:33:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:33:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:33:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:33:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:33:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:44 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:44 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39274 DF PROTO=TCP SPT=41112 DPT=9882 SEQ=2995713946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D34FE00000000001030307) 
Nov 23 09:33:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:45.008 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:45.009 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:45.010 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:45.010 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:45.013 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:45.013 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:33:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:33:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:47 np0005532585.localdomain podman[243196]: 2025-11-23 09:33:47.278815225 +0000 UTC m=+0.085971550 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:33:47 np0005532585.localdomain podman[243196]: 2025-11-23 09:33:47.311094408 +0000 UTC m=+0.118250743 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:33:47 np0005532585.localdomain podman[243196]: unhealthy
Nov 23 09:33:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 23 09:33:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0-merged.mount: Deactivated successfully.
Nov 23 09:33:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50281 DF PROTO=TCP SPT=60588 DPT=9102 SEQ=788175603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D35C200000000001030307) 
Nov 23 09:33:48 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:33:48 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:33:48 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0-merged.mount: Deactivated successfully.
Nov 23 09:33:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.014 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.015 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.015 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.015 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.015 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.016 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:50.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:50 np0005532585.localdomain podman[240668]: time="2025-11-23T09:33:50Z" level=error msg="Getting root fs size for \"53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8\": getting diffsize of layer \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\" and its parent \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\": unmounting layer f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958: replacing mount point \"/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged\": device or resource busy"
Nov 23 09:33:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:33:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:33:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398-merged.mount: Deactivated successfully.
Nov 23 09:33:52 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13622 DF PROTO=TCP SPT=43264 DPT=9102 SEQ=3852245901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D370E10000000001030307) 
Nov 23 09:33:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:33:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:33:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:33:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:33:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:54 np0005532585.localdomain podman[243214]: 2025-11-23 09:33:54.550260444 +0000 UTC m=+0.098340766 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:33:54 np0005532585.localdomain podman[243214]: 2025-11-23 09:33:54.584300671 +0000 UTC m=+0.132380973 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:33:54 np0005532585.localdomain podman[243214]: unhealthy
Nov 23 09:33:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:55.018 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:55.020 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:33:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:55.021 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:33:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:55.021 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:55.050 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:33:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:33:55.051 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:33:55 np0005532585.localdomain sudo[243237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:33:55 np0005532585.localdomain sudo[243237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:33:55 np0005532585.localdomain sudo[243237]: pam_unix(sudo:session): session closed for user root
Nov 23 09:33:55 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:33:55 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:33:55 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:55 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:55 np0005532585.localdomain sudo[243255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:33:55 np0005532585.localdomain sudo[243255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:33:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:33:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-503e987379308b9e6b9946670c4ac6382bcf032235dd53dd51b9045ac75aedc7-merged.mount: Deactivated successfully.
Nov 23 09:33:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:33:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:33:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:33:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:33:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:33:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39276 DF PROTO=TCP SPT=41112 DPT=9882 SEQ=2995713946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D380200000000001030307) 
Nov 23 09:33:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:33:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:33:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617-merged.mount: Deactivated successfully.
Nov 23 09:33:58 np0005532585.localdomain podman[243286]: 2025-11-23 09:33:58.16467997 +0000 UTC m=+1.637923317 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 23 09:33:58 np0005532585.localdomain podman[243286]: 2025-11-23 09:33:58.19521352 +0000 UTC m=+1.668456857 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 23 09:33:58 np0005532585.localdomain podman[243287]: 2025-11-23 09:33:58.244536093 +0000 UTC m=+1.717303815 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 23 09:33:58 np0005532585.localdomain podman[243287]: 2025-11-23 09:33:58.256173847 +0000 UTC m=+1.728941549 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:33:58 np0005532585.localdomain podman[243307]: 2025-11-23 09:33:58.305629503 +0000 UTC m=+0.367598559 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Nov 23 09:33:58 np0005532585.localdomain podman[243307]: 2025-11-23 09:33:58.334469073 +0000 UTC m=+0.396438119 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Nov 23 09:33:58 np0005532585.localdomain sudo[243255]: pam_unix(sudo:session): session closed for user root
Nov 23 09:33:58 np0005532585.localdomain podman[240668]: time="2025-11-23T09:33:58Z" level=error msg="Getting root fs size for \"4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014\": getting diffsize of layer \"d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 23 09:33:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain sudo[243368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:33:59 np0005532585.localdomain sudo[243368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:33:59 np0005532585.localdomain sudo[243368]: pam_unix(sudo:session): session closed for user root
Nov 23 09:33:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32072 DF PROTO=TCP SPT=59732 DPT=9101 SEQ=90601073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D389EA0000000001030307) 
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b-merged.mount: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:33:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b-merged.mount: Deactivated successfully.
Nov 23 09:33:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44931 DF PROTO=TCP SPT=47796 DPT=9105 SEQ=1683445164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D38A9B0000000001030307) 
Nov 23 09:34:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:00.052 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:00.054 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:00.055 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:34:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:00.055 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:00.056 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:00.057 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:34:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:34:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:34:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:34:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:34:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246-merged.mount: Deactivated successfully.
Nov 23 09:34:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:34:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:34:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 23 09:34:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44933 DF PROTO=TCP SPT=47796 DPT=9105 SEQ=1683445164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D396A00000000001030307) 
Nov 23 09:34:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 23 09:34:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:34:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:34:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:34:05 np0005532585.localdomain podman[243386]: 2025-11-23 09:34:05.002911153 +0000 UTC m=+0.057305837 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:34:05 np0005532585.localdomain podman[243386]: 2025-11-23 09:34:05.032706531 +0000 UTC m=+0.087101205 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:34:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:05.055 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:05 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:34:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:05.920 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:05.941 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:05.941 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:34:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:05.942 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.002 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.003 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.003 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.004 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:34:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41733 DF PROTO=TCP SPT=55048 DPT=9100 SEQ=2471495946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3A2210000000001030307) 
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.357 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.368 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.368 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.368 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.369 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.369 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.369 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.384 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.384 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.384 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.385 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.385 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:34:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:34:06 np0005532585.localdomain podman[243409]: 2025-11-23 09:34:06.517624045 +0000 UTC m=+0.077062019 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Nov 23 09:34:06 np0005532585.localdomain podman[243409]: 2025-11-23 09:34:06.528149986 +0000 UTC m=+0.087587980 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:34:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.834 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.914 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:34:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:06.915 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.135 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.137 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12464MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.138 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.138 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.202 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.202 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.203 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.242 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:34:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.700 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.706 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.717 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.720 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:34:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:07.721 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:34:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:07 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:07 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:07 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:34:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:08.068 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:08.069 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:08.069 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:08.069 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:34:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:08.070 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:34:08 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:08 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:34:09.245 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:34:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:34:09.246 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:34:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:34:09.247 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:34:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37076 DF PROTO=TCP SPT=58556 DPT=9100 SEQ=2531984057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3AEE00000000001030307) 
Nov 23 09:34:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 23 09:34:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:09 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:09 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:09 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.059 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.061 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.061 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.061 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.063 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.063 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:10.066 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4af87402ffde9ee67f7ec4afd129c011169c73f7a6b2401b07ae2b65c68e82e0-merged.mount: Deactivated successfully.
Nov 23 09:34:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:34:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47284 DF PROTO=TCP SPT=51320 DPT=9882 SEQ=3933650639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3B9160000000001030307) 
Nov 23 09:34:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:34:12 np0005532585.localdomain sshd[230901]: Received disconnect from 192.168.122.30 port 37274:11: disconnected by user
Nov 23 09:34:12 np0005532585.localdomain sshd[230901]: Disconnected from user zuul 192.168.122.30 port 37274
Nov 23 09:34:12 np0005532585.localdomain sshd[230898]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:34:12 np0005532585.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Nov 23 09:34:12 np0005532585.localdomain systemd[1]: session-55.scope: Consumed 56.874s CPU time.
Nov 23 09:34:12 np0005532585.localdomain systemd-logind[761]: Session 55 logged out. Waiting for processes to exit.
Nov 23 09:34:12 np0005532585.localdomain systemd-logind[761]: Removed session 55.
Nov 23 09:34:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad-merged.mount: Deactivated successfully.
Nov 23 09:34:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398-merged.mount: Deactivated successfully.
Nov 23 09:34:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398-merged.mount: Deactivated successfully.
Nov 23 09:34:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:34:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47286 DF PROTO=TCP SPT=51320 DPT=9882 SEQ=3933650639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3C5210000000001030307) 
Nov 23 09:34:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:15.064 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:15.071 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 23 09:34:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-503e987379308b9e6b9946670c4ac6382bcf032235dd53dd51b9045ac75aedc7-merged.mount: Deactivated successfully.
Nov 23 09:34:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:34:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:34:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:34:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13624 DF PROTO=TCP SPT=43264 DPT=9102 SEQ=3852245901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3D2200000000001030307) 
Nov 23 09:34:18 np0005532585.localdomain podman[243471]: 2025-11-23 09:34:18.34165146 +0000 UTC m=+0.089028313 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:34:18 np0005532585.localdomain podman[243471]: 2025-11-23 09:34:18.345686702 +0000 UTC m=+0.093063585 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 23 09:34:18 np0005532585.localdomain podman[243471]: unhealthy
Nov 23 09:34:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:34:19 np0005532585.localdomain sshd[243489]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:34:19 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:34:19 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:34:19 np0005532585.localdomain sshd[243489]: Accepted publickey for zuul from 192.168.122.30 port 49346 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:34:19 np0005532585.localdomain systemd-logind[761]: New session 56 of user zuul.
Nov 23 09:34:19 np0005532585.localdomain systemd[1]: Started Session 56 of User zuul.
Nov 23 09:34:19 np0005532585.localdomain sshd[243489]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:34:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246-merged.mount: Deactivated successfully.
Nov 23 09:34:19 np0005532585.localdomain sudo[243583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gydsqhxyfmedjvvroohygnvjocjoepsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890459.488686-2568-93680195006571/AnsiballZ_podman_container_info.py
Nov 23 09:34:19 np0005532585.localdomain sudo[243583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:19 np0005532585.localdomain python3.9[243585]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 23 09:34:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:20.067 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:20.080 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:34:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 23 09:34:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 23 09:34:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22-merged.mount: Deactivated successfully.
Nov 23 09:34:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22-merged.mount: Deactivated successfully.
Nov 23 09:34:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50971 DF PROTO=TCP SPT=33012 DPT=9102 SEQ=742435514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3E6200000000001030307) 
Nov 23 09:34:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:34:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:34:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:24 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:24 np0005532585.localdomain sudo[243583]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:24 np0005532585.localdomain sudo[243706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwnrsogyiryciskuongwwipbpzjdnilt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890464.5602372-2579-61399426353508/AnsiballZ_podman_container_exec.py
Nov 23 09:34:24 np0005532585.localdomain sudo[243706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:25 np0005532585.localdomain python3.9[243708]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:25.071 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:25.082 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:25 np0005532585.localdomain systemd[1]: Started libpod-conmon-2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.scope.
Nov 23 09:34:25 np0005532585.localdomain podman[243709]: 2025-11-23 09:34:25.159607194 +0000 UTC m=+0.114605252 container exec 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:34:25 np0005532585.localdomain podman[243709]: 2025-11-23 09:34:25.190849906 +0000 UTC m=+0.145847954 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 23 09:34:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:34:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:26 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:26 np0005532585.localdomain sudo[243706]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:26 np0005532585.localdomain podman[243738]: 2025-11-23 09:34:26.519967865 +0000 UTC m=+1.180136231 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:34:26 np0005532585.localdomain podman[243738]: 2025-11-23 09:34:26.558384565 +0000 UTC m=+1.218553052 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:34:26 np0005532585.localdomain podman[243738]: unhealthy
Nov 23 09:34:26 np0005532585.localdomain sudo[243868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-astaszstuolxdlxonndyannbijivtpqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890466.5893557-2587-184558374263030/AnsiballZ_podman_container_exec.py
Nov 23 09:34:26 np0005532585.localdomain sudo[243868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:26 np0005532585.localdomain python3.9[243870]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47288 DF PROTO=TCP SPT=51320 DPT=9882 SEQ=3933650639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3F6200000000001030307) 
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: libpod-conmon-2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.scope: Deactivated successfully.
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:34:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.scope.
Nov 23 09:34:27 np0005532585.localdomain podman[243871]: 2025-11-23 09:34:27.832422823 +0000 UTC m=+0.824042611 container exec 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:34:27 np0005532585.localdomain podman[243871]: 2025-11-23 09:34:27.868228411 +0000 UTC m=+0.859848149 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:34:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:28 np0005532585.localdomain podman[240668]: time="2025-11-23T09:34:28Z" level=error msg="Getting root fs size for \"59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Nov 23 09:34:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:28 np0005532585.localdomain sudo[243868]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:28 np0005532585.localdomain sudo[244010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pemtwuimzbbochngwdltlmtvnwfxmuza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890468.6756752-2595-242912038605705/AnsiballZ_file.py
Nov 23 09:34:28 np0005532585.localdomain sudo[244010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:29 np0005532585.localdomain python3.9[244012]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:34:29 np0005532585.localdomain sudo[244010]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:29 np0005532585.localdomain sudo[244120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehxerrealhhdynonhmdbwvnhrwtewxmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890469.392414-2604-93842681902759/AnsiballZ_podman_container_info.py
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:34:29 np0005532585.localdomain sudo[244120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:34:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18940 DF PROTO=TCP SPT=58856 DPT=9101 SEQ=3066547842 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3FF1A0000000001030307) 
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4af87402ffde9ee67f7ec4afd129c011169c73f7a6b2401b07ae2b65c68e82e0-merged.mount: Deactivated successfully.
Nov 23 09:34:29 np0005532585.localdomain python3.9[244123]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 23 09:34:29 np0005532585.localdomain systemd[1]: libpod-conmon-2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.scope: Deactivated successfully.
Nov 23 09:34:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53372 DF PROTO=TCP SPT=60402 DPT=9105 SEQ=2355464562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D3FFCB0000000001030307) 
Nov 23 09:34:30 np0005532585.localdomain podman[244122]: 2025-11-23 09:34:30.006403651 +0000 UTC m=+0.283599934 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:34:30 np0005532585.localdomain podman[244125]: 2025-11-23 09:34:30.060488629 +0000 UTC m=+0.331254824 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 23 09:34:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:30.073 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:30 np0005532585.localdomain podman[244122]: 2025-11-23 09:34:30.079159311 +0000 UTC m=+0.356355564 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 09:34:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:30.089 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:30 np0005532585.localdomain podman[244125]: 2025-11-23 09:34:30.122437808 +0000 UTC m=+0.393204033 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:34:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:34:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:31 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:34:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:32 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:34:32 np0005532585.localdomain podman[244124]: 2025-11-23 09:34:32.65349431 +0000 UTC m=+2.930164227 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 23 09:34:32 np0005532585.localdomain podman[244124]: 2025-11-23 09:34:32.687359198 +0000 UTC m=+2.964029085 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 23 09:34:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18942 DF PROTO=TCP SPT=58856 DPT=9101 SEQ=3066547842 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D40B200000000001030307) 
Nov 23 09:34:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:34:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:34:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:34:34 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:34:34 np0005532585.localdomain sudo[244120]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.100 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.102 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.103 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5013 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.103 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.106 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.106 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:35.109 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:35 np0005532585.localdomain sudo[244305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzlvnnkoqygdvpxtpdvmjvlorbdkhfyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890474.9966698-2612-12415663010024/AnsiballZ_podman_container_exec.py
Nov 23 09:34:35 np0005532585.localdomain sudo[244305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:35 np0005532585.localdomain python3.9[244307]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:35 np0005532585.localdomain systemd[1]: Started libpod-conmon-9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.scope.
Nov 23 09:34:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:34:35 np0005532585.localdomain podman[244308]: 2025-11-23 09:34:35.599959995 +0000 UTC m=+0.098631924 container exec 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 23 09:34:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad-merged.mount: Deactivated successfully.
Nov 23 09:34:35 np0005532585.localdomain podman[244308]: 2025-11-23 09:34:35.641283142 +0000 UTC m=+0.139955021 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 23 09:34:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:34:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64424 DF PROTO=TCP SPT=57258 DPT=9100 SEQ=949974710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D418210000000001030307) 
Nov 23 09:34:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:34:36 np0005532585.localdomain sudo[244305]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:36 np0005532585.localdomain podman[244337]: 2025-11-23 09:34:36.628723589 +0000 UTC m=+0.683118079 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:34:36 np0005532585.localdomain podman[244337]: 2025-11-23 09:34:36.641348666 +0000 UTC m=+0.695743126 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:34:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:34:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:34:37 np0005532585.localdomain sudo[244469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kggylkeotsjkczxhvvashbgebfsozjxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890477.3888638-2620-252572065986343/AnsiballZ_podman_container_exec.py
Nov 23 09:34:37 np0005532585.localdomain sudo[244469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:37 np0005532585.localdomain python3.9[244471]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:34:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:39 np0005532585.localdomain systemd[1]: libpod-conmon-9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.scope: Deactivated successfully.
Nov 23 09:34:39 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:34:39 np0005532585.localdomain systemd[1]: Started libpod-conmon-9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.scope.
Nov 23 09:34:39 np0005532585.localdomain podman[244472]: 2025-11-23 09:34:39.187829701 +0000 UTC m=+1.296901183 container exec 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:34:39 np0005532585.localdomain podman[244472]: 2025-11-23 09:34:39.223610288 +0000 UTC m=+1.332681790 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:34:39 np0005532585.localdomain podman[244483]: 2025-11-23 09:34:39.269985629 +0000 UTC m=+1.323405915 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:34:39 np0005532585.localdomain podman[244483]: 2025-11-23 09:34:39.285259788 +0000 UTC m=+1.338680064 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:34:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28681 DF PROTO=TCP SPT=50622 DPT=9100 SEQ=450288367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D424200000000001030307) 
Nov 23 09:34:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:34:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.135 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.137 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.137 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.138 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.141 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.141 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:40.144 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:34:41 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:34:41 np0005532585.localdomain sudo[244469]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:41 np0005532585.localdomain sudo[244629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aogmleptvzyrwhwmusvgoqbhkfmrefls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890481.3028255-2628-200527279752391/AnsiballZ_file.py
Nov 23 09:34:41 np0005532585.localdomain sudo[244629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:34:41 np0005532585.localdomain python3.9[244631]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:34:41 np0005532585.localdomain sudo[244629]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48201 DF PROTO=TCP SPT=56130 DPT=9882 SEQ=3811039386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D42E460000000001030307) 
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: libpod-conmon-9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.scope: Deactivated successfully.
Nov 23 09:34:42 np0005532585.localdomain sudo[244739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdjaybfnwarvohdpiuqftpgyztnrdyjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890482.0277493-2637-250495399580911/AnsiballZ_podman_container_info.py
Nov 23 09:34:42 np0005532585.localdomain sudo[244739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:42 np0005532585.localdomain python3.9[244741]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:34:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully.
Nov 23 09:34:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48203 DF PROTO=TCP SPT=56130 DPT=9882 SEQ=3811039386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D43A600000000001030307) 
Nov 23 09:34:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:45.143 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:45.148 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:46 np0005532585.localdomain sudo[244739]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:46 np0005532585.localdomain sudo[244859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnezrgcamtyciuirqpxicxdljgvsesiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890486.768982-2645-23355256906535/AnsiballZ_podman_container_exec.py
Nov 23 09:34:46 np0005532585.localdomain sudo[244859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:47 np0005532585.localdomain python3.9[244861]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:47 np0005532585.localdomain systemd[1]: Started libpod-conmon-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.scope.
Nov 23 09:34:47 np0005532585.localdomain podman[244862]: 2025-11-23 09:34:47.272755932 +0000 UTC m=+0.095583771 container exec 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Nov 23 09:34:47 np0005532585.localdomain podman[244862]: 2025-11-23 09:34:47.304208356 +0000 UTC m=+0.127036195 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:34:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50973 DF PROTO=TCP SPT=33012 DPT=9102 SEQ=742435514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D446200000000001030307) 
Nov 23 09:34:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:34:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22-merged.mount: Deactivated successfully.
Nov 23 09:34:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22-merged.mount: Deactivated successfully.
Nov 23 09:34:49 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:49 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:49 np0005532585.localdomain sudo[244859]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:34:49 np0005532585.localdomain sudo[245011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgyqyczemzgrayemmhunkubdfkfohyih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890489.3127706-2653-24416914521744/AnsiballZ_podman_container_exec.py
Nov 23 09:34:49 np0005532585.localdomain sudo[245011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:49 np0005532585.localdomain python3.9[245013]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:50.173 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:50.175 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:50.176 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:34:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:50.176 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:50.179 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:34:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:50.179 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:34:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:51 np0005532585.localdomain systemd[1]: libpod-conmon-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.scope: Deactivated successfully.
Nov 23 09:34:51 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:51 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:51 np0005532585.localdomain systemd[1]: Started libpod-conmon-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.scope.
Nov 23 09:34:51 np0005532585.localdomain podman[245014]: 2025-11-23 09:34:51.450812768 +0000 UTC m=+1.539618113 container exec 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 23 09:34:51 np0005532585.localdomain podman[244957]: 2025-11-23 09:34:51.467222841 +0000 UTC m=+2.015913433 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 09:34:51 np0005532585.localdomain podman[245014]: 2025-11-23 09:34:51.486261595 +0000 UTC m=+1.575066950 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd)
Nov 23 09:34:51 np0005532585.localdomain podman[244957]: 2025-11-23 09:34:51.503343688 +0000 UTC m=+2.052034340 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 23 09:34:51 np0005532585.localdomain podman[244957]: unhealthy
Nov 23 09:34:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36016 DF PROTO=TCP SPT=32894 DPT=9102 SEQ=1535383150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D45B610000000001030307) 
Nov 23 09:34:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:34:53 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:53 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:34:53 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:34:53 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'.
Nov 23 09:34:53 np0005532585.localdomain sudo[245011]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:34:54 np0005532585.localdomain systemd[1]: libpod-conmon-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.scope: Deactivated successfully.
Nov 23 09:34:54 np0005532585.localdomain sudo[245158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymmnuwhzagyaqroyzkkvrhzmewptjjdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890494.2860577-2661-257121929544050/AnsiballZ_file.py
Nov 23 09:34:54 np0005532585.localdomain sudo[245158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:54 np0005532585.localdomain python3.9[245160]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:34:54 np0005532585.localdomain sudo[245158]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:34:55.180 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:34:55 np0005532585.localdomain sudo[245268]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hshwypbbkepyscqurpknwsyusjopszii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890494.9877722-2670-259209815448462/AnsiballZ_podman_container_info.py
Nov 23 09:34:55 np0005532585.localdomain sudo[245268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:34:55 np0005532585.localdomain python3.9[245270]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 23 09:34:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:34:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473-merged.mount: Deactivated successfully.
Nov 23 09:34:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473-merged.mount: Deactivated successfully.
Nov 23 09:34:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48205 DF PROTO=TCP SPT=56130 DPT=9882 SEQ=3811039386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D46A200000000001030307) 
Nov 23 09:34:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:34:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:34:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:34:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:34:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:34:58 np0005532585.localdomain sudo[245268]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:58 np0005532585.localdomain podman[245284]: 2025-11-23 09:34:58.728761443 +0000 UTC m=+0.844159506 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:34:58 np0005532585.localdomain podman[245284]: 2025-11-23 09:34:58.739033888 +0000 UTC m=+0.854431951 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:34:58 np0005532585.localdomain podman[245284]: unhealthy
Nov 23 09:34:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:34:59 np0005532585.localdomain sudo[245414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xunmuxzsxgpokltnbuvrtdcgftnajvcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890498.8635156-2678-182756537955801/AnsiballZ_podman_container_exec.py
Nov 23 09:34:59 np0005532585.localdomain sudo[245414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:34:59 np0005532585.localdomain python3.9[245416]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:34:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48707 DF PROTO=TCP SPT=43318 DPT=9101 SEQ=3687178667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4744A0000000001030307) 
Nov 23 09:34:59 np0005532585.localdomain sudo[245428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:34:59 np0005532585.localdomain sudo[245428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:34:59 np0005532585.localdomain sudo[245428]: pam_unix(sudo:session): session closed for user root
Nov 23 09:34:59 np0005532585.localdomain sudo[245446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:34:59 np0005532585.localdomain sudo[245446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:34:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17299 DF PROTO=TCP SPT=53474 DPT=9105 SEQ=1448009210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D474FB0000000001030307) 
Nov 23 09:35:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:00.182 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:00.184 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:00.184 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:35:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:00.184 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:00.205 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:00.205 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:35:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:35:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:35:00 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:35:00 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:35:00 np0005532585.localdomain systemd[1]: Started libpod-conmon-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope.
Nov 23 09:35:00 np0005532585.localdomain podman[245417]: 2025-11-23 09:35:00.978080009 +0000 UTC m=+1.687991521 container exec db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:35:00 np0005532585.localdomain podman[245417]: 2025-11-23 09:35:00.986229899 +0000 UTC m=+1.696141411 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 23 09:35:01 np0005532585.localdomain sudo[245446]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully.
Nov 23 09:35:02 np0005532585.localdomain sudo[245414]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:35:02 np0005532585.localdomain systemd[1]: tmp-crun.JzmGD7.mount: Deactivated successfully.
Nov 23 09:35:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:02.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:02.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 09:35:02 np0005532585.localdomain podman[245514]: 2025-11-23 09:35:02.728069131 +0000 UTC m=+0.620964606 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 23 09:35:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:02.736 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 09:35:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:02.737 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:02.737 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 09:35:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:02.750 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:02 np0005532585.localdomain podman[245514]: 2025-11-23 09:35:02.796328062 +0000 UTC m=+0.689223508 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:35:02 np0005532585.localdomain sudo[245564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:35:02 np0005532585.localdomain sudo[245564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:35:02 np0005532585.localdomain sudo[245564]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:02 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48709 DF PROTO=TCP SPT=43318 DPT=9101 SEQ=3687178667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D480610000000001030307) 
Nov 23 09:35:03 np0005532585.localdomain sudo[245677]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iweokfbrihvxgkdrezmegmdwjcqkyayv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890502.8101404-2686-188542093539683/AnsiballZ_podman_container_exec.py
Nov 23 09:35:03 np0005532585.localdomain sudo[245677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:03 np0005532585.localdomain python3.9[245679]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:35:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:03.760 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:03.785 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:35:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:03.785 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:35:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:03.786 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:35:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:03.786 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:35:03 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:03.787 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.239 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.292 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.292 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.493 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.494 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12293MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.495 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.495 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.572 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.573 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.573 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:35:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:04.608 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:35:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:35:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.074 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.081 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.093 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.094 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.094 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:35:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:05 np0005532585.localdomain systemd[1]: libpod-conmon-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope: Deactivated successfully.
Nov 23 09:35:05 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.206 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.208 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.208 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.209 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.239 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:05 np0005532585.localdomain podman[245527]: 2025-11-23 09:35:05.241688488 +0000 UTC m=+2.578053703 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 09:35:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:05.240 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:05 np0005532585.localdomain systemd[1]: Started libpod-conmon-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope.
Nov 23 09:35:05 np0005532585.localdomain podman[245680]: 2025-11-23 09:35:05.299613214 +0000 UTC m=+2.000913693 container exec db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 23 09:35:05 np0005532585.localdomain podman[245527]: 2025-11-23 09:35:05.307737923 +0000 UTC m=+2.644103238 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7)
Nov 23 09:35:05 np0005532585.localdomain podman[245734]: 2025-11-23 09:35:05.378971576 +0000 UTC m=+0.468895063 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 23 09:35:05 np0005532585.localdomain podman[245680]: 2025-11-23 09:35:05.383355561 +0000 UTC m=+2.084656040 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:35:05 np0005532585.localdomain podman[245734]: 2025-11-23 09:35:05.413384522 +0000 UTC m=+0.503307969 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 09:35:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:35:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.050 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.050 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.051 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:35:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37079 DF PROTO=TCP SPT=58556 DPT=9100 SEQ=2531984057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D48E200000000001030307) 
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.457 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.458 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.458 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.458 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:35:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.969 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.981 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.981 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.982 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.982 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:06.982 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:07 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:35:07 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:35:07 np0005532585.localdomain sudo[245677]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:35:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:07.644 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:07.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:07.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:07 np0005532585.localdomain sudo[245888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgeulrczjrheurcsftsmnxzmvtuckdlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890507.4837794-2694-192580950886727/AnsiballZ_file.py
Nov 23 09:35:07 np0005532585.localdomain sudo[245888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:07 np0005532585.localdomain python3.9[245890]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:35:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:35:07 np0005532585.localdomain sudo[245888]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:08 np0005532585.localdomain sudo[245998]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-setgbragtdgsxycuhuiesspxkoxcfieb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890508.1453733-2703-215214560903781/AnsiballZ_podman_container_info.py
Nov 23 09:35:08 np0005532585.localdomain sudo[245998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:08 np0005532585.localdomain systemd[1]: libpod-conmon-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope: Deactivated successfully.
Nov 23 09:35:08 np0005532585.localdomain python3.9[246000]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 23 09:35:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:08.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:35:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:08.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:35:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:35:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26810 DF PROTO=TCP SPT=57838 DPT=9100 SEQ=4169834911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D499200000000001030307) 
Nov 23 09:35:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:35:09.247 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:35:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:35:09.248 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:35:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:35:09.249 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.241 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.277 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.277 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.277 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.277 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.278 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:10.278 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 23 09:35:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully.
Nov 23 09:35:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully.
Nov 23 09:35:10 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:10 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.805 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain podman[246015]: 2025-11-23 09:35:10.8084002 +0000 UTC m=+1.631834770 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.823 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.824 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e14dece-4417-4655-abda-4001a9da341c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.807503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4b54b54-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10453.985168916, 'message_signature': 'e714d998055215729c54afe4a975ec1cdac4b56aa5c768762935a5a08520d419'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.807503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4b55fc2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10453.985168916, 'message_signature': '8670c306b7c6f7612e9d8c3c956c89dd30498ef20db424f2505600dc8736a32f'}]}, 'timestamp': '2025-11-23 09:35:10.824471', '_unique_id': 'f51cff46dcfc463da673071ee7ae2e92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.825 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.827 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain podman[246015]: 2025-11-23 09:35:10.842395092 +0000 UTC m=+1.665829582 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95e4c657-543f-4c60-a810-29c122dcba2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.827351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4bbfcba-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '4b3abb465c9754901551cb876a8035d3f03bec3e2d13410b63754c8a64da4e30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.827351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4bc1056-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '242d445d99692efb5c694b4351be531ddd64f1e7946af909a13cdf71cccceecc'}]}, 'timestamp': '2025-11-23 09:35:10.868291', '_unique_id': '3377e36ce9474c5c8f7eb5b36e508704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6eb27313-48d1-4a4b-88e4-04d96b8720b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.870932', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4bd00c4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': '4faefa98886d43211aac6774bb5725b566e8e4dcc4d4b43e9d2a0765cc474bdb'}]}, 'timestamp': '2025-11-23 09:35:10.874473', '_unique_id': 'febbda1da03945a280147a6304cb66f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0288d635-2918-49c4-b042-b6a4f5d62235', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.876802', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4bd6f32-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10453.985168916, 'message_signature': '2b5a3932a39539f885061bed3f9b98f08cc9888d7ae78a2f6a4bce06e205bc17'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.876802', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4bd7efa-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10453.985168916, 'message_signature': 'b83618a6a24e3d08f77c0193583d3fa40d555dfed2746d54b497f5cbc646f332'}]}, 'timestamp': '2025-11-23 09:35:10.877666', '_unique_id': '81e45e9d17594b4dbb1e508286faf9a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd56586ca-bb17-46b1-8804-10b98e70f22b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.880012', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4bdeb38-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': 'dffcd5f409890b952a160fb853e2e5ef034a3d659925735f24dd1ef1076010fe'}]}, 'timestamp': '2025-11-23 09:35:10.880468', '_unique_id': 'd7fd2d0388cc4e3fb62edbeb63340c83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6e56135-6c80-4658-8f4b-db72aaa219ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.882558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4be4e20-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10453.985168916, 'message_signature': '992e432ae5c13507b309823344efbcbe46c7533412b158a1a0cd762cf4f76625'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.882558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4be5f14-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10453.985168916, 'message_signature': '4a87a780f6f194c06a9869725eccfe671f3051919ae0d1d52ea4e93fb2da2af2'}]}, 'timestamp': '2025-11-23 09:35:10.883401', '_unique_id': 'f7de781199d74cdb8326870cf6d41e72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '973510b1-7565-4443-bd42-d3d055d8d019', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:35:10.885667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b4c1b25e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.082133348, 'message_signature': '1379cc236ea3a1c613850636f958c010c29a02076431a91dfb980f52a212fcc5'}]}, 'timestamp': '2025-11-23 09:35:10.905244', '_unique_id': 'b9bc84b9379e4fae8637ccd351064c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.906 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.907 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 52860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0350f766-5e31-48d8-8489-b5afe4d66381', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52860000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:35:10.907563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b4c21f14-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.082133348, 'message_signature': '8b116d01c57e0f3e88499b85a2587b5e645f35a438407238c8cb42acb82431ca'}]}, 'timestamp': '2025-11-23 09:35:10.908029', '_unique_id': '6dc5a65c225f448581827dcf1fbb5176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd126526-e28c-486c-b8e9-f18cc8be81c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.910150', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c28558-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': 'eec57fdd029923a1856ebcba395d5745d0abc0f60acfcea29c2281f6e86afeff'}]}, 'timestamp': '2025-11-23 09:35:10.910630', '_unique_id': 'fe33213d730c4957bcbd33194c449c79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3448916e-1be8-492b-9e54-c999b641aec9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.912805', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c2ee9e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': '2a6969358ff69fb5a7d8368a1b048dd585347e2a927eb4edac5638fd3cf3dc99'}]}, 'timestamp': '2025-11-23 09:35:10.913324', '_unique_id': '0dcf42faee95436697508797a6bc6012'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c800107-81c5-4be4-97b9-24cfdd1a80a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.915386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4c350a0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '3958c32b0ebd0402cfc969bc3dca4ca2cdf3a3fe628b23192d5fe3bccaa4dfa3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.915386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4c3619e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '438e3d003bec3c524fd304439b441852ba8af2b0d98d198f90499ab32d1a1935'}]}, 'timestamp': '2025-11-23 09:35:10.916236', '_unique_id': 'a7302db7e35142a5a386469dffc4dec9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f72ac544-5f3f-458a-a01d-68f2c397f1c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.918368', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c3c53a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': 'dadc199663305c899c4e88267ef3e10e9e09206b22d56d7abbf9eb0422df22e5'}]}, 'timestamp': '2025-11-23 09:35:10.918812', '_unique_id': 'cd87effdd310446181bf4d23a3163637'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43acea74-bdee-4fdb-a65a-43856445bda5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.920867', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c4280e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': 'f43bd15029b65417c8b2b003169cbaeb785ad6d78d65647c40b12e10174e65e7'}]}, 'timestamp': '2025-11-23 09:35:10.921343', '_unique_id': '120835bc2849436ca64dc48e0097286d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ff4507e-b061-4935-84f4-537b75a80432', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.923473', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c48c90-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': '1d391fb826ffd48136a2de7dd914893fe865aefea008b68860f1f9abfb5d8510'}]}, 'timestamp': '2025-11-23 09:35:10.923945', '_unique_id': 'd3f9e5e04fec4d8186dc2de5c9ff4024'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09f04868-506d-4645-a93f-5eb7d7547e50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.926146', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c4f4f0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': 'd16acc757594e358cb7bac96095fb94803267fa91acbf0800fe110ff27867d1b'}]}, 'timestamp': '2025-11-23 09:35:10.926591', '_unique_id': 'ccedacdb7a0f4631bb1be8ab34438d82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '622bb7cb-838e-4ada-875f-02e35795e6a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.928747', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c55bd4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': '2c2f51a9115f6b874f57bc271e1e5ad71b97f2acbec35bada6988069a4981927'}]}, 'timestamp': '2025-11-23 09:35:10.929224', '_unique_id': 'ea4b8d74cadb480ba1d4be521daef7cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fdcdd44-4b91-460a-8168-3f82cd31d9b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.931263', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4c5bcb4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '6f02abdb7488097c349ccd7806eee392c36ebf1cc100b91134557d41b6cd93b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.931263', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4c5cc54-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': 'dbefd2395748b24265ccd917b3dec9f66a6f76f7a320e4eddcf6a3188e32931f'}]}, 'timestamp': '2025-11-23 09:35:10.932101', '_unique_id': 'c8c27a3d69ea402dbf1f9cb5528d7543'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d478375-01fe-4c5f-9e20-e74019039b19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:35:10.935310', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'b4c65d72-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.04858036, 'message_signature': 'a1bfdb354283e190cbd7630c0dbc0162c2e9ba432da86f81b6535091a7e40470'}]}, 'timestamp': '2025-11-23 09:35:10.935867', '_unique_id': '983c1e7501dc430aa03af72a03ea3742'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.938 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.938 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b9b818e-d664-4474-9a59-72f3e786546d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.938036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4c6c69a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '0265bce180c1376dd0b3398b642dc0cf96a8f40dabab4392b986d8ae8f4cd9fc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.938036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4c6d1d0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': 'bf7f691fd94149ed361dbf2b57e7f5c5fe0ed7799fc3768c1aea562fc4a91416'}]}, 'timestamp': '2025-11-23 09:35:10.938699', '_unique_id': 'f9719cbfcb5a4ca3b286c95b44f3f5ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.940 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.940 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e31cb911-ee43-4da7-8377-ab6e3306a3a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.940094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4c712ee-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '2a28391bba4fdc8d3d92e36164903b0bd40f921f5bab41a5b2d9a5e8727317bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.940094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4c71ce4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': 'c1ee1fc62fbcbb20960a87af55ce1887849102b6543ffe57977e0c3ecb7249bd'}]}, 'timestamp': '2025-11-23 09:35:10.940615', '_unique_id': '6dcd7192a0804dbc9272290735cf0018'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d64d004-4e8c-4e99-b52e-479fcaac3a02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:35:10.941986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4c75cb8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': '761d53b2602caf28fc04a8c89a377540824cb6cd31ed95b98e16e45fe075cb95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:35:10.941986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4c766a4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10454.005001433, 'message_signature': 'bc3a5a7b902cc00c96a0e49d6a94c1de2314ff9a5eb1f90010cf5ed5ff04242b'}]}, 'timestamp': '2025-11-23 09:35:10.942502', '_unique_id': '66ff49ff11b540d7a6bbcb0a287137eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:35:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:35:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:35:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:35:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19635 DF PROTO=TCP SPT=47770 DPT=9882 SEQ=1297036305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4A3760000000001030307) 
Nov 23 09:35:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:35:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:35:13 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:13 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:35:13 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:35:13 np0005532585.localdomain sudo[245998]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:13 np0005532585.localdomain podman[246037]: 2025-11-23 09:35:13.160184367 +0000 UTC m=+1.715886587 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 23 09:35:13 np0005532585.localdomain podman[246037]: 2025-11-23 09:35:13.176297411 +0000 UTC m=+1.731999591 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:35:13 np0005532585.localdomain sudo[246163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbipsczworybpekhuijzmcjequcdyzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890513.7911093-2711-59444792007380/AnsiballZ_podman_container_exec.py
Nov 23 09:35:14 np0005532585.localdomain sudo[246163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:14 np0005532585.localdomain python3.9[246165]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:35:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19637 DF PROTO=TCP SPT=47770 DPT=9882 SEQ=1297036305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4AF600000000001030307) 
Nov 23 09:35:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 23 09:35:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:15.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:15.281 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:15.282 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:35:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:15.282 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:15.309 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:15.309 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b-merged.mount: Deactivated successfully.
Nov 23 09:35:16 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:16 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:16 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:35:16 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:16 np0005532585.localdomain systemd[1]: Started libpod-conmon-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.scope.
Nov 23 09:35:16 np0005532585.localdomain podman[246166]: 2025-11-23 09:35:16.203102659 +0000 UTC m=+1.948960001 container exec a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:35:16 np0005532585.localdomain podman[246166]: 2025-11-23 09:35:16.237241785 +0000 UTC m=+1.983099097 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:35:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:35:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:17 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:17 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:17 np0005532585.localdomain sudo[246163]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:17 np0005532585.localdomain sudo[246302]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfixvhxulkjgbpohvjffinghdmjjytwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890517.5759647-2719-108739728825195/AnsiballZ_podman_container_exec.py
Nov 23 09:35:17 np0005532585.localdomain sudo[246302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:18 np0005532585.localdomain podman[240668]: time="2025-11-23T09:35:18Z" level=error msg="Getting root fs size for \"76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 23 09:35:18 np0005532585.localdomain python3.9[246304]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36018 DF PROTO=TCP SPT=32894 DPT=9102 SEQ=1535383150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4BC200000000001030307) 
Nov 23 09:35:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:18 np0005532585.localdomain systemd[1]: libpod-conmon-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.scope: Deactivated successfully.
Nov 23 09:35:18 np0005532585.localdomain systemd[1]: Started libpod-conmon-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.scope.
Nov 23 09:35:18 np0005532585.localdomain podman[246305]: 2025-11-23 09:35:18.758435965 +0000 UTC m=+0.641337189 container exec a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:35:18 np0005532585.localdomain podman[246305]: 2025-11-23 09:35:18.794417998 +0000 UTC m=+0.677319182 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:35:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:20.310 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:20.312 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:20.313 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:35:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:20.313 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:20.352 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:20.353 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:35:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 23 09:35:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473-merged.mount: Deactivated successfully.
Nov 23 09:35:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473-merged.mount: Deactivated successfully.
Nov 23 09:35:20 np0005532585.localdomain sudo[246302]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:21 np0005532585.localdomain sudo[246442]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dertuohtxuekhktgusamnokftbosreeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890520.949638-2727-212247089856970/AnsiballZ_file.py
Nov 23 09:35:21 np0005532585.localdomain sudo[246442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:21 np0005532585.localdomain python3.9[246444]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:35:21 np0005532585.localdomain sudo[246442]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 23 09:35:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:35:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:35:21 np0005532585.localdomain systemd[1]: libpod-conmon-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.scope: Deactivated successfully.
Nov 23 09:35:21 np0005532585.localdomain sudo[246552]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihpfptctqfdyfsjsthozjynjgxulsoxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890521.680606-2736-20879143193913/AnsiballZ_podman_container_info.py
Nov 23 09:35:21 np0005532585.localdomain sudo[246552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:22 np0005532585.localdomain python3.9[246554]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 23 09:35:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:35:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250-merged.mount: Deactivated successfully.
Nov 23 09:35:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 23 09:35:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43053 DF PROTO=TCP SPT=60316 DPT=9102 SEQ=3023913204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4D0A00000000001030307) 
Nov 23 09:35:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:35:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:35:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:35:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:25.354 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:25.360 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:25 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:25 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:25 np0005532585.localdomain sudo[246552]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:25 np0005532585.localdomain podman[246568]: 2025-11-23 09:35:25.666823022 +0000 UTC m=+2.106362256 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:35:25 np0005532585.localdomain podman[246568]: 2025-11-23 09:35:25.677420077 +0000 UTC m=+2.116959311 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Nov 23 09:35:26 np0005532585.localdomain sudo[246694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxbvkclwxyhyimedbmzxpxkhaudihwbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890525.8101087-2744-230027926183737/AnsiballZ_podman_container_exec.py
Nov 23 09:35:26 np0005532585.localdomain sudo[246694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:26 np0005532585.localdomain python3.9[246696]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19639 DF PROTO=TCP SPT=47770 DPT=9882 SEQ=1297036305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4E0210000000001030307) 
Nov 23 09:35:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:27 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:27 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:35:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.scope.
Nov 23 09:35:27 np0005532585.localdomain podman[246697]: 2025-11-23 09:35:27.698949862 +0000 UTC m=+1.428600031 container exec 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:35:27 np0005532585.localdomain podman[246697]: 2025-11-23 09:35:27.729279772 +0000 UTC m=+1.458929921 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:35:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:28 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:28 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:28 np0005532585.localdomain sudo[246694]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:35:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully.
Nov 23 09:35:29 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:29 np0005532585.localdomain systemd[1]: libpod-conmon-1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.scope: Deactivated successfully.
Nov 23 09:35:29 np0005532585.localdomain sudo[246833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkzweiysqejdkdhwkvjmxnteknopipkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890529.4791868-2752-268343508587812/AnsiballZ_podman_container_exec.py
Nov 23 09:35:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58246 DF PROTO=TCP SPT=36116 DPT=9101 SEQ=3443640407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4E97A0000000001030307) 
Nov 23 09:35:29 np0005532585.localdomain sudo[246833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3810 DF PROTO=TCP SPT=44980 DPT=9105 SEQ=3590607699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4EA2B0000000001030307) 
Nov 23 09:35:29 np0005532585.localdomain python3.9[246835]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:30 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:30 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:30 np0005532585.localdomain systemd[1]: Started libpod-conmon-1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.scope.
Nov 23 09:35:30 np0005532585.localdomain podman[246836]: 2025-11-23 09:35:30.112968626 +0000 UTC m=+0.111293672 container exec 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:35:30 np0005532585.localdomain podman[246836]: 2025-11-23 09:35:30.147731202 +0000 UTC m=+0.146056258 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:35:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:30.358 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:30.367 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:35:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 23 09:35:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 23 09:35:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b-merged.mount: Deactivated successfully.
Nov 23 09:35:32 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:32 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:32 np0005532585.localdomain sudo[246833]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:32 np0005532585.localdomain podman[246865]: 2025-11-23 09:35:32.506813833 +0000 UTC m=+1.547845036 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:35:32 np0005532585.localdomain podman[246865]: 2025-11-23 09:35:32.518227733 +0000 UTC m=+1.559258966 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:35:32 np0005532585.localdomain podman[246865]: unhealthy
Nov 23 09:35:32 np0005532585.localdomain sudo[246995]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgpatmzktccatwlpogkmchzmcheilxrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890532.6138437-2760-40376464947466/AnsiballZ_file.py
Nov 23 09:35:32 np0005532585.localdomain sudo[246995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58248 DF PROTO=TCP SPT=36116 DPT=9101 SEQ=3443640407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D4F5A10000000001030307) 
Nov 23 09:35:33 np0005532585.localdomain python3.9[246997]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:35:33 np0005532585.localdomain sudo[246995]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:33 np0005532585.localdomain sudo[247105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdbjtlgykxqdlfigxzsmhfxfrjlvfewt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890533.292354-2769-123756749955350/AnsiballZ_podman_container_info.py
Nov 23 09:35:33 np0005532585.localdomain sudo[247105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:35:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:35:33 np0005532585.localdomain python3.9[247107]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 23 09:35:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 23 09:35:34 np0005532585.localdomain systemd[1]: libpod-conmon-1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.scope: Deactivated successfully.
Nov 23 09:35:34 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:35:34 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:35:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 23 09:35:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:35.361 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:35.368 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:35:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 23 09:35:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28684 DF PROTO=TCP SPT=50622 DPT=9100 SEQ=450288367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D502200000000001030307) 
Nov 23 09:35:36 np0005532585.localdomain podman[247119]: 2025-11-23 09:35:36.330188449 +0000 UTC m=+0.740966399 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:35:36 np0005532585.localdomain podman[247119]: 2025-11-23 09:35:36.372708559 +0000 UTC m=+0.783486539 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:35:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 23 09:35:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:35:37 np0005532585.localdomain sudo[247105]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:37 np0005532585.localdomain podman[247145]: 2025-11-23 09:35:37.460135495 +0000 UTC m=+0.086790376 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64)
Nov 23 09:35:37 np0005532585.localdomain podman[247145]: 2025-11-23 09:35:37.473974262 +0000 UTC m=+0.100629183 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:35:37 np0005532585.localdomain sudo[247282]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiglchyyjzfdkzowbvsngtfepcpbxbaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890537.5587065-2777-229015462070719/AnsiballZ_podman_container_exec.py
Nov 23 09:35:37 np0005532585.localdomain sudo[247282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:38 np0005532585.localdomain python3.9[247284]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:38 np0005532585.localdomain systemd[1]: Started libpod-conmon-ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.scope.
Nov 23 09:35:38 np0005532585.localdomain podman[247285]: 2025-11-23 09:35:38.177678351 +0000 UTC m=+0.111853689 container exec ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=)
Nov 23 09:35:38 np0005532585.localdomain podman[247285]: 2025-11-23 09:35:38.211322138 +0000 UTC m=+0.145497436 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 09:35:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:35:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 23 09:35:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250-merged.mount: Deactivated successfully.
Nov 23 09:35:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250-merged.mount: Deactivated successfully.
Nov 23 09:35:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12680 DF PROTO=TCP SPT=52656 DPT=9100 SEQ=637467846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D50E600000000001030307) 
Nov 23 09:35:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 23 09:35:39 np0005532585.localdomain podman[247144]: 2025-11-23 09:35:39.747612049 +0000 UTC m=+2.381857444 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:35:39 np0005532585.localdomain sudo[247282]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:39 np0005532585.localdomain podman[247144]: 2025-11-23 09:35:39.780316306 +0000 UTC m=+2.414561721 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:35:40 np0005532585.localdomain sudo[247428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jofewmwibqlwarpeoupgrcimmnkypsnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890539.9292672-2785-275593091015625/AnsiballZ_podman_container_exec.py
Nov 23 09:35:40 np0005532585.localdomain sudo[247428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:40.365 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:40.370 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:40 np0005532585.localdomain python3.9[247430]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 23 09:35:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:35:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26936 DF PROTO=TCP SPT=49348 DPT=9882 SEQ=2677714958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D518A60000000001030307) 
Nov 23 09:35:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: libpod-conmon-ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.scope: Deactivated successfully.
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:35:42 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:42 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: Started libpod-conmon-ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.scope.
Nov 23 09:35:42 np0005532585.localdomain podman[247431]: 2025-11-23 09:35:42.282508498 +0000 UTC m=+1.836990860 container exec ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:35:42 np0005532585.localdomain podman[247431]: 2025-11-23 09:35:42.312857223 +0000 UTC m=+1.867339555 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:35:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:44 np0005532585.localdomain sudo[247428]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:44 np0005532585.localdomain podman[247459]: 2025-11-23 09:35:44.353554671 +0000 UTC m=+0.410730871 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:35:44 np0005532585.localdomain podman[247459]: 2025-11-23 09:35:44.366344914 +0000 UTC m=+0.423521174 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:35:44 np0005532585.localdomain systemd[1]: libpod-conmon-ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.scope: Deactivated successfully.
Nov 23 09:35:44 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:35:44 np0005532585.localdomain sudo[247591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfdmnlxoqfhwowfxomeibrnbnnbkdyrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890544.4525554-2793-208458334350323/AnsiballZ_file.py
Nov 23 09:35:44 np0005532585.localdomain sudo[247591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:35:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26938 DF PROTO=TCP SPT=49348 DPT=9882 SEQ=2677714958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D524A00000000001030307) 
Nov 23 09:35:44 np0005532585.localdomain python3.9[247593]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:35:44 np0005532585.localdomain sudo[247591]: pam_unix(sudo:session): session closed for user root
Nov 23 09:35:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 23 09:35:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully.
Nov 23 09:35:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:45.369 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:45.375 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:35:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully.
Nov 23 09:35:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:35:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:46 np0005532585.localdomain podman[247611]: 2025-11-23 09:35:46.641003412 +0000 UTC m=+0.097709661 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:35:46 np0005532585.localdomain podman[247611]: 2025-11-23 09:35:46.652091694 +0000 UTC m=+0.108797963 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:35:46 np0005532585.localdomain systemd[1]: tmp-crun.LWsLnk.mount: Deactivated successfully.
Nov 23 09:35:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 23 09:35:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 23 09:35:47 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:35:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43055 DF PROTO=TCP SPT=60316 DPT=9102 SEQ=3023913204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D530200000000001030307) 
Nov 23 09:35:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 23 09:35:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 23 09:35:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 23 09:35:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 23 09:35:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:50.373 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:50.375 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb-merged.mount: Deactivated successfully.
Nov 23 09:35:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 23 09:35:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:35:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:35:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 23 09:35:51 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:35:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 23 09:35:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:35:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27557 DF PROTO=TCP SPT=33334 DPT=9102 SEQ=2546460128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D545A10000000001030307) 
Nov 23 09:35:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:35:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:35:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:35:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:35:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:35:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227-merged.mount: Deactivated successfully.
Nov 23 09:35:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Nov 23 09:35:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Nov 23 09:35:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Nov 23 09:35:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:35:55.375 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:35:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 23 09:35:57 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26940 DF PROTO=TCP SPT=49348 DPT=9882 SEQ=2677714958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D554200000000001030307) 
Nov 23 09:35:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:35:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:35:58 np0005532585.localdomain podman[247627]: 2025-11-23 09:35:58.01484388 +0000 UTC m=+0.074988332 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:35:58 np0005532585.localdomain podman[247627]: 2025-11-23 09:35:58.024827378 +0000 UTC m=+0.084971830 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 23 09:35:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:35:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:35:59 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:35:59 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:59 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:35:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64128 DF PROTO=TCP SPT=35362 DPT=9101 SEQ=1621211674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D55EA90000000001030307) 
Nov 23 09:35:59 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48589 DF PROTO=TCP SPT=41182 DPT=9105 SEQ=1986526438 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D55F5B0000000001030307) 
Nov 23 09:36:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:00 np0005532585.localdomain podman[240668]: time="2025-11-23T09:36:00Z" level=error msg="Getting root fs size for \"a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Nov 23 09:36:00 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:00 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:00.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:00.378 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:00.379 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:00.379 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:00.380 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:00.381 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb-merged.mount: Deactivated successfully.
Nov 23 09:36:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb-merged.mount: Deactivated successfully.
Nov 23 09:36:03 np0005532585.localdomain sudo[247646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:36:03 np0005532585.localdomain sudo[247646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:36:03 np0005532585.localdomain sudo[247646]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:03 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48591 DF PROTO=TCP SPT=41182 DPT=9105 SEQ=1986526438 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D56B600000000001030307) 
Nov 23 09:36:03 np0005532585.localdomain sudo[247664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 09:36:03 np0005532585.localdomain sudo[247664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:36:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 23 09:36:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af-merged.mount: Deactivated successfully.
Nov 23 09:36:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af-merged.mount: Deactivated successfully.
Nov 23 09:36:03 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:03 np0005532585.localdomain sudo[247664]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:04 np0005532585.localdomain sudo[247702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:36:04 np0005532585.localdomain sudo[247702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:36:04 np0005532585.localdomain sudo[247702]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:04 np0005532585.localdomain sudo[247720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:36:04 np0005532585.localdomain sudo[247720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:36:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:36:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 23 09:36:04 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:04.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:04 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:04.731 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:36:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:04.731 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:36:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:04.731 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:36:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:04.732 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:36:04 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:04.732 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:36:04 np0005532585.localdomain podman[247750]: 2025-11-23 09:36:04.79452417 +0000 UTC m=+0.107829525 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:36:04 np0005532585.localdomain podman[247750]: 2025-11-23 09:36:04.803217847 +0000 UTC m=+0.116523172 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:36:04 np0005532585.localdomain podman[247750]: unhealthy
Nov 23 09:36:05 np0005532585.localdomain sudo[247720]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.163 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.217 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.218 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.381 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.383 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.383 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.383 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.384 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.385 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.405 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.406 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12341MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.406 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.406 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.509 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.509 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.509 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.523 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:36:05 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:36:05 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:36:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.584 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.585 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.604 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.630 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: HW_CPU_X86_F16C,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:36:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:05.663 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:36:05 np0005532585.localdomain podman[240668]: time="2025-11-23T09:36:05Z" level=error msg="Getting root fs size for \"a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Nov 23 09:36:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:05 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:06.086 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:36:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:06.092 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:36:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:06.105 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:36:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:06.107 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:36:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:06.107 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:36:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:06 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26813 DF PROTO=TCP SPT=57838 DPT=9100 SEQ=4169834911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D578200000000001030307) 
Nov 23 09:36:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:06 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227-merged.mount: Deactivated successfully.
Nov 23 09:36:06 np0005532585.localdomain podman[240668]: time="2025-11-23T09:36:06Z" level=error msg="Getting rw size for \"a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8\": unmounting layer e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9: replacing mount point \"/var/lib/containers/storage/overlay/e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9/merged\": device or resource busy"
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.108 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.108 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.126 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.126 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.126 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:36:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:36:07 np0005532585.localdomain podman[247835]: 2025-11-23 09:36:07.537319746 +0000 UTC m=+0.092386998 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 09:36:07 np0005532585.localdomain podman[247835]: 2025-11-23 09:36:07.616593269 +0000 UTC m=+0.171660521 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller)
Nov 23 09:36:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Nov 23 09:36:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Nov 23 09:36:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-63ab7b1688305e5f88e4974e557ea0bb87f0e73ce1236c8e61a48437546ff0ac-merged.mount: Deactivated successfully.
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.663 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.664 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.664 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:36:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:07.664 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:36:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.030 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.043 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.043 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.043 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.044 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.044 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.045 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:08 np0005532585.localdomain sudo[247871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:36:08 np0005532585.localdomain sudo[247871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:36:08 np0005532585.localdomain sudo[247871]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:08.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:36:09.248 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:36:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:36:09.249 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:36:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:36:09.250 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:36:09 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13094 DF PROTO=TCP SPT=38202 DPT=9100 SEQ=3105715667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D583A00000000001030307) 
Nov 23 09:36:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:09 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:36:09 np0005532585.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:09 np0005532585.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:09 np0005532585.localdomain podman[240668]: time="2025-11-23T09:36:09Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged: invalid argument"
Nov 23 09:36:09 np0005532585.localdomain podman[240668]: time="2025-11-23T09:36:09Z" level=error msg="Getting root fs size for \"c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b\": getting diffsize of layer \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\" and its parent \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\": creating overlay mount to /var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/NFUIYHX665CHWW36YXSJIBWYTL:/var/lib/containers/storage/overlay/l/JFEBLJKSZVKBBKIZ32WYGWWNT6:/var/lib/containers/storage/overlay/l/5TMMTLYMJES7T4R72OOYVFFCPF,upperdir=/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/diff,workdir=/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/work,nodev,metacopy=on\": no such file or directory"
Nov 23 09:36:09 np0005532585.localdomain podman[247860]: 2025-11-23 09:36:09.756824624 +0000 UTC m=+1.816395834 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:36:09 np0005532585.localdomain podman[247860]: 2025-11-23 09:36:09.772379313 +0000 UTC m=+1.831950513 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 09:36:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:10.385 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:10.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:36:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:10.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:36:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:11 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29558 DF PROTO=TCP SPT=40324 DPT=9882 SEQ=1115219792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D58DD70000000001030307) 
Nov 23 09:36:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5-merged.mount: Deactivated successfully.
Nov 23 09:36:12 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:12 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:36:12 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:36:12 np0005532585.localdomain systemd[1]: tmp-crun.OJLJJw.mount: Deactivated successfully.
Nov 23 09:36:12 np0005532585.localdomain podman[247897]: 2025-11-23 09:36:12.475725774 +0000 UTC m=+0.081632717 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Nov 23 09:36:12 np0005532585.localdomain podman[247897]: 2025-11-23 09:36:12.483151453 +0000 UTC m=+0.089058406 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 23 09:36:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:36:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:36:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:36:13 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:36:13 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:13 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:36:14 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29560 DF PROTO=TCP SPT=40324 DPT=9882 SEQ=1115219792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D599E00000000001030307) 
Nov 23 09:36:15 np0005532585.localdomain podman[247915]: 2025-11-23 09:36:15.02931509 +0000 UTC m=+0.085990462 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:36:15 np0005532585.localdomain podman[247915]: 2025-11-23 09:36:15.033754287 +0000 UTC m=+0.090429619 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:36:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:15.385 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:15.389 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:36:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af-merged.mount: Deactivated successfully.
Nov 23 09:36:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af-merged.mount: Deactivated successfully.
Nov 23 09:36:17 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:36:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:36:17 np0005532585.localdomain podman[247936]: 2025-11-23 09:36:17.825861803 +0000 UTC m=+0.127143299 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:36:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 23 09:36:18 np0005532585.localdomain podman[247936]: 2025-11-23 09:36:18.048430763 +0000 UTC m=+0.349712269 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 23 09:36:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27559 DF PROTO=TCP SPT=33334 DPT=9102 SEQ=2546460128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5A6210000000001030307) 
Nov 23 09:36:18 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:18 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:36:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 23 09:36:19 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:19 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:19 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:19 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:19 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:19 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:20.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:20.390 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:36:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 23 09:36:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-63ab7b1688305e5f88e4974e557ea0bb87f0e73ce1236c8e61a48437546ff0ac-merged.mount: Deactivated successfully.
Nov 23 09:36:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a8785be8dea5fa0361315af1fc74fe453e62e737d2ff3d773f6811b45d15cd9a-merged.mount: Deactivated successfully.
Nov 23 09:36:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 23 09:36:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 23 09:36:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 23 09:36:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35042 DF PROTO=TCP SPT=43398 DPT=9102 SEQ=4184572961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5BAE10000000001030307) 
Nov 23 09:36:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:36:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 23 09:36:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:36:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:36:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:36:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:36:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:36:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:25.390 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:25.392 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:25.392 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:25.393 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:25.423 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:25.424 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 23 09:36:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:26 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:26 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:27 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29562 DF PROTO=TCP SPT=40324 DPT=9882 SEQ=1115219792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5CA200000000001030307) 
Nov 23 09:36:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:36:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:36:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:36:29 np0005532585.localdomain podman[247954]: 2025-11-23 09:36:29.35508177 +0000 UTC m=+0.082603468 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:36:29 np0005532585.localdomain podman[247954]: 2025-11-23 09:36:29.36319514 +0000 UTC m=+0.090716818 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:36:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48423 DF PROTO=TCP SPT=50924 DPT=9101 SEQ=1717505873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5D3DA0000000001030307) 
Nov 23 09:36:29 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49955 DF PROTO=TCP SPT=59302 DPT=9105 SEQ=1764690671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5D48B0000000001030307) 
Nov 23 09:36:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 23 09:36:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5-merged.mount: Deactivated successfully.
Nov 23 09:36:30 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:30.424 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:30.426 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:30.426 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:30.427 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:30.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:30.454 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:31 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:31 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:31 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:36:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:32 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48425 DF PROTO=TCP SPT=50924 DPT=9101 SEQ=1717505873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5DFE10000000001030307) 
Nov 23 09:36:32 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:32 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:32 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:36:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:36:34 np0005532585.localdomain sudo[248063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvytxeeiepscbjomagopikifgecyjwaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890594.0940251-3039-48228899027630/AnsiballZ_file.py
Nov 23 09:36:34 np0005532585.localdomain sudo[248063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 23 09:36:34 np0005532585.localdomain python3.9[248065]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:34 np0005532585.localdomain sudo[248063]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:35 np0005532585.localdomain sudo[248173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnymyveicsuxcaqlzigvwkvtnxqwygcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890594.9422784-3067-99197595990099/AnsiballZ_stat.py
Nov 23 09:36:35 np0005532585.localdomain sudo[248173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:35 np0005532585.localdomain python3.9[248175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:35 np0005532585.localdomain sudo[248173]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:35.455 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:35.457 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:35.457 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:35.457 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:35.495 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:35.495 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:36:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:35 np0005532585.localdomain podman[248225]: 2025-11-23 09:36:35.694343015 +0000 UTC m=+0.085538708 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:36:35 np0005532585.localdomain podman[248225]: 2025-11-23 09:36:35.726436273 +0000 UTC m=+0.117631956 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:36:35 np0005532585.localdomain podman[248225]: unhealthy
Nov 23 09:36:35 np0005532585.localdomain sudo[248283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihdygqakeyvyqrnmyfmyqglvqrhmuklb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890594.9422784-3067-99197595990099/AnsiballZ_copy.py
Nov 23 09:36:35 np0005532585.localdomain sudo[248283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:35 np0005532585.localdomain python3.9[248285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890594.9422784-3067-99197595990099/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:35 np0005532585.localdomain sudo[248283]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:36 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12683 DF PROTO=TCP SPT=52656 DPT=9100 SEQ=637467846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5EC210000000001030307) 
Nov 23 09:36:36 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5-merged.mount: Deactivated successfully.
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 09:36:36 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Failed with result 'exit-code'.
Nov 23 09:36:36 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:36 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:36 np0005532585.localdomain sudo[248393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ashrxkgdobkudkyailluwbbaqnmvsfcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890596.3109384-3114-87567611113987/AnsiballZ_file.py
Nov 23 09:36:36 np0005532585.localdomain sudo[248393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:36 np0005532585.localdomain python3.9[248395]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:36 np0005532585.localdomain sudo[248393]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:37 np0005532585.localdomain sudo[248503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsegkgmjbgqbngjtyoarjymqshcuirpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890596.9385698-3138-270214560211830/AnsiballZ_stat.py
Nov 23 09:36:37 np0005532585.localdomain sudo[248503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:36:37 np0005532585.localdomain python3.9[248505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:37 np0005532585.localdomain sudo[248503]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:36:37 np0005532585.localdomain sudo[248560]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkvommhkzwqwbvaddhfefrjtqiqexsyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890596.9385698-3138-270214560211830/AnsiballZ_file.py
Nov 23 09:36:37 np0005532585.localdomain sudo[248560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:37 np0005532585.localdomain python3.9[248562]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:37 np0005532585.localdomain sudo[248560]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 23 09:36:38 np0005532585.localdomain sudo[248670]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guljxgijzkeuqsjwzsuqlhdravckhssx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890598.059364-3174-122142384801438/AnsiballZ_stat.py
Nov 23 09:36:38 np0005532585.localdomain sudo[248670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a8785be8dea5fa0361315af1fc74fe453e62e737d2ff3d773f6811b45d15cd9a-merged.mount: Deactivated successfully.
Nov 23 09:36:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:38 np0005532585.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 23 09:36:38 np0005532585.localdomain python3.9[248672]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:38 np0005532585.localdomain sudo[248670]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:38 np0005532585.localdomain sudo[248727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bayxdtyswnrlusivaihngavlcnuvqdbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890598.059364-3174-122142384801438/AnsiballZ_file.py
Nov 23 09:36:38 np0005532585.localdomain sudo[248727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:39 np0005532585.localdomain python3.9[248729]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ffkff0pr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:39 np0005532585.localdomain sudo[248727]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:39 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18761 DF PROTO=TCP SPT=59220 DPT=9100 SEQ=919474547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D5F8E00000000001030307) 
Nov 23 09:36:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:36:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310-merged.mount: Deactivated successfully.
Nov 23 09:36:39 np0005532585.localdomain podman[240668]: time="2025-11-23T09:36:39Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Nov 23 09:36:39 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:31:37 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Nov 23 09:36:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310-merged.mount: Deactivated successfully.
Nov 23 09:36:39 np0005532585.localdomain sudo[248837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tddgldhqisrzpumfixeyeupetbyjltqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890599.2218404-3210-117585695649176/AnsiballZ_stat.py
Nov 23 09:36:39 np0005532585.localdomain sudo[248837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:39 np0005532585.localdomain python3.9[248839]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:39 np0005532585.localdomain sudo[248837]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:39 np0005532585.localdomain sudo[248894]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozujcgntfcmjdnbbzvobbyaucqrxhawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890599.2218404-3210-117585695649176/AnsiballZ_file.py
Nov 23 09:36:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:36:39 np0005532585.localdomain sudo[248894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:40 np0005532585.localdomain podman[248896]: 2025-11-23 09:36:40.029354475 +0000 UTC m=+0.090009215 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:36:40 np0005532585.localdomain podman[248896]: 2025-11-23 09:36:40.066984096 +0000 UTC m=+0.127638856 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:36:40 np0005532585.localdomain python3.9[248897]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:40 np0005532585.localdomain sudo[248894]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 23 09:36:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:40.496 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:40.498 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:40.498 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:40.498 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:40.541 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:40.542 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:40 np0005532585.localdomain sudo[249030]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udwomktlcepibpqrhkmwmucjcmfdqqci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890600.5057077-3249-111024503922108/AnsiballZ_command.py
Nov 23 09:36:40 np0005532585.localdomain sudo[249030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:36:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 23 09:36:40 np0005532585.localdomain python3.9[249032]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:36:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 23 09:36:40 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:36:41 np0005532585.localdomain sudo[249030]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:36:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:36:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 23 09:36:41 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44437 DF PROTO=TCP SPT=38910 DPT=9882 SEQ=941438419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D603070000000001030307) 
Nov 23 09:36:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:36:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 23 09:36:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:36:42 np0005532585.localdomain podman[249089]: 2025-11-23 09:36:42.281935233 +0000 UTC m=+0.068529213 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 09:36:42 np0005532585.localdomain podman[249089]: 2025-11-23 09:36:42.321246624 +0000 UTC m=+0.107840634 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:36:42 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:36:42 np0005532585.localdomain sudo[249161]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbshuutbzhvfzxksmufyintqgracylpx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890602.1059237-3273-198434494940133/AnsiballZ_edpm_nftables_from_files.py
Nov 23 09:36:42 np0005532585.localdomain sudo[249161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:42 np0005532585.localdomain python3[249164]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 09:36:42 np0005532585.localdomain sudo[249161]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 23 09:36:43 np0005532585.localdomain sudo[249272]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-popdnvbtrobnquszglmtrzoziuzuwojm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890603.016007-3297-20604195674847/AnsiballZ_stat.py
Nov 23 09:36:43 np0005532585.localdomain sudo[249272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:43 np0005532585.localdomain python3.9[249274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:43 np0005532585.localdomain sudo[249272]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 23 09:36:43 np0005532585.localdomain sudo[249329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pztgteajwjoppivdgwbnojsvymfcfniy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890603.016007-3297-20604195674847/AnsiballZ_file.py
Nov 23 09:36:43 np0005532585.localdomain sudo[249329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:36:43 np0005532585.localdomain python3.9[249331]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:44 np0005532585.localdomain sudo[249329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:44 np0005532585.localdomain podman[249332]: 2025-11-23 09:36:44.038234174 +0000 UTC m=+0.092793711 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:36:44 np0005532585.localdomain podman[249332]: 2025-11-23 09:36:44.046229051 +0000 UTC m=+0.100788548 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:36:44 np0005532585.localdomain systemd[1]: tmp-crun.G02UJg.mount: Deactivated successfully.
Nov 23 09:36:44 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44439 DF PROTO=TCP SPT=38910 DPT=9882 SEQ=941438419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D60F200000000001030307) 
Nov 23 09:36:45 np0005532585.localdomain sudo[249457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etljhnujgmybtdyoaonjmrpwbqqkisrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890605.039023-3333-212229114096483/AnsiballZ_stat.py
Nov 23 09:36:45 np0005532585.localdomain sudo[249457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:45.543 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:45.545 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:45.545 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:45.545 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:45.583 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:45.583 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:45 np0005532585.localdomain python3.9[249459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:45 np0005532585.localdomain sudo[249457]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:45 np0005532585.localdomain sudo[249514]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvhoixdieezhzieryfcsbwmrafjuqtaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890605.039023-3333-212229114096483/AnsiballZ_file.py
Nov 23 09:36:45 np0005532585.localdomain sudo[249514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:36:46 np0005532585.localdomain python3.9[249516]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:46 np0005532585.localdomain sudo[249514]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:36:46 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:36:46 np0005532585.localdomain sudo[249624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztmztkfnwmkvilrgkvdtbvjrvdwiexrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890606.3403537-3369-196167826426435/AnsiballZ_stat.py
Nov 23 09:36:46 np0005532585.localdomain sudo[249624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:46 np0005532585.localdomain python3.9[249626]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:46 np0005532585.localdomain sudo[249624]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:47 np0005532585.localdomain sudo[249681]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmydyzgyhincdblatekfqezaewlthsxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890606.3403537-3369-196167826426435/AnsiballZ_file.py
Nov 23 09:36:47 np0005532585.localdomain sudo[249681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:47 np0005532585.localdomain python3.9[249683]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:47 np0005532585.localdomain sudo[249681]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:36:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:47 np0005532585.localdomain podman[249739]: 2025-11-23 09:36:47.749496121 +0000 UTC m=+0.087544109 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:36:47 np0005532585.localdomain podman[249739]: 2025-11-23 09:36:47.788337718 +0000 UTC m=+0.126385696 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:36:47 np0005532585.localdomain sudo[249813]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjopvadtzhbdwbgwhdgwoyxeyaxgyaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890607.559811-3405-245205009526669/AnsiballZ_stat.py
Nov 23 09:36:47 np0005532585.localdomain sudo[249813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 23 09:36:48 np0005532585.localdomain python3.9[249815]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:48 np0005532585.localdomain sudo[249813]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:48 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:36:48 np0005532585.localdomain sudo[249870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjimdqdwahfnurglxmrzbhcmkstxvzsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890607.559811-3405-245205009526669/AnsiballZ_file.py
Nov 23 09:36:48 np0005532585.localdomain sudo[249870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35044 DF PROTO=TCP SPT=43398 DPT=9102 SEQ=4184572961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D61C200000000001030307) 
Nov 23 09:36:48 np0005532585.localdomain python3.9[249872]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:48 np0005532585.localdomain sudo[249870]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:36:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:49 np0005532585.localdomain podman[249928]: 2025-11-23 09:36:49.030218334 +0000 UTC m=+0.084893107 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd)
Nov 23 09:36:49 np0005532585.localdomain podman[249928]: 2025-11-23 09:36:49.042409911 +0000 UTC m=+0.097084734 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251118, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:36:49 np0005532585.localdomain sudo[250001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvouncizygtpkhjrgkgtscwzgyordnfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890608.809193-3442-13907295190807/AnsiballZ_stat.py
Nov 23 09:36:49 np0005532585.localdomain sudo[250001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 23 09:36:49 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:36:49 np0005532585.localdomain python3.9[250003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:36:49 np0005532585.localdomain sudo[250001]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:49 np0005532585.localdomain sudo[250091]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqjpkuchkkhucibakcmoocuqfrobinsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890608.809193-3442-13907295190807/AnsiballZ_copy.py
Nov 23 09:36:49 np0005532585.localdomain sudo[250091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:49 np0005532585.localdomain python3.9[250093]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763890608.809193-3442-13907295190807/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:49 np0005532585.localdomain sudo[250091]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:50 np0005532585.localdomain sudo[250201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmgtkljfnhagyjpdksyjuymhcynenmht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890610.2584894-3486-179981471156183/AnsiballZ_file.py
Nov 23 09:36:50 np0005532585.localdomain sudo[250201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:50.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:50.586 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:50.587 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:50.587 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:50.628 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:50.629 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:50 np0005532585.localdomain python3.9[250203]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:50 np0005532585.localdomain sudo[250201]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:51 np0005532585.localdomain sudo[250311]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbuntsmyhtidrbzgyafyvdwvxnlqhouc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890610.9839754-3510-29493848514214/AnsiballZ_command.py
Nov 23 09:36:51 np0005532585.localdomain sudo[250311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:51 np0005532585.localdomain python3.9[250313]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:36:51 np0005532585.localdomain sudo[250311]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:52 np0005532585.localdomain sudo[250424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nttafnkzkdptyjhvkabkdjpyvvczhrdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890611.7945914-3534-68677019821205/AnsiballZ_blockinfile.py
Nov 23 09:36:52 np0005532585.localdomain sudo[250424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 23 09:36:52 np0005532585.localdomain python3.9[250426]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:52 np0005532585.localdomain sudo[250424]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5-merged.mount: Deactivated successfully.
Nov 23 09:36:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5-merged.mount: Deactivated successfully.
Nov 23 09:36:53 np0005532585.localdomain sudo[250534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axgmegiwbcxppxcxpssgowlidbjvpsmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890612.926433-3562-53158895015012/AnsiballZ_command.py
Nov 23 09:36:53 np0005532585.localdomain sudo[250534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:53 np0005532585.localdomain python3.9[250536]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:36:53 np0005532585.localdomain sudo[250534]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=253 DF PROTO=TCP SPT=43814 DPT=9102 SEQ=423925314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D630200000000001030307) 
Nov 23 09:36:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:36:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:36:53 np0005532585.localdomain sudo[250645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcdjjpaskiopuexqygopzyjxshojnqwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890613.6311061-3585-186961092002442/AnsiballZ_stat.py
Nov 23 09:36:53 np0005532585.localdomain sudo[250645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:54 np0005532585.localdomain python3.9[250647]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:36:54 np0005532585.localdomain sudo[250645]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 23 09:36:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:54 np0005532585.localdomain sudo[250757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmovpupwxxpmqgqrlkfexpjjnrlsldzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890614.343807-3609-164386972315255/AnsiballZ_command.py
Nov 23 09:36:54 np0005532585.localdomain sudo[250757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 23 09:36:54 np0005532585.localdomain python3.9[250759]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:36:54 np0005532585.localdomain sudo[250757]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:55 np0005532585.localdomain auditd[727]: Audit daemon rotating log files
Nov 23 09:36:55 np0005532585.localdomain sudo[250870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnwrpltshbunjlnryohexdwxtxkpiosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890615.041936-3633-130550318526183/AnsiballZ_file.py
Nov 23 09:36:55 np0005532585.localdomain sudo[250870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:36:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 23 09:36:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310-merged.mount: Deactivated successfully.
Nov 23 09:36:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310-merged.mount: Deactivated successfully.
Nov 23 09:36:55 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:31:42 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 142713 "" "Go-http-client/1.1"
Nov 23 09:36:55 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:36:55.489Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 23 09:36:55 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:36:55.490Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 23 09:36:55 np0005532585.localdomain podman_exporter[240889]: ts=2025-11-23T09:36:55.490Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Nov 23 09:36:55 np0005532585.localdomain python3.9[250872]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:36:55 np0005532585.localdomain sudo[250870]: pam_unix(sudo:session): session closed for user root
Nov 23 09:36:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:55.630 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:55.632 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:36:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:55.632 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:36:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:55.633 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:55.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:36:55 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:36:55.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:36:56 np0005532585.localdomain sshd[243489]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:36:56 np0005532585.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Nov 23 09:36:56 np0005532585.localdomain systemd[1]: session-56.scope: Consumed 28.367s CPU time.
Nov 23 09:36:56 np0005532585.localdomain systemd-logind[761]: Session 56 logged out. Waiting for processes to exit.
Nov 23 09:36:56 np0005532585.localdomain systemd-logind[761]: Removed session 56.
Nov 23 09:36:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:36:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:36:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:36:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:36:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:36:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:37:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:37:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:37:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:00 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:37:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:37:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:00.672 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:00.673 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:00.674 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:00.674 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:00.703 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:00 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:00.704 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=254 DF PROTO=TCP SPT=43814 DPT=9102 SEQ=423925314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D650200000000001030307) 
Nov 23 09:37:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:37:02 np0005532585.localdomain podman[250896]: 2025-11-23 09:37:02.03043586 +0000 UTC m=+0.083255827 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:37:02 np0005532585.localdomain podman[250896]: 2025-11-23 09:37:02.070235377 +0000 UTC m=+0.123055374 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:37:02 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:37:02 np0005532585.localdomain sshd[250915]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:37:02 np0005532585.localdomain sshd[250915]: Accepted publickey for zuul from 192.168.122.30 port 48384 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:37:02 np0005532585.localdomain systemd-logind[761]: New session 57 of user zuul.
Nov 23 09:37:02 np0005532585.localdomain systemd[1]: Started Session 57 of User zuul.
Nov 23 09:37:02 np0005532585.localdomain sshd[250915]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:37:03 np0005532585.localdomain sudo[251026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeumbqgihwzkvbnmwubzuvqwpzdcjasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890622.6653101-27-16069491302465/AnsiballZ_file.py
Nov 23 09:37:03 np0005532585.localdomain sudo[251026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:03 np0005532585.localdomain python3.9[251028]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:03 np0005532585.localdomain sudo[251026]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:03 np0005532585.localdomain sudo[251136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnhnhbmjmmecztuftjtnwigccbrjdoce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890623.5570343-27-127342157294353/AnsiballZ_file.py
Nov 23 09:37:03 np0005532585.localdomain sudo[251136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:04 np0005532585.localdomain python3.9[251138]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:04 np0005532585.localdomain sudo[251136]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:04 np0005532585.localdomain sudo[251246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkelauearadbpbrhpbirechvgamrpwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890624.214133-27-278032027116401/AnsiballZ_file.py
Nov 23 09:37:04 np0005532585.localdomain sudo[251246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:04 np0005532585.localdomain python3.9[251248]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:04 np0005532585.localdomain sudo[251246]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:05 np0005532585.localdomain python3.9[251356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.704 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.706 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.706 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.706 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.736 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.737 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.741 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.742 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.742 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.743 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:37:05 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:05.743 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:37:06 np0005532585.localdomain python3.9[251462]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890624.8607254-105-32040181574826/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.156 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.235 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.236 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.420 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.421 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12358MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.421 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.422 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.473 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.473 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.473 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.507 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:37:06 np0005532585.localdomain python3.9[251573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.921 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.964 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:37:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.978 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.980 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:37:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:06.980 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:37:07 np0005532585.localdomain podman[251644]: 2025-11-23 09:37:07.057724426 +0000 UTC m=+0.078402267 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:37:07 np0005532585.localdomain podman[251644]: 2025-11-23 09:37:07.065592959 +0000 UTC m=+0.086270800 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:37:07 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:37:07 np0005532585.localdomain python3.9[251693]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890626.3337617-150-93623025259296/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:07 np0005532585.localdomain python3.9[251811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:07.980 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:07.981 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:07.981 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:37:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:07.981 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:37:08 np0005532585.localdomain python3.9[251897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890627.387503-150-184845381003822/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:08 np0005532585.localdomain sudo[251986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:37:08 np0005532585.localdomain sudo[251986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:37:08 np0005532585.localdomain sudo[251986]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:08.687 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:37:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:08.687 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:37:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:08.688 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:37:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:08.688 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:37:08 np0005532585.localdomain sudo[252024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:37:08 np0005532585.localdomain sudo[252024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:37:08 np0005532585.localdomain python3.9[252022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:37:09.249 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:37:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:37:09.251 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:37:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:37:09.252 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:37:09 np0005532585.localdomain python3.9[252141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890628.4458354-150-10218077851948/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=15ac25f966f9cb8b3bd5b9b2dce4587a8dd8a371 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:09 np0005532585.localdomain sudo[252024]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.720 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.742 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.743 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.744 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.744 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.745 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:09.745 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:10 np0005532585.localdomain sudo[252176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:37:10 np0005532585.localdomain sudo[252176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:37:10 np0005532585.localdomain sudo[252176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.718 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.737 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.739 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.739 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.739 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.772 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:10.773 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:10 np0005532585.localdomain python3.9[252284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.801 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.801 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.804 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '195d1d12-c7aa-41ec-b871-60a1cfd63c36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.801752', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc390754-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': 'd47454a52b33f6d33b3325ca18c6930bd6759a9c16d1a41ffc632834ad207e42'}]}, 'timestamp': '2025-11-23 09:37:10.805433', '_unique_id': '062cee9cdb744cf7b132b4b992e1aba8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf5b26bf-66a9-4b43-b51f-042c8626a703', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.806929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc3fb860-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '905dca719bc02c91e67995e9395265fcb23b259ce51a5718a8b9b65f12833efc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.806929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc3fc210-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': 'cfb585266bf8e8d1d45e48f178528b71f0c3ef5a40f5285729af01481a4897a9'}]}, 'timestamp': '2025-11-23 09:37:10.849488', '_unique_id': 'cbfcda770bbd4c008f08898ef7db121a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ed5666e-b462-405a-8552-5d36591a095b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.850988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc400536-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '19d8e31e2da3c8a56f546ff8b1fcb21f700fae78adb228416ef5024e0167a553'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.850988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc400cb6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': 'd79d1ae3fba450409fc6a9713d268c6a0f8a02cf359f11040fe7fa9821c7234a'}]}, 'timestamp': '2025-11-23 09:37:10.851389', '_unique_id': '0b185d8c3786412e8438e0b495bfd4f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.851 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e053c28-77e6-4d99-a336-5fef9650a18b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.852403', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc403c68-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': '2c550518ddd88f34391477bc9902a653f34717fdc4d5978c9116887bc99b5074'}]}, 'timestamp': '2025-11-23 09:37:10.852626', '_unique_id': '98ac099eefb3403fb47f2d829ba8f80f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.853 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '722a45e7-4679-4cfd-9354-743c06b03626', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.853632', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc406c56-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': '0165bb88e81df8ddd89bb18d56400871107056e35b6bad811fabb79e8efaefc5'}]}, 'timestamp': '2025-11-23 09:37:10.853852', '_unique_id': '3fa6e0cbff724823b883b54aa37a83e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.854 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24a17303-4c5f-4ec6-a626-fda19eb03ca5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.855070', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc40a4c8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': 'eb9970b0ac4597681131622f2275eb0953bdaf0ec008339ae6a81ef30406e4c8'}]}, 'timestamp': '2025-11-23 09:37:10.855297', '_unique_id': 'c003aec1fdd141958af83515673bfc95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c98dd1c0-f628-4d48-8354-1d19d909ffbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.856323', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc40d574-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': '1a1f8f8a4a809cb2ca762a24c7f8d7459409421ddad799d0fa3df64b3ef72160'}]}, 'timestamp': '2025-11-23 09:37:10.856542', '_unique_id': 'f070ae66f70249949b0ef8600ae919d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.856 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.857 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3782737-3f66-4f0a-b779-fdcc285030ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.857540', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc4104f4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': '70341ab760c1f42eff5434ed64a928e67d1a05f031ab3e7315ba78853cb21bbd'}]}, 'timestamp': '2025-11-23 09:37:10.857758', '_unique_id': 'ad050b2df18f4e4baa9687abf5b7d24c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.858 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35209d33-e768-4972-9f8d-aa8273d3b107', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.858776', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc431a78-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.036388222, 'message_signature': '2b53e15ad5041c1db272dcb156e6ac1a482512349cc454702dddf4c327fc075b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.858776', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc432284-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.036388222, 'message_signature': 'b7e47f260c00e5605557f3a97718d89848ad90c2d9ccde8f8c6eed26411dbdd0'}]}, 'timestamp': '2025-11-23 09:37:10.871609', '_unique_id': '9cc9babd4624493e9c3f933eedb3d1a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4d6e4ee-9393-4c96-ad44-d3fb18adfce4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.872680', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc4354ca-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': '776f32310b2b7df78433b4b377071613e392a5714d346f31d5bf0dbb3f222849'}]}, 'timestamp': '2025-11-23 09:37:10.872931', '_unique_id': '4f73172920f042478019e3920c0500e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07f9ee18-2a26-4034-9bdd-57c680cb5e62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.873988', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc438756-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': 'a3aefc8818293812883f66f19a0a0ffd60c358466a0a8a05dc7c2beaa5d99899'}]}, 'timestamp': '2025-11-23 09:37:10.874213', '_unique_id': '7446d62a7b1344c69dfb7e7f8435f3ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.875 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.875 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bc85438-3e3c-47f7-8745-1953a76ab92c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.875364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc43beb0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '30ea797b8710a14a8fd0c677e186a0c4d18a6426bb4ee20da1bd2516fa4eb50f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.875364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc43c950-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': 'd0bdee15e2b5c2b10bcd7f70b6ee50a135d7f0679175120927c23ce2bbbf30f9'}]}, 'timestamp': '2025-11-23 09:37:10.875941', '_unique_id': '3a6215b699114e82994913513b68d6be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd502de3-63bc-4a9b-8171-2a16b7c25959', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.877358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc440cb2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '3618d9bee58bad02a4629a9106a536efddb4c144400ef1bc23c4388bd5208823'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.877358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc441766-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '440b2b887fea4d0725f3794d38b992e735a0e3885755915baebc74af22392410'}]}, 'timestamp': '2025-11-23 09:37:10.878033', '_unique_id': '575eb4f8da9b42c6bef8d2a978c6f994'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6a67af1-1409-44e2-bed1-69a451057a9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.879451', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc445e60-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.036388222, 'message_signature': '3352908a74f4dfc410f1b9c088c355992638f271207d7e15dda01707915b0cda'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.879451', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc4468e2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.036388222, 'message_signature': '86ec077870ea8c0bf2ed4da5668dbf4a3ae769ebd0dab12966be234ee36b28a5'}]}, 'timestamp': '2025-11-23 09:37:10.880026', '_unique_id': '17d263f12a6f49c7b9c38d195f7c87b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '806fbc0e-e485-414c-b87a-805a211aca5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.881455', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc44acd0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': 'cbdb8703ab55bd602d6cf79344f3f434905ef563afea47bc02dd8c5160486926'}]}, 'timestamp': '2025-11-23 09:37:10.881759', '_unique_id': '20c9e07b83824ae09b66e92ee548f8f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 53840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc30ab5e-d871-46a2-a090-4da13e46694c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53840000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:37:10.883183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fc48a57e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.084885087, 'message_signature': 'bb0cba19f808f5a0f52778956d306433486e8e35e19688b930cbf60350f419f1'}]}, 'timestamp': '2025-11-23 09:37:10.907814', '_unique_id': '526a0410b4a54a35a123d91133877681'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0341501-2d42-4f1c-9034-0b5495c8f407', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:37:10.909665', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'fc48fa92-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.979348204, 'message_signature': '8e4f9bd97aa600df263a143ac36c490ebacd957c8c40d431c0bf69994933cebb'}]}, 'timestamp': '2025-11-23 09:37:10.909986', '_unique_id': '77c2d948997a48ac97ab7c478fd56dd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1463b13c-93b3-4b26-a8d6-7453820ff873', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.911529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc494308-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': 'a7a3eb89a0815083d36452a8b7477c38eccce3ed65c17fcd25cbc4f574582d76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.911529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc494e0c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '522982cbaea40351177089202f7de85d572e3af9bd95c2f667e66a7a8d849eaf'}]}, 'timestamp': '2025-11-23 09:37:10.912079', '_unique_id': 'c4668c4bb57e4919a7b2f87fd3d692ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.913 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.913 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '621e1bef-f67b-4db3-bff3-7fe007be5add', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.913598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc4993e4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.036388222, 'message_signature': '4e66978e202ed1fa6ef4d3d61db3b754bc8f9ded7b66c1c39bd885f855b0f401'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.913598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc499ee8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.036388222, 'message_signature': '14a8f92f4a2ef2da313138c1674881e5294ccde780664d9a162530bf176147c3'}]}, 'timestamp': '2025-11-23 09:37:10.914149', '_unique_id': '1fccf969bbad44ed95e0ebe692e06346'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '663d6a01-d02d-4ce6-9d48-8eba30ab3c85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:37:10.915519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fc49df02-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10574.084885087, 'message_signature': 'ab53d53ada343b2dfb1a38587e5265cb198f847d9d8818f6b5728642b5fa068a'}]}, 'timestamp': '2025-11-23 09:37:10.915797', '_unique_id': '4c98e51a5ec34bc9b4f4e0df551e4d49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '826feb32-4e72-471b-b0ef-f4ca8b084d60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:37:10.917143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fc4a1e54-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': 'c1d6eae39493bb1405941652f37b7907db9158bfeac480bd79e1c84d0c3b0e83'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:37:10.917143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fc4a287c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10573.984521304, 'message_signature': '5138b3dd60882615aac27ecaab3abcd577fc1c51ae5d9faa7b229c8ef91add80'}]}, 'timestamp': '2025-11-23 09:37:10.917669', '_unique_id': '3d8ee8d808f24ca1a2143d4fd8716e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:37:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:37:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:37:11 np0005532585.localdomain python3.9[252370]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890630.3134668-324-149590681080002/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=1067e04911e84d9dc262158a63dd8e464b0e5dfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:11.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:37:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:11.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:37:11 np0005532585.localdomain python3.9[252478]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:37:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:37:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:37:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:37:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144349 "" "Go-http-client/1.1"
Nov 23 09:37:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:37:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:37:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16348 "" "Go-http-client/1.1"
Nov 23 09:37:12 np0005532585.localdomain systemd[1]: tmp-crun.1uixS1.mount: Deactivated successfully.
Nov 23 09:37:12 np0005532585.localdomain podman[252483]: 2025-11-23 09:37:12.051416159 +0000 UTC m=+0.108203476 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:37:12 np0005532585.localdomain podman[252483]: 2025-11-23 09:37:12.080050811 +0000 UTC m=+0.136838178 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:37:12 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:37:12 np0005532585.localdomain sudo[252615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urzfqaifpydzexibyvfnetbkdzfvutep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890632.1470928-396-57026940737986/AnsiballZ_file.py
Nov 23 09:37:12 np0005532585.localdomain sudo[252615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:12 np0005532585.localdomain python3.9[252617]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:12 np0005532585.localdomain sudo[252615]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:37:13 np0005532585.localdomain podman[252689]: 2025-11-23 09:37:13.032720904 +0000 UTC m=+0.086986342 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Nov 23 09:37:13 np0005532585.localdomain podman[252689]: 2025-11-23 09:37:13.045516809 +0000 UTC m=+0.099782237 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 23 09:37:13 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:37:13 np0005532585.localdomain sudo[252746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efzhopnrrnijnpultjhzadejcynushil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890632.8004642-420-50952068131888/AnsiballZ_stat.py
Nov 23 09:37:13 np0005532585.localdomain sudo[252746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:13 np0005532585.localdomain python3.9[252748]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:13 np0005532585.localdomain sudo[252746]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:13 np0005532585.localdomain sudo[252803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sofufzncsecfcyoclerwgxcuizljxzix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890632.8004642-420-50952068131888/AnsiballZ_file.py
Nov 23 09:37:13 np0005532585.localdomain sudo[252803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:13 np0005532585.localdomain python3.9[252805]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:13 np0005532585.localdomain sudo[252803]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:14 np0005532585.localdomain sudo[252913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrfnvhdqojdmgvgnlytescczklbuzced ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890633.8859396-420-211942699603376/AnsiballZ_stat.py
Nov 23 09:37:14 np0005532585.localdomain sudo[252913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:14 np0005532585.localdomain python3.9[252915]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:14 np0005532585.localdomain sudo[252913]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:14 np0005532585.localdomain sudo[252970]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvzunhncwsdiqsmvgalnjiynwroypzlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890633.8859396-420-211942699603376/AnsiballZ_file.py
Nov 23 09:37:14 np0005532585.localdomain sudo[252970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:14 np0005532585.localdomain python3.9[252972]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:14 np0005532585.localdomain sudo[252970]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:15 np0005532585.localdomain sudo[253080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmugfcftqsxtdwmxszuiqapgyypwfzns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890634.9087164-489-133719667397612/AnsiballZ_file.py
Nov 23 09:37:15 np0005532585.localdomain sudo[253080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:15 np0005532585.localdomain python3.9[253082]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:15 np0005532585.localdomain sudo[253080]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:15.774 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:15.776 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:15.777 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:15.777 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:15.807 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:15.808 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42259 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D6897B0000000001030307) 
Nov 23 09:37:16 np0005532585.localdomain sudo[253190]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvriqdiegigufsdgyfornqwclfthcqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890636.2398741-513-260541713471770/AnsiballZ_stat.py
Nov 23 09:37:16 np0005532585.localdomain sudo[253190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:37:16 np0005532585.localdomain systemd[1]: tmp-crun.FO3GUw.mount: Deactivated successfully.
Nov 23 09:37:16 np0005532585.localdomain podman[253193]: 2025-11-23 09:37:16.65929115 +0000 UTC m=+0.096105974 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:37:16 np0005532585.localdomain podman[253193]: 2025-11-23 09:37:16.664227692 +0000 UTC m=+0.101042516 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:37:16 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:37:16 np0005532585.localdomain python3.9[253192]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:16 np0005532585.localdomain sudo[253190]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:16 np0005532585.localdomain sudo[253267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drwdwvmtajjsjplxabjlalnuqzifnlro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890636.2398741-513-260541713471770/AnsiballZ_file.py
Nov 23 09:37:16 np0005532585.localdomain sudo[253267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:17 np0005532585.localdomain python3.9[253269]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:17 np0005532585.localdomain sudo[253267]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42260 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D68DA00000000001030307) 
Nov 23 09:37:17 np0005532585.localdomain sudo[253377]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvxexappwfgdwpkzfbxsjcymyailgmun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890637.406837-549-19002656439659/AnsiballZ_stat.py
Nov 23 09:37:17 np0005532585.localdomain sudo[253377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=255 DF PROTO=TCP SPT=43814 DPT=9102 SEQ=423925314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D690200000000001030307) 
Nov 23 09:37:18 np0005532585.localdomain python3.9[253379]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:18 np0005532585.localdomain sudo[253377]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:18 np0005532585.localdomain sudo[253434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydtlllyjnrjebttrlpmbtzkfxdmxlqet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890637.406837-549-19002656439659/AnsiballZ_file.py
Nov 23 09:37:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:37:18 np0005532585.localdomain sudo[253434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:18 np0005532585.localdomain podman[253436]: 2025-11-23 09:37:18.395770611 +0000 UTC m=+0.069366619 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:37:18 np0005532585.localdomain podman[253436]: 2025-11-23 09:37:18.429230522 +0000 UTC m=+0.102826550 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:37:18 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:37:18 np0005532585.localdomain python3.9[253437]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:18 np0005532585.localdomain sudo[253434]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42261 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D695A00000000001030307) 
Nov 23 09:37:19 np0005532585.localdomain sudo[253566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fewrrendcpxmzmrwnoptsowajchcphvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890638.725872-585-65100720258515/AnsiballZ_systemd.py
Nov 23 09:37:19 np0005532585.localdomain sudo[253566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:37:19 np0005532585.localdomain systemd[1]: tmp-crun.NXBCyA.mount: Deactivated successfully.
Nov 23 09:37:19 np0005532585.localdomain podman[253569]: 2025-11-23 09:37:19.978324958 +0000 UTC m=+0.091321496 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 23 09:37:19 np0005532585.localdomain podman[253569]: 2025-11-23 09:37:19.993249718 +0000 UTC m=+0.106246266 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:37:20 np0005532585.localdomain python3.9[253568]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:37:20 np0005532585.localdomain systemd-rc-local-generator[253612]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:37:20 np0005532585.localdomain systemd-sysv-generator[253617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35045 DF PROTO=TCP SPT=43398 DPT=9102 SEQ=4184572961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D69A200000000001030307) 
Nov 23 09:37:20 np0005532585.localdomain sudo[253566]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.808 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.810 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.811 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.811 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.845 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.845 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:20 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:20.847 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:21 np0005532585.localdomain sudo[253733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pezkzhakqumdzxproascqbluctlcoqvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890640.7184708-609-236595582902725/AnsiballZ_stat.py
Nov 23 09:37:21 np0005532585.localdomain sudo[253733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:21 np0005532585.localdomain python3.9[253735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:21 np0005532585.localdomain sudo[253733]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:22 np0005532585.localdomain sudo[253790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wujqkiaqiwxzgzddwxweymabehluxrdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890640.7184708-609-236595582902725/AnsiballZ_file.py
Nov 23 09:37:22 np0005532585.localdomain sudo[253790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:22 np0005532585.localdomain python3.9[253792]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:22 np0005532585.localdomain sudo[253790]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:22 np0005532585.localdomain sudo[253900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmmrhhlgpiylemqwtpucbhhhrsvwunlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890642.6452162-645-93108699252935/AnsiballZ_stat.py
Nov 23 09:37:22 np0005532585.localdomain sudo[253900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:23 np0005532585.localdomain python3.9[253902]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:23 np0005532585.localdomain sudo[253900]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:23 np0005532585.localdomain sudo[253957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exyvqzczxvzacyhppulzbjemphxftclx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890642.6452162-645-93108699252935/AnsiballZ_file.py
Nov 23 09:37:23 np0005532585.localdomain sudo[253957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42262 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D6A5600000000001030307) 
Nov 23 09:37:23 np0005532585.localdomain python3.9[253959]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:23 np0005532585.localdomain sudo[253957]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:24 np0005532585.localdomain sudo[254067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veeucjyiatqwfmyourdexlohjaazfcff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890643.7805269-681-237318964502932/AnsiballZ_systemd.py
Nov 23 09:37:24 np0005532585.localdomain sudo[254067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:24 np0005532585.localdomain python3.9[254069]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:37:24 np0005532585.localdomain systemd-sysv-generator[254097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:37:24 np0005532585.localdomain systemd-rc-local-generator[254093]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:37:24 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:37:24 np0005532585.localdomain sudo[254067]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:25 np0005532585.localdomain sudo[254219]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqslubmlwgmrvetznzxsbgkiqexpbpmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890645.3641098-711-260356659063154/AnsiballZ_file.py
Nov 23 09:37:25 np0005532585.localdomain sudo[254219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:25 np0005532585.localdomain python3.9[254221]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:37:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:25.848 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:25.851 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:25.851 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:25.851 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:25 np0005532585.localdomain sudo[254219]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:25.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:25 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:25.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:26 np0005532585.localdomain sudo[254329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maetqtxvintrxwoyknwbsszqlkhnrgas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890646.0631018-735-181285469941966/AnsiballZ_stat.py
Nov 23 09:37:26 np0005532585.localdomain sudo[254329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:26 np0005532585.localdomain python3.9[254331]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:37:26 np0005532585.localdomain sudo[254329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:27 np0005532585.localdomain sudo[254417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhytzkafaccekvlqwpyqlxleajpgoxcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890646.0631018-735-181285469941966/AnsiballZ_copy.py
Nov 23 09:37:27 np0005532585.localdomain sudo[254417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:27 np0005532585.localdomain python3.9[254419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890646.0631018-735-181285469941966/.source.json _original_basename=.2dy13t8v follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:27 np0005532585.localdomain sudo[254417]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:27 np0005532585.localdomain sudo[254527]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfztubuaqyfdiwrahgbhbwxjvdhmrlgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890647.5902374-780-191162379823495/AnsiballZ_file.py
Nov 23 09:37:27 np0005532585.localdomain sudo[254527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:28 np0005532585.localdomain python3.9[254529]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:28 np0005532585.localdomain sudo[254527]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:28 np0005532585.localdomain sudo[254637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpdsdvgpmipjvrlovvbhlgthfpqwwodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890648.2563498-804-274147364829906/AnsiballZ_stat.py
Nov 23 09:37:28 np0005532585.localdomain sudo[254637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:28 np0005532585.localdomain sudo[254637]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:29 np0005532585.localdomain sudo[254725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbxtyuhqbaiddbqpoydvktfsiecszdck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890648.2563498-804-274147364829906/AnsiballZ_copy.py
Nov 23 09:37:29 np0005532585.localdomain sudo[254725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:29 np0005532585.localdomain sudo[254725]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:37:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:37:30 np0005532585.localdomain sudo[254835]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omquwrrsuzkpjstndtzingryxezbiqcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890649.6334713-855-256470946610199/AnsiballZ_container_config_data.py
Nov 23 09:37:30 np0005532585.localdomain sudo[254835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:30 np0005532585.localdomain python3.9[254837]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Nov 23 09:37:30 np0005532585.localdomain sudo[254835]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:30.875 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:30.877 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:30.877 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:30.878 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:30.896 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:30 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:30.897 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:31 np0005532585.localdomain sudo[254945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atwzzquzgxefmimttfwbgncfgylrienm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890650.5026631-882-262269254542360/AnsiballZ_container_config_hash.py
Nov 23 09:37:31 np0005532585.localdomain sudo[254945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:31 np0005532585.localdomain python3.9[254947]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:37:31 np0005532585.localdomain sudo[254945]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42263 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D6C6200000000001030307) 
Nov 23 09:37:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:37:32 np0005532585.localdomain podman[254964]: 2025-11-23 09:37:32.297528142 +0000 UTC m=+0.086897729 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:37:32 np0005532585.localdomain podman[254964]: 2025-11-23 09:37:32.332326665 +0000 UTC m=+0.121696252 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 23 09:37:32 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:37:32 np0005532585.localdomain sudo[255076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maaivkttibebcbfqguugpkmamlyuanws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890652.4051054-909-171157815942289/AnsiballZ_podman_container_info.py
Nov 23 09:37:32 np0005532585.localdomain sudo[255076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:33 np0005532585.localdomain python3.9[255078]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 09:37:33 np0005532585.localdomain sudo[255076]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:35.898 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:35.900 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:35.900 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:35.900 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:35.930 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:35 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:35.931 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:37 np0005532585.localdomain sudo[255211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eczmiufvffhugaftweqljhqeersmdcro ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890656.6616156-948-278592748385781/AnsiballZ_edpm_container_manage.py
Nov 23 09:37:37 np0005532585.localdomain sudo[255211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:37:37 np0005532585.localdomain podman[255214]: 2025-11-23 09:37:37.283124796 +0000 UTC m=+0.079945435 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:37:37 np0005532585.localdomain podman[255214]: 2025-11-23 09:37:37.316490885 +0000 UTC m=+0.113311454 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:37:37 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:37:37 np0005532585.localdomain python3[255213]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:37:37 np0005532585.localdomain podman[255273]: 
Nov 23 09:37:37 np0005532585.localdomain podman[255273]: 2025-11-23 09:37:37.718115183 +0000 UTC m=+0.088581482 container create 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, tcib_managed=true, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:37:37 np0005532585.localdomain podman[255273]: 2025-11-23 09:37:37.673532689 +0000 UTC m=+0.043999028 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 23 09:37:37 np0005532585.localdomain python3[255213]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=d192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 23 09:37:37 np0005532585.localdomain sudo[255211]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:38 np0005532585.localdomain sudo[255418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tftjhsklnkpqiwzldjfczjhsinkbxwjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890658.115218-972-268722190358286/AnsiballZ_stat.py
Nov 23 09:37:38 np0005532585.localdomain sudo[255418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:38 np0005532585.localdomain python3.9[255420]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:37:38 np0005532585.localdomain sudo[255418]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:39 np0005532585.localdomain sudo[255530]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qicndhotudjabsznmajqmnkspbawjhup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890658.8539288-999-226585644689851/AnsiballZ_file.py
Nov 23 09:37:39 np0005532585.localdomain sudo[255530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:39 np0005532585.localdomain python3.9[255532]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:39 np0005532585.localdomain sudo[255530]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:39 np0005532585.localdomain sudo[255585]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgydnxlpssypbhtbebmxbodanburpgff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890658.8539288-999-226585644689851/AnsiballZ_stat.py
Nov 23 09:37:39 np0005532585.localdomain sudo[255585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:39 np0005532585.localdomain python3.9[255587]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:37:39 np0005532585.localdomain sudo[255585]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:40 np0005532585.localdomain sudo[255694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfglsdttiqpctclhhywhdjnqjqgzkqcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890659.79506-999-241354458670538/AnsiballZ_copy.py
Nov 23 09:37:40 np0005532585.localdomain sudo[255694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:40 np0005532585.localdomain python3.9[255696]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890659.79506-999-241354458670538/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:37:40 np0005532585.localdomain sudo[255694]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:40 np0005532585.localdomain sudo[255749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppvlnsfsffdqhgpwcruksjwwsvwlqpmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890659.79506-999-241354458670538/AnsiballZ_systemd.py
Nov 23 09:37:40 np0005532585.localdomain sudo[255749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:40.932 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:40.934 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:40.934 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:40.935 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:40 np0005532585.localdomain python3.9[255751]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:37:40 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:37:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:40.969 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:40 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:40.970 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:41 np0005532585.localdomain systemd-rc-local-generator[255775]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:37:41 np0005532585.localdomain systemd-sysv-generator[255780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:41 np0005532585.localdomain sudo[255749]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:41 np0005532585.localdomain sudo[255840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltrvymrnxclbanyhpkzlsregqwplskld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890659.79506-999-241354458670538/AnsiballZ_systemd.py
Nov 23 09:37:41 np0005532585.localdomain sudo[255840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:41 np0005532585.localdomain python3.9[255842]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:37:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:37:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:37:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:37:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146306 "" "Go-http-client/1.1"
Nov 23 09:37:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:37:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16657 "" "Go-http-client/1.1"
Nov 23 09:37:41 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:37:42 np0005532585.localdomain systemd-rc-local-generator[255870]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:37:42 np0005532585.localdomain systemd-sysv-generator[255876]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: Starting neutron_sriov_agent container...
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: tmp-crun.9UAnDO.mount: Deactivated successfully.
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:37:42 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1105a81fdf35eedfaee35be7ff37c0efd44ec8eeca2ada53174bfd17baecbd5e/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 23 09:37:42 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1105a81fdf35eedfaee35be7ff37c0efd44ec8eeca2ada53174bfd17baecbd5e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:37:42 np0005532585.localdomain podman[255885]: 2025-11-23 09:37:42.437398626 +0000 UTC m=+0.130837384 container init 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=neutron_sriov_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:37:42 np0005532585.localdomain podman[255883]: 2025-11-23 09:37:42.444238877 +0000 UTC m=+0.140512523 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + sudo -E kolla_set_configs
Nov 23 09:37:42 np0005532585.localdomain podman[255885]: 2025-11-23 09:37:42.49920058 +0000 UTC m=+0.192639348 container start 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=neutron_sriov_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:37:42 np0005532585.localdomain podman[255885]: neutron_sriov_agent
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: Started neutron_sriov_agent container.
Nov 23 09:37:42 np0005532585.localdomain podman[255883]: 2025-11-23 09:37:42.511327435 +0000 UTC m=+0.207601051 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 23 09:37:42 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Validating config file
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Copying service configuration files
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Writing out command to execute
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: ++ cat /run_command
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + ARGS=
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + sudo kolla_copy_cacerts
Nov 23 09:37:42 np0005532585.localdomain sudo[255840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + [[ ! -n '' ]]
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + . kolla_extend_start
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + umask 0022
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 23 09:37:42 np0005532585.localdomain neutron_sriov_agent[255910]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 23 09:37:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:37:43 np0005532585.localdomain podman[255955]: 2025-11-23 09:37:43.268228106 +0000 UTC m=+0.079730080 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7)
Nov 23 09:37:43 np0005532585.localdomain podman[255955]: 2025-11-23 09:37:43.283665325 +0000 UTC m=+0.095167309 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 23 09:37:43 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.136 2 INFO neutron.common.config [-] Logging enabled!
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.136 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.137 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.137 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.137 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.137 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.137 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005532585.localdomain'}
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.137 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a53f591c-c9d9-4d52-87d5-f281ebdfee7d - - - - - -] RPC agent_id: nic-switch-agent.np0005532585.localdomain
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.142 2 INFO neutron.agent.agent_extensions_manager [None req-a53f591c-c9d9-4d52-87d5-f281ebdfee7d - - - - - -] Loaded agent extensions: ['qos']
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.142 2 INFO neutron.agent.agent_extensions_manager [None req-a53f591c-c9d9-4d52-87d5-f281ebdfee7d - - - - - -] Initializing agent extension 'qos'
Nov 23 09:37:44 np0005532585.localdomain sudo[256066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osyesenuknjfuplnayawbdeepdsqyvkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890663.9284256-1083-205843656722066/AnsiballZ_systemd.py
Nov 23 09:37:44 np0005532585.localdomain sudo[256066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.324 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a53f591c-c9d9-4d52-87d5-f281ebdfee7d - - - - - -] Agent initialized successfully, now running... 
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.326 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a53f591c-c9d9-4d52-87d5-f281ebdfee7d - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[255910]: 2025-11-23 09:37:44.326 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a53f591c-c9d9-4d52-87d5-f281ebdfee7d - - - - - -] Agent out of sync with plugin!
Nov 23 09:37:44 np0005532585.localdomain python3.9[256068]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: libpod-43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d.scope: Deactivated successfully.
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: libpod-43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d.scope: Consumed 1.764s CPU time.
Nov 23 09:37:44 np0005532585.localdomain podman[256072]: 2025-11-23 09:37:44.638501344 +0000 UTC m=+0.071512975 container died 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: tmp-crun.RnL3gQ.mount: Deactivated successfully.
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d-userdata-shm.mount: Deactivated successfully.
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1105a81fdf35eedfaee35be7ff37c0efd44ec8eeca2ada53174bfd17baecbd5e-merged.mount: Deactivated successfully.
Nov 23 09:37:44 np0005532585.localdomain podman[256072]: 2025-11-23 09:37:44.699550155 +0000 UTC m=+0.132561756 container cleanup 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:37:44 np0005532585.localdomain podman[256072]: neutron_sriov_agent
Nov 23 09:37:44 np0005532585.localdomain podman[256086]: 2025-11-23 09:37:44.701708942 +0000 UTC m=+0.060224187 container cleanup 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=neutron_sriov_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, container_name=neutron_sriov_agent)
Nov 23 09:37:44 np0005532585.localdomain podman[256098]: 2025-11-23 09:37:44.785244829 +0000 UTC m=+0.055585372 container cleanup 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, org.label-schema.build-date=20251118)
Nov 23 09:37:44 np0005532585.localdomain podman[256098]: neutron_sriov_agent
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: Starting neutron_sriov_agent container...
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:37:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1105a81fdf35eedfaee35be7ff37c0efd44ec8eeca2ada53174bfd17baecbd5e/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 23 09:37:44 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1105a81fdf35eedfaee35be7ff37c0efd44ec8eeca2ada53174bfd17baecbd5e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:37:44 np0005532585.localdomain podman[256110]: 2025-11-23 09:37:44.92993495 +0000 UTC m=+0.111136643 container init 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Nov 23 09:37:44 np0005532585.localdomain podman[256110]: 2025-11-23 09:37:44.940120546 +0000 UTC m=+0.121322299 container start 43c8d75c20c501be9b8d54b728d2240cabd1a5ada01462ebeef400506368453d (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'd192b9b29ffa9c1ea30e5452621ed5ee1d0c84bd088d5bea944d2011b4e04396'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:37:44 np0005532585.localdomain podman[256110]: neutron_sriov_agent
Nov 23 09:37:44 np0005532585.localdomain neutron_sriov_agent[256124]: + sudo -E kolla_set_configs
Nov 23 09:37:44 np0005532585.localdomain systemd[1]: Started neutron_sriov_agent container.
Nov 23 09:37:44 np0005532585.localdomain sudo[256066]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Validating config file
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Copying service configuration files
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Writing out command to execute
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: ++ cat /run_command
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + ARGS=
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + sudo kolla_copy_cacerts
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + [[ ! -n '' ]]
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + . kolla_extend_start
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + umask 0022
Nov 23 09:37:45 np0005532585.localdomain neutron_sriov_agent[256124]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 23 09:37:45 np0005532585.localdomain sshd[250915]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:37:45 np0005532585.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Nov 23 09:37:45 np0005532585.localdomain systemd[1]: session-57.scope: Consumed 22.739s CPU time.
Nov 23 09:37:45 np0005532585.localdomain systemd-logind[761]: Session 57 logged out. Waiting for processes to exit.
Nov 23 09:37:45 np0005532585.localdomain systemd-logind[761]: Removed session 57.
Nov 23 09:37:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:45.971 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:45.972 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:45.973 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:45 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:45.973 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:46.005 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:46.007 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21256 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D6FEAB0000000001030307) 
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.577 2 INFO neutron.common.config [-] Logging enabled!
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.577 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.578 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.578 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.578 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.578 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.578 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005532585.localdomain'}
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.579 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-51eeb97d-c88c-40b5-a0f5-e50b9624dccf - - - - - -] RPC agent_id: nic-switch-agent.np0005532585.localdomain
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.583 2 INFO neutron.agent.agent_extensions_manager [None req-51eeb97d-c88c-40b5-a0f5-e50b9624dccf - - - - - -] Loaded agent extensions: ['qos']
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.583 2 INFO neutron.agent.agent_extensions_manager [None req-51eeb97d-c88c-40b5-a0f5-e50b9624dccf - - - - - -] Initializing agent extension 'qos'
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.721 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-51eeb97d-c88c-40b5-a0f5-e50b9624dccf - - - - - -] Agent initialized successfully, now running... 
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.722 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-51eeb97d-c88c-40b5-a0f5-e50b9624dccf - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Nov 23 09:37:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:37:46.722 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-51eeb97d-c88c-40b5-a0f5-e50b9624dccf - - - - - -] Agent out of sync with plugin!
Nov 23 09:37:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:37:47 np0005532585.localdomain podman[256158]: 2025-11-23 09:37:47.022323952 +0000 UTC m=+0.079662518 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:37:47 np0005532585.localdomain podman[256158]: 2025-11-23 09:37:47.052986792 +0000 UTC m=+0.110325298 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 09:37:47 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:37:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21257 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D702A00000000001030307) 
Nov 23 09:37:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42264 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D706200000000001030307) 
Nov 23 09:37:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:37:49 np0005532585.localdomain podman[256178]: 2025-11-23 09:37:49.02707323 +0000 UTC m=+0.083678593 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:37:49 np0005532585.localdomain podman[256178]: 2025-11-23 09:37:49.038591167 +0000 UTC m=+0.095196570 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:37:49 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:37:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21258 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D70AA00000000001030307) 
Nov 23 09:37:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=256 DF PROTO=TCP SPT=43814 DPT=9102 SEQ=423925314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D70E200000000001030307) 
Nov 23 09:37:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:37:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:51.008 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:51.009 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:51.010 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:51.010 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:51.040 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:51.041 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:51 np0005532585.localdomain podman[256201]: 2025-11-23 09:37:51.047587786 +0000 UTC m=+0.100864784 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:37:51 np0005532585.localdomain podman[256201]: 2025-11-23 09:37:51.058136113 +0000 UTC m=+0.111413111 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:37:51 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:37:51 np0005532585.localdomain sshd[256220]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:37:51 np0005532585.localdomain sshd[256220]: Accepted publickey for zuul from 192.168.122.30 port 59916 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:37:51 np0005532585.localdomain systemd-logind[761]: New session 58 of user zuul.
Nov 23 09:37:51 np0005532585.localdomain systemd[1]: Started Session 58 of User zuul.
Nov 23 09:37:51 np0005532585.localdomain sshd[256220]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:37:52 np0005532585.localdomain python3.9[256331]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:37:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21259 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D71A610000000001030307) 
Nov 23 09:37:53 np0005532585.localdomain sudo[256443]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbdjerdcydbgwytqvbkitxgkrmacsotm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890673.3745308-66-170498050185300/AnsiballZ_setup.py
Nov 23 09:37:53 np0005532585.localdomain sudo[256443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:53 np0005532585.localdomain python3.9[256445]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:37:54 np0005532585.localdomain sudo[256443]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:54 np0005532585.localdomain sudo[256506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcaragkkycwrurbwdynhosoxachawvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890673.3745308-66-170498050185300/AnsiballZ_dnf.py
Nov 23 09:37:54 np0005532585.localdomain sudo[256506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:55 np0005532585.localdomain python3.9[256508]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:37:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:56.042 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:56.044 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:37:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:56.045 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:37:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:56.045 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:56.074 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:37:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:37:56.075 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:37:58 np0005532585.localdomain sudo[256506]: pam_unix(sudo:session): session closed for user root
Nov 23 09:37:59 np0005532585.localdomain sudo[256618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hndcbiwirpzpxfetphrmhmutxcgenyms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890678.4792018-102-188811411990798/AnsiballZ_systemd.py
Nov 23 09:37:59 np0005532585.localdomain sudo[256618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:37:59 np0005532585.localdomain python3.9[256620]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:37:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:37:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:38:00 np0005532585.localdomain sudo[256618]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:01.076 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:01.078 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:01.078 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:01.078 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:01.104 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:01.105 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:01 np0005532585.localdomain sudo[256731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjyglvayiefqbfiyakgqidbkpytoejhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890680.9146547-129-236655523386796/AnsiballZ_file.py
Nov 23 09:38:01 np0005532585.localdomain sudo[256731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21260 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D73A200000000001030307) 
Nov 23 09:38:01 np0005532585.localdomain python3.9[256733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:01 np0005532585.localdomain sudo[256731]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:01 np0005532585.localdomain sudo[256841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahqcqmaezgwtcoqjrmauvywlxpelsxpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890681.6869087-129-97890769739620/AnsiballZ_file.py
Nov 23 09:38:01 np0005532585.localdomain sudo[256841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:02 np0005532585.localdomain python3.9[256843]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:02 np0005532585.localdomain sudo[256841]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:02 np0005532585.localdomain sudo[256951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkviwivbsrmrqcnyxmzasvavakkyyaty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890682.3157687-129-27605867735360/AnsiballZ_file.py
Nov 23 09:38:02 np0005532585.localdomain sudo[256951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:38:02 np0005532585.localdomain podman[256954]: 2025-11-23 09:38:02.761307937 +0000 UTC m=+0.101859806 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:38:02 np0005532585.localdomain podman[256954]: 2025-11-23 09:38:02.796681492 +0000 UTC m=+0.137233331 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Nov 23 09:38:02 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:38:02 np0005532585.localdomain python3.9[256953]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:02 np0005532585.localdomain sudo[256951]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:03 np0005532585.localdomain sudo[257080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rknkyadzmwbtaeweqtordmvwwffjhbqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890682.9689102-129-229188741698927/AnsiballZ_file.py
Nov 23 09:38:03 np0005532585.localdomain sudo[257080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:03 np0005532585.localdomain python3.9[257082]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:03 np0005532585.localdomain sudo[257080]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:03 np0005532585.localdomain sudo[257190]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzchauovgamqxcozdkgymyrreoawgjuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890683.6176817-129-150221134646407/AnsiballZ_file.py
Nov 23 09:38:03 np0005532585.localdomain sudo[257190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:04 np0005532585.localdomain python3.9[257192]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:04 np0005532585.localdomain sudo[257190]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:04 np0005532585.localdomain sudo[257300]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqoknylvvlyxtawtivkjwbkuawiwmfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890684.2177272-129-189241600577829/AnsiballZ_file.py
Nov 23 09:38:04 np0005532585.localdomain sudo[257300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:04 np0005532585.localdomain python3.9[257302]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:05 np0005532585.localdomain sudo[257300]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:05 np0005532585.localdomain sudo[257410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiobjtzuyphchkrokzbjhmkvyrjngdpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890685.1264093-129-164336800924338/AnsiballZ_file.py
Nov 23 09:38:05 np0005532585.localdomain sudo[257410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:05 np0005532585.localdomain python3.9[257412]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:05 np0005532585.localdomain sudo[257410]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.106 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.107 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.108 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.108 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.108 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.110 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:06 np0005532585.localdomain sudo[257520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqlfzlxuwyrpbkehjbszitatolwcwtie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890685.7802536-279-57963865623333/AnsiballZ_stat.py
Nov 23 09:38:06 np0005532585.localdomain sudo[257520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:06 np0005532585.localdomain python3.9[257522]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:06 np0005532585.localdomain sudo[257520]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:06.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:06 np0005532585.localdomain sudo[257608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-givszzimtfrkpzxoahzkqqplyufchuhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890685.7802536-279-57963865623333/AnsiballZ_copy.py
Nov 23 09:38:06 np0005532585.localdomain sudo[257608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:07 np0005532585.localdomain python3.9[257610]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890685.7802536-279-57963865623333/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:07 np0005532585.localdomain sudo[257608]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:07.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:07.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:38:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:07.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:38:07 np0005532585.localdomain python3.9[257718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:38:08 np0005532585.localdomain systemd[1]: tmp-crun.LHw5zX.mount: Deactivated successfully.
Nov 23 09:38:08 np0005532585.localdomain podman[257752]: 2025-11-23 09:38:08.027826054 +0000 UTC m=+0.086232632 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:38:08 np0005532585.localdomain podman[257752]: 2025-11-23 09:38:08.036846273 +0000 UTC m=+0.095252811 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:38:08 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:38:08 np0005532585.localdomain python3.9[257827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890687.3144207-324-47582242386643/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:08.771 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:38:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:08.772 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:38:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:08.772 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:38:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:08.772 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:38:08 np0005532585.localdomain python3.9[257935]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:38:09.249 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:38:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:38:09.250 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:38:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:38:09.251 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:38:09 np0005532585.localdomain python3.9[258021]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890688.414682-324-216901770846207/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:09 np0005532585.localdomain python3.9[258129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:10 np0005532585.localdomain sudo[258203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:38:10 np0005532585.localdomain sudo[258203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:38:10 np0005532585.localdomain sudo[258203]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:10 np0005532585.localdomain sudo[258234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:38:10 np0005532585.localdomain sudo[258234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:38:10 np0005532585.localdomain python3.9[258229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890689.4741945-324-97270141632002/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=819ea45525e58d8cab582899f1693a034ecc8e6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.623 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.638 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.639 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.640 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.640 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.640 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.658 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.659 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.659 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.660 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:38:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:10.660 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.145 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.147 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:11 np0005532585.localdomain podman[258343]: 2025-11-23 09:38:11.159408909 +0000 UTC m=+0.129154341 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, build-date=2025-09-24T08:57:55, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.216 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.217 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:38:11 np0005532585.localdomain podman[258343]: 2025-11-23 09:38:11.26825875 +0000 UTC m=+0.238004162 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, release=553, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55)
Nov 23 09:38:11 np0005532585.localdomain rsyslogd[760]: imjournal: 6901 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.430 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.432 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12194MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.433 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.433 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.490 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.490 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.491 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:38:11 np0005532585.localdomain sudo[258234]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:11.531 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:38:11 np0005532585.localdomain sudo[258431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:38:11 np0005532585.localdomain sudo[258431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:38:11 np0005532585.localdomain sudo[258431]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:11 np0005532585.localdomain sudo[258466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:38:11 np0005532585.localdomain sudo[258466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:38:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:38:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:38:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:38:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1"
Nov 23 09:38:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:38:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16781 "" "Go-http-client/1.1"
Nov 23 09:38:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:12.025 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:38:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:12.035 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:38:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:12.046 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:38:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:12.048 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:38:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:12.048 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:38:12 np0005532585.localdomain sudo[258466]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:12 np0005532585.localdomain python3.9[258611]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:12 np0005532585.localdomain sudo[258612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:38:12 np0005532585.localdomain sudo[258612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:38:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:38:12 np0005532585.localdomain sudo[258612]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:12 np0005532585.localdomain podman[258630]: 2025-11-23 09:38:12.941814962 +0000 UTC m=+0.068271016 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 09:38:12 np0005532585.localdomain podman[258630]: 2025-11-23 09:38:12.97472523 +0000 UTC m=+0.101181244 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:38:12 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:38:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:13.125 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:13.126 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:13.157 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:13.157 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:13 np0005532585.localdomain python3.9[258740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890692.355187-498-158147993971947/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=1067e04911e84d9dc262158a63dd8e464b0e5dfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:13.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:38:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:13.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:38:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:38:14 np0005532585.localdomain podman[258796]: 2025-11-23 09:38:14.017990322 +0000 UTC m=+0.075523611 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:38:14 np0005532585.localdomain podman[258796]: 2025-11-23 09:38:14.035295377 +0000 UTC m=+0.092828626 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41)
Nov 23 09:38:14 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:38:14 np0005532585.localdomain python3.9[258868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:14 np0005532585.localdomain python3.9[258954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890693.5041487-543-89770355955367/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:15 np0005532585.localdomain python3.9[259062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:15 np0005532585.localdomain python3.9[259148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890695.0738232-543-211759360281348/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:16.150 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:16.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:16.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:16.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:16.202 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:16.203 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49020 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D773DB0000000001030307) 
Nov 23 09:38:16 np0005532585.localdomain python3.9[259256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:17 np0005532585.localdomain python3.9[259311]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:38:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49021 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D777E00000000001030307) 
Nov 23 09:38:17 np0005532585.localdomain podman[259312]: 2025-11-23 09:38:17.350078747 +0000 UTC m=+0.076962574 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:38:17 np0005532585.localdomain podman[259312]: 2025-11-23 09:38:17.360229052 +0000 UTC m=+0.087112949 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:38:17 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:38:17 np0005532585.localdomain python3.9[259437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21261 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D77A200000000001030307) 
Nov 23 09:38:18 np0005532585.localdomain python3.9[259523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890697.382037-630-187159174412963/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49022 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D77FE00000000001030307) 
Nov 23 09:38:19 np0005532585.localdomain python3.9[259631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:38:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:38:20 np0005532585.localdomain sudo[259749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqqghmrfcytdchrhgkjlukzbgldscadp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890699.7334473-735-271832623958029/AnsiballZ_file.py
Nov 23 09:38:20 np0005532585.localdomain sudo[259749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:20 np0005532585.localdomain podman[259724]: 2025-11-23 09:38:20.021839293 +0000 UTC m=+0.073577769 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:38:20 np0005532585.localdomain podman[259724]: 2025-11-23 09:38:20.05853604 +0000 UTC m=+0.110274496 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:38:20 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:38:20 np0005532585.localdomain python3.9[259754]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:20 np0005532585.localdomain sudo[259749]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42265 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D784200000000001030307) 
Nov 23 09:38:20 np0005532585.localdomain sudo[259874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvfbovddzjjgamwwvriwrryeczpaxbar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890700.3900716-759-99574094330031/AnsiballZ_stat.py
Nov 23 09:38:20 np0005532585.localdomain sudo[259874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:20 np0005532585.localdomain python3.9[259876]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:20 np0005532585.localdomain sudo[259874]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:21 np0005532585.localdomain sudo[259931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efaacmyaiejdudwnkdzzdzjrsltmmcgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890700.3900716-759-99574094330031/AnsiballZ_file.py
Nov 23 09:38:21 np0005532585.localdomain sudo[259931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:38:21 np0005532585.localdomain podman[259934]: 2025-11-23 09:38:21.178335301 +0000 UTC m=+0.078701279 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:38:21 np0005532585.localdomain podman[259934]: 2025-11-23 09:38:21.192220661 +0000 UTC m=+0.092586619 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 09:38:21 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.244 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.246 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.246 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.247 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.248 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.249 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:21.252 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:21 np0005532585.localdomain python3.9[259933]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:21 np0005532585.localdomain sudo[259931]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:21 np0005532585.localdomain sudo[260061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcftzdsyucuhcubjxofzpncostycuewp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890701.4188304-759-204156238817542/AnsiballZ_stat.py
Nov 23 09:38:21 np0005532585.localdomain sudo[260061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:21 np0005532585.localdomain python3.9[260063]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:22 np0005532585.localdomain sudo[260061]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:22 np0005532585.localdomain sudo[260118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxagoksgsrrhwvqeyjcfiwwusqfbgkkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890701.4188304-759-204156238817542/AnsiballZ_file.py
Nov 23 09:38:22 np0005532585.localdomain sudo[260118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:22 np0005532585.localdomain python3.9[260120]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:22 np0005532585.localdomain sudo[260118]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:23 np0005532585.localdomain sudo[260228]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlqxhccsbxnukcmhmtsfrlhcdvihthoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890702.7637537-828-274193106783125/AnsiballZ_file.py
Nov 23 09:38:23 np0005532585.localdomain sudo[260228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:23 np0005532585.localdomain python3.9[260230]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:23 np0005532585.localdomain sudo[260228]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49023 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D78FA00000000001030307) 
Nov 23 09:38:23 np0005532585.localdomain sudo[260338]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmnpzpggwjjsiziiazgzekizlvzqsudn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890703.4329808-852-129695780959019/AnsiballZ_stat.py
Nov 23 09:38:23 np0005532585.localdomain sudo[260338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:23 np0005532585.localdomain python3.9[260340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:23 np0005532585.localdomain sudo[260338]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:24 np0005532585.localdomain sudo[260395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyjqsgdvthjahfltptetwdmffrmwtiuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890703.4329808-852-129695780959019/AnsiballZ_file.py
Nov 23 09:38:24 np0005532585.localdomain sudo[260395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:24 np0005532585.localdomain python3.9[260397]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:24 np0005532585.localdomain sudo[260395]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:25 np0005532585.localdomain sudo[260505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiuvkevjgiwqpbsohmajcgwjpyyatexj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890705.4984357-888-158496496805866/AnsiballZ_stat.py
Nov 23 09:38:25 np0005532585.localdomain sudo[260505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:25 np0005532585.localdomain python3.9[260507]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:25 np0005532585.localdomain sudo[260505]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:26 np0005532585.localdomain sudo[260562]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfhhrnvwutjryeotzttxxyijrljruejx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890705.4984357-888-158496496805866/AnsiballZ_file.py
Nov 23 09:38:26 np0005532585.localdomain sudo[260562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.278 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.281 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.282 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:26.284 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:26 np0005532585.localdomain python3.9[260564]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:26 np0005532585.localdomain sudo[260562]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:26 np0005532585.localdomain sudo[260672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdfrwgqlkgujwtyjkcozashrmmbeuunp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890706.5816932-924-133464828392740/AnsiballZ_systemd.py
Nov 23 09:38:26 np0005532585.localdomain sudo[260672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:27 np0005532585.localdomain python3.9[260674]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:38:27 np0005532585.localdomain systemd-rc-local-generator[260701]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:38:27 np0005532585.localdomain systemd-sysv-generator[260705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:27 np0005532585.localdomain sudo[260672]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:28 np0005532585.localdomain sudo[260820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlslbcdzpwwrumuxnzqgfyuzsfpnlasd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890707.7926042-948-54100069934964/AnsiballZ_stat.py
Nov 23 09:38:28 np0005532585.localdomain sudo[260820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:28 np0005532585.localdomain python3.9[260822]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:28 np0005532585.localdomain sudo[260820]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:28 np0005532585.localdomain sudo[260877]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trhztilqicypvkcdsjwelmxqqcyxunla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890707.7926042-948-54100069934964/AnsiballZ_file.py
Nov 23 09:38:28 np0005532585.localdomain sudo[260877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:28 np0005532585.localdomain python3.9[260879]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:28 np0005532585.localdomain sudo[260877]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:29 np0005532585.localdomain sudo[260987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywhogyuwlctszockuyaxuxgbfisozfuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890708.8859012-984-174777366787293/AnsiballZ_stat.py
Nov 23 09:38:29 np0005532585.localdomain sudo[260987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:29 np0005532585.localdomain python3.9[260989]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:29 np0005532585.localdomain sudo[260987]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:29 np0005532585.localdomain sudo[261044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agopqkpfyujuxtprtyyiscwzxlieutjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890708.8859012-984-174777366787293/AnsiballZ_file.py
Nov 23 09:38:29 np0005532585.localdomain sudo[261044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:29 np0005532585.localdomain python3.9[261046]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:29 np0005532585.localdomain sudo[261044]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:38:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:38:30 np0005532585.localdomain sudo[261154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtgvacrrzozciudxaxmrxadwjfwizpeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890710.0031805-1020-139752698144560/AnsiballZ_systemd.py
Nov 23 09:38:30 np0005532585.localdomain sudo[261154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:30 np0005532585.localdomain python3.9[261156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:38:30 np0005532585.localdomain systemd-rc-local-generator[261183]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:38:30 np0005532585.localdomain systemd-sysv-generator[261186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:30 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:31 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:38:31 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:38:31 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:38:31 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:38:31 np0005532585.localdomain sudo[261154]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:31.313 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:31.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:31.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:31.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:31.316 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:31.319 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:31 np0005532585.localdomain sudo[261306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfdkhiwguututvmrolihchvgoitabstm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890711.4384322-1050-245774058680195/AnsiballZ_file.py
Nov 23 09:38:31 np0005532585.localdomain sudo[261306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49024 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7B0200000000001030307) 
Nov 23 09:38:31 np0005532585.localdomain python3.9[261308]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:38:31 np0005532585.localdomain sudo[261306]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:32 np0005532585.localdomain sudo[261416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqtyqbwkuygajasnwlpjwgdtwglpyhtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890712.177706-1074-151086875573007/AnsiballZ_stat.py
Nov 23 09:38:32 np0005532585.localdomain sudo[261416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:32 np0005532585.localdomain python3.9[261418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:38:32 np0005532585.localdomain sudo[261416]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:38:33 np0005532585.localdomain podman[261473]: 2025-11-23 09:38:33.045963225 +0000 UTC m=+0.093949940 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:38:33 np0005532585.localdomain sudo[261516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kthkpmkusmnurssopjiieuvpxuwvrwti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890712.177706-1074-151086875573007/AnsiballZ_copy.py
Nov 23 09:38:33 np0005532585.localdomain sudo[261516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:33 np0005532585.localdomain podman[261473]: 2025-11-23 09:38:33.060294909 +0000 UTC m=+0.108281664 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:38:33 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:38:33 np0005532585.localdomain python3.9[261525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890712.177706-1074-151086875573007/.source.json _original_basename=.gi1o4vau follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:33 np0005532585.localdomain sudo[261516]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:33 np0005532585.localdomain sudo[261633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkhjkgkgvpsxpvqqlxjacakxphuxbpzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890713.4576163-1119-115646796939980/AnsiballZ_file.py
Nov 23 09:38:33 np0005532585.localdomain sudo[261633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:33 np0005532585.localdomain python3.9[261635]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:33 np0005532585.localdomain sudo[261633]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:34 np0005532585.localdomain sudo[261743]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixcrlwmnvcysoaeibmrhdkzffvratmvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890714.2324715-1143-35312421423136/AnsiballZ_stat.py
Nov 23 09:38:34 np0005532585.localdomain sudo[261743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:34 np0005532585.localdomain sudo[261743]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:35 np0005532585.localdomain sudo[261831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uoxwtnimtmwsoumcrouloxlrfsmsbmxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890714.2324715-1143-35312421423136/AnsiballZ_copy.py
Nov 23 09:38:35 np0005532585.localdomain sudo[261831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:35 np0005532585.localdomain sudo[261831]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:36 np0005532585.localdomain sudo[261941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbrlshielgpugfrxguazwhxkxxldxtrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890715.6371884-1194-121709307626163/AnsiballZ_container_config_data.py
Nov 23 09:38:36 np0005532585.localdomain sudo[261941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:36 np0005532585.localdomain python3.9[261943]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Nov 23 09:38:36 np0005532585.localdomain sudo[261941]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:36.352 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:36.353 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:36.354 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:36.354 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:36.355 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:36.357 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:36 np0005532585.localdomain sudo[262051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oibltlhrsneabktgxlvjsflmkjywfsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890716.4974763-1221-52327942733980/AnsiballZ_container_config_hash.py
Nov 23 09:38:36 np0005532585.localdomain sudo[262051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:37 np0005532585.localdomain python3.9[262053]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:38:37 np0005532585.localdomain sudo[262051]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:37 np0005532585.localdomain sudo[262161]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykxtlsnvpcallyejtyejwexrpvbmubkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890717.4131536-1248-245610277534517/AnsiballZ_podman_container_info.py
Nov 23 09:38:37 np0005532585.localdomain sudo[262161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:38 np0005532585.localdomain python3.9[262163]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 09:38:38 np0005532585.localdomain sudo[262161]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:38:39 np0005532585.localdomain podman[262209]: 2025-11-23 09:38:39.032543213 +0000 UTC m=+0.081802915 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:38:39 np0005532585.localdomain podman[262209]: 2025-11-23 09:38:39.070396525 +0000 UTC m=+0.119656287 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:38:39 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:38:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:41.358 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:41.359 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:41.359 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:41.359 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:41.382 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:41.383 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:38:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:38:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:38:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1"
Nov 23 09:38:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:38:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16777 "" "Go-http-client/1.1"
Nov 23 09:38:42 np0005532585.localdomain sudo[262322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdjbqfpcoeifdakjoxpiaerblzdoekmp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890722.2999117-1287-227116797586968/AnsiballZ_edpm_container_manage.py
Nov 23 09:38:42 np0005532585.localdomain sudo[262322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:43 np0005532585.localdomain python3[262324]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:38:43 np0005532585.localdomain podman[262361]: 
Nov 23 09:38:43 np0005532585.localdomain podman[262361]: 2025-11-23 09:38:43.409647623 +0000 UTC m=+0.075481178 container create a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, tcib_managed=true)
Nov 23 09:38:43 np0005532585.localdomain podman[262361]: 2025-11-23 09:38:43.37822843 +0000 UTC m=+0.044062005 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:38:43 np0005532585.localdomain python3[262324]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume 
/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:38:43 np0005532585.localdomain sudo[262322]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:38:44 np0005532585.localdomain podman[262474]: 2025-11-23 09:38:44.041167291 +0000 UTC m=+0.086094467 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:38:44 np0005532585.localdomain sudo[262519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erukjqtlnlevpzgtxcjdlshuzjawyoyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890723.7780457-1311-260083450214937/AnsiballZ_stat.py
Nov 23 09:38:44 np0005532585.localdomain sudo[262519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:38:44 np0005532585.localdomain podman[262474]: 2025-11-23 09:38:44.089303162 +0000 UTC m=+0.134230368 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:38:44 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:38:44 np0005532585.localdomain podman[262534]: 2025-11-23 09:38:44.164441529 +0000 UTC m=+0.078216783 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:38:44 np0005532585.localdomain podman[262534]: 2025-11-23 09:38:44.177231786 +0000 UTC m=+0.091007020 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, maintainer=Red Hat, Inc., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:38:44 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:38:44 np0005532585.localdomain python3.9[262533]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:38:44 np0005532585.localdomain sudo[262519]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:44 np0005532585.localdomain sudo[262662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znegirtbxyjkydnbutfktezdferhoyaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890724.583628-1338-201697176261499/AnsiballZ_file.py
Nov 23 09:38:44 np0005532585.localdomain sudo[262662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:45 np0005532585.localdomain python3.9[262664]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:45 np0005532585.localdomain sudo[262662]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:45 np0005532585.localdomain sudo[262717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmglnwmzmlcmykfoslsftctqmkovupsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890724.583628-1338-201697176261499/AnsiballZ_stat.py
Nov 23 09:38:45 np0005532585.localdomain sudo[262717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:45 np0005532585.localdomain python3.9[262719]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:38:45 np0005532585.localdomain sudo[262717]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:45 np0005532585.localdomain sudo[262826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhhtracdrktrhzmuvmbjiapswiiqtduz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890725.5548944-1338-227240907472375/AnsiballZ_copy.py
Nov 23 09:38:45 np0005532585.localdomain sudo[262826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:46 np0005532585.localdomain python3.9[262828]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890725.5548944-1338-227240907472375/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:38:46 np0005532585.localdomain sudo[262826]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12128 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7E90B0000000001030307) 
Nov 23 09:38:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:46.382 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:46 np0005532585.localdomain sudo[262881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lckpntebxlteqjkhzzkmdgrrgtujtqpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890725.5548944-1338-227240907472375/AnsiballZ_systemd.py
Nov 23 09:38:46 np0005532585.localdomain sudo[262881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:46 np0005532585.localdomain python3.9[262883]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:38:46 np0005532585.localdomain systemd-rc-local-generator[262904]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:38:46 np0005532585.localdomain systemd-sysv-generator[262908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:46 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:47 np0005532585.localdomain sudo[262881]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12129 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7ED200000000001030307) 
Nov 23 09:38:47 np0005532585.localdomain sudo[262971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uryqrpedavyirfafyikytbawoeyzdqqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890725.5548944-1338-227240907472375/AnsiballZ_systemd.py
Nov 23 09:38:47 np0005532585.localdomain sudo[262971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:38:47 np0005532585.localdomain podman[262974]: 2025-11-23 09:38:47.582100636 +0000 UTC m=+0.086284094 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:38:47 np0005532585.localdomain podman[262974]: 2025-11-23 09:38:47.615196741 +0000 UTC m=+0.119380159 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 23 09:38:47 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:38:47 np0005532585.localdomain python3.9[262973]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:38:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49025 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7F0210000000001030307) 
Nov 23 09:38:48 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:38:48 np0005532585.localdomain systemd-rc-local-generator[263018]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:38:48 np0005532585.localdomain systemd-sysv-generator[263023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:38:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 23 09:38:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:38:49 np0005532585.localdomain podman[263032]: 2025-11-23 09:38:49.319439291 +0000 UTC m=+0.107857911 container init a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:38:49 np0005532585.localdomain podman[263032]: 2025-11-23 09:38:49.330440982 +0000 UTC m=+0.118859602 container start a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 09:38:49 np0005532585.localdomain podman[263032]: neutron_dhcp_agent
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + sudo -E kolla_set_configs
Nov 23 09:38:49 np0005532585.localdomain systemd[1]: Started neutron_dhcp_agent container.
Nov 23 09:38:49 np0005532585.localdomain sudo[262971]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12130 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7F5200000000001030307) 
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Validating config file
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Copying service configuration files
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Writing out command to execute
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: ++ cat /run_command
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + CMD=/usr/bin/neutron-dhcp-agent
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + ARGS=
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + sudo kolla_copy_cacerts
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + [[ ! -n '' ]]
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + . kolla_extend_start
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: Running command: '/usr/bin/neutron-dhcp-agent'
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + umask 0022
Nov 23 09:38:49 np0005532585.localdomain neutron_dhcp_agent[263045]: + exec /usr/bin/neutron-dhcp-agent
Nov 23 09:38:49 np0005532585.localdomain sudo[263166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkajbqisxcyuxtecxuhtclsuoufnuzvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890729.572137-1422-79243832291467/AnsiballZ_systemd.py
Nov 23 09:38:49 np0005532585.localdomain sudo[263166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:38:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21262 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7F8200000000001030307) 
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:38:50 np0005532585.localdomain python3.9[263168]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Nov 23 09:38:50 np0005532585.localdomain podman[263170]: 2025-11-23 09:38:50.306737289 +0000 UTC m=+0.102947180 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:38:50 np0005532585.localdomain podman[263170]: 2025-11-23 09:38:50.318625937 +0000 UTC m=+0.114835878 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: libpod-a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8.scope: Deactivated successfully.
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: libpod-a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8.scope: Consumed 1.009s CPU time.
Nov 23 09:38:50 np0005532585.localdomain podman[263184]: 2025-11-23 09:38:50.352118584 +0000 UTC m=+0.076864652 container died a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:38:50 np0005532585.localdomain podman[263184]: 2025-11-23 09:38:50.453924737 +0000 UTC m=+0.178670805 container cleanup a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:38:50 np0005532585.localdomain podman[263184]: neutron_dhcp_agent
Nov 23 09:38:50 np0005532585.localdomain podman[263237]: error opening file `/run/crun/a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8/status`: No such file or directory
Nov 23 09:38:50 np0005532585.localdomain podman[263226]: 2025-11-23 09:38:50.558079642 +0000 UTC m=+0.069925346 container cleanup a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3)
Nov 23 09:38:50 np0005532585.localdomain podman[263226]: neutron_dhcp_agent
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:38:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 23 09:38:50 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:38:50 np0005532585.localdomain podman[263239]: 2025-11-23 09:38:50.697219212 +0000 UTC m=+0.107761648 container init a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:38:50 np0005532585.localdomain podman[263239]: 2025-11-23 09:38:50.706383205 +0000 UTC m=+0.116925641 container start a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=neutron_dhcp_agent)
Nov 23 09:38:50 np0005532585.localdomain podman[263239]: neutron_dhcp_agent
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + sudo -E kolla_set_configs
Nov 23 09:38:50 np0005532585.localdomain systemd[1]: Started neutron_dhcp_agent container.
Nov 23 09:38:50 np0005532585.localdomain sudo[263166]: pam_unix(sudo:session): session closed for user root
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Validating config file
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Copying service configuration files
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Writing out command to execute
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: ++ cat /run_command
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + CMD=/usr/bin/neutron-dhcp-agent
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + ARGS=
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + sudo kolla_copy_cacerts
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + [[ ! -n '' ]]
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + . kolla_extend_start
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: Running command: '/usr/bin/neutron-dhcp-agent'
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + umask 0022
Nov 23 09:38:50 np0005532585.localdomain neutron_dhcp_agent[263254]: + exec /usr/bin/neutron-dhcp-agent
Nov 23 09:38:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:38:51 np0005532585.localdomain podman[263286]: 2025-11-23 09:38:51.373436654 +0000 UTC m=+0.080644308 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:38:51 np0005532585.localdomain podman[263286]: 2025-11-23 09:38:51.384008162 +0000 UTC m=+0.091215796 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 23 09:38:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:51.384 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:51.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:38:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:51.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:38:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:51.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:51 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:38:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:51.412 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:51.412 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:38:51 np0005532585.localdomain sshd[256220]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:38:51 np0005532585.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Nov 23 09:38:51 np0005532585.localdomain systemd[1]: session-58.scope: Consumed 33.945s CPU time.
Nov 23 09:38:51 np0005532585.localdomain systemd-logind[761]: Session 58 logged out. Waiting for processes to exit.
Nov 23 09:38:51 np0005532585.localdomain systemd-logind[761]: Removed session 58.
Nov 23 09:38:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.020 263258 INFO neutron.common.config [-] Logging enabled!
Nov 23 09:38:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.020 263258 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Nov 23 09:38:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.407 263258 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Nov 23 09:38:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.843 263258 INFO neutron.agent.dhcp.agent [None req-bae6fc77-6091-481a-9d57-864431681128 - - - - - -] All active networks have been fetched through RPC.
Nov 23 09:38:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.844 263258 INFO neutron.agent.dhcp.agent [None req-bae6fc77-6091-481a-9d57-864431681128 - - - - - -] Synchronizing state complete
Nov 23 09:38:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.884 263258 INFO neutron.agent.dhcp.agent [None req-bae6fc77-6091-481a-9d57-864431681128 - - - - - -] DHCP agent started
Nov 23 09:38:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12131 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D804E10000000001030307) 
Nov 23 09:38:53 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:53.661 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:38:53.661 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:38:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:38:53.663 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:38:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:38:53.663 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:38:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:38:56.449 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:38:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:38:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:38:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:39:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:38:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:39:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:39:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:39:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.486 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.488 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.488 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.488 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.490 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.490 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:01.493 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12132 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D826200000000001030307) 
Nov 23 09:39:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:39:04 np0005532585.localdomain podman[263305]: 2025-11-23 09:39:04.04701546 +0000 UTC m=+0.103871668 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:39:04 np0005532585.localdomain podman[263305]: 2025-11-23 09:39:04.061260722 +0000 UTC m=+0.118116940 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 09:39:04 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.525 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.527 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.527 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.527 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.528 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.529 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:06.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:07.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:07.739 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:39:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:07.739 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:39:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:07.740 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:39:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:07.740 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:39:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:07.740 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.167 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.229 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.229 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.451 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.453 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12131MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.453 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.454 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.545 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.545 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.545 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:39:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:08.586 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:39:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:09.007 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:39:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:09.013 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:39:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:09.028 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:39:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:09.030 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:39:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:09.030 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:39:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:39:09.250 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:39:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:39:09.251 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:39:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:39:09.252 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:39:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:39:10 np0005532585.localdomain systemd[1]: tmp-crun.XaNqJA.mount: Deactivated successfully.
Nov 23 09:39:10 np0005532585.localdomain podman[263368]: 2025-11-23 09:39:10.026578589 +0000 UTC m=+0.083442325 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:39:10 np0005532585.localdomain podman[263368]: 2025-11-23 09:39:10.039341374 +0000 UTC m=+0.096205100 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:39:10 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.802 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.817 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0878f08-4222-4816-b4bb-d9d20b6659ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.803666', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c17d04-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': '4962218054170a022a62943110517082a2f7b853f7c80a93b7cf61e4424540c4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.803666', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c1929e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': 'beb77bb0339223dca0baf979821f49b8ecd4aa9f1426218bace13103bc750937'}]}, 'timestamp': '2025-11-23 09:39:10.818596', '_unique_id': '64253285b2a74dadb1a8568f709155a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.821 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.821 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.822 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44b7c9b4-0261-404c-a1a9-08237a0e1d11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.821547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c217be-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': '5a7eb301175b68256ef30bee7de54e2bf37266f2db4d81662f201e3af481663d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.821547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c22998-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': '6a347af00373fdb936e7a020fbd5ab113ad4bb6a10d8d68811f4b92137563332'}]}, 'timestamp': '2025-11-23 09:39:10.822474', '_unique_id': 'de4a7bf0be17439e93f314477b64aadc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.828 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a0c0eb5-8e35-41e3-b9ba-230c81e1ada0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.824778', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43c31f42-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'a62cba49d08af3e1c673e6eeebe18f4af8afaf1254b7e580e7edbbccf7bd170a'}]}, 'timestamp': '2025-11-23 09:39:10.828796', '_unique_id': 'a943ba3ff28e4080919379e8b8ee1ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.830 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.831 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.831 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df2a77d4-12a9-416d-9abf-02f02436d7ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.831235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c39292-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': 'a7a5cf900839f6c62cf8c9d6205921c2a7d22b26ece3afe929415e495a6bb01a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.831235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c3a6f6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': 'c9dd2ebc7da0e4b18a6d301910ee9eefe4aa0dd45d762ebdda5853bcf2c91a79'}]}, 'timestamp': '2025-11-23 09:39:10.832260', '_unique_id': '79cd07a1a12b4ff2a19ec30bfabf987f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.834 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.834 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3848c7b-5e0d-4f50-a6d2-a285949e06e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.834608', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43c41668-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': '5ade14051555de8f046039e6964670c4da7871decfca98c03014e24c4a3ec73b'}]}, 'timestamp': '2025-11-23 09:39:10.835146', '_unique_id': 'e4d5f01a08374dffb386e64699eef793'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f50a8d52-c7ff-42bf-b829-6d3d4e8032f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.837383', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c9ed36-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'cc49b6461dd594ca63eb64244a5f3c389654e0915378482ba838282e41f23006'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.837383', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c9feb6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '89ef3c60056aa6d83c39b1c95a2d80ac1df5cbda5e676834aa7c433c18116742'}]}, 'timestamp': '2025-11-23 09:39:10.873775', '_unique_id': 'a1e9eaaf356d4b95bd1dc2dc72130b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e1ceecf-8180-4b83-9c32-9a1910f9754b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.876154', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43ca6ce8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': '36600c616d87bc8f9baa24efa79d5ef684355d8f3e1759aeefd8d57be5dc8563'}]}, 'timestamp': '2025-11-23 09:39:10.876623', '_unique_id': 'd9c9c26ece3d4438bb9036057ee17c84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.878 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e0d3a44-dcdd-43be-ad8c-a245153fb8ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.878754', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43cad318-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'c493fa6a6332b55a95945b77c6a1c8635b687485499ba974a55e3b41e1b4f18d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.878754', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43cae330-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'a02546345b2c3e65dd8592c55eba3a6159a1e826b082139eb36fbff3451082cc'}]}, 'timestamp': '2025-11-23 09:39:10.879617', '_unique_id': '96062e6e82a34797b56eaedae9574a26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d70cb48-1605-4123-827c-be36a75b3707', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.881783', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cb49ba-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'd317eea9db6a0ad2663087a992bfc8faf3c945e227dd186b7f80be398bf3f1b7'}]}, 'timestamp': '2025-11-23 09:39:10.882271', '_unique_id': 'de235eb9297646809b1a00a2b6873b68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.884 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 9228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04f126ba-a43a-4f25-8bda-edfd74973a53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9228, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.884432', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cbafd6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'dc9db59f33dc5a54907f6ece44f2890accbbb3fd0dcdac97abb5d1b11518f355'}]}, 'timestamp': '2025-11-23 09:39:10.884887', '_unique_id': '4d8ffd54efc745c187c27ea00aac1fed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.887 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26aedab3-7212-49d5-a9a3-3337cf36e89c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.887144', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cc1a0c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'b8fba024a510342b770d8f97eda3a87a07f7b6672b5ef99601b379116419d83b'}]}, 'timestamp': '2025-11-23 09:39:10.887607', '_unique_id': 'cb185879020943ea82d8d17edf2f65e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd6a0b5f-6deb-4409-93f5-b62fcdd40a85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.889954', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cc87f8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': '94d7c571098ef1c9620017a72e2cfd3419aa99fb78ad2750a0b74bfe40659fbd'}]}, 'timestamp': '2025-11-23 09:39:10.890418', '_unique_id': 'ccdecf10cd164a09b5987ffee4b06872'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f15e928-1c31-4a4b-b18d-29c18e50feba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.892682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43ccf2f6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'b772589ab329dbdedbe46eddb49221cc322e3e392f6f4e304f67a64bf2bb2d36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.892682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43cd0340-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '9056fcb66f5d3b63bb9f46dd0995de456392bf0563ffb6d9f9b8769432560f1b'}]}, 'timestamp': '2025-11-23 09:39:10.893545', '_unique_id': '7ff9512b81ff45d08af09d4f104b141d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 54830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50b4e12c-df95-4d91-bbb6-693fdecb636c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54830000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:39:10.895671', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '43d072a0-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.09307574, 'message_signature': '6dadfd81686c8c62c2a3ff1befe2434bb6b1459a8039604a2b8974cc5e8ee926'}]}, 'timestamp': '2025-11-23 09:39:10.916179', '_unique_id': 'd7ae50739e9b450eb80b49408ff56bf7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1db215b-eedf-4eeb-94fb-adca9d0407f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.918369', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43d0dd8a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'f204cf10c2fc131203f6a77e663037374c1609cd3a7255031a40616b5f6a3dec'}]}, 'timestamp': '2025-11-23 09:39:10.918822', '_unique_id': '30b5b067247a462a8e00e51cb23e610b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ae677ab-50c9-4a8f-aae2-841c24afc1c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:39:10.921122', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '43d14c84-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.09307574, 'message_signature': 'bbab5a85930dc69ba38606c852ce3d9364f5f2ea78f6d24ea3d672aeebe56b14'}]}, 'timestamp': '2025-11-23 09:39:10.921690', '_unique_id': '28aa91434eb4440a91b29040f433eeb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1adf894-1b5a-4267-a7a3-d368c6170681', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.924071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43d1bf98-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'a5b3f1f476529e45cd836d8a07ba54fb2f4a6a2898fa3628da6bb4d6233f892d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.924071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43d1d582-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '16798d94994d709bd80ce2cb222070dfc215dfdb286d127c164860cf5129de6e'}]}, 'timestamp': '2025-11-23 09:39:10.925164', '_unique_id': '288100af3478401eaf827c4f1c2161cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14e0beb7-c443-4f99-8d0c-f186ab448fbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.927599', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43d24c88-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'dff636da534bfd06993639e6a348b83f61eacf0c25bdda03189af805804c60d9'}]}, 'timestamp': '2025-11-23 09:39:10.928325', '_unique_id': '1dd77085c3fe4459b3d8d1b71412d8e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8b8765c-7437-4a30-a4ee-21550d83526a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.931119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43d2d2ca-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '68a39d5b98271526dbaaad47a511a4de496c028b5c3e70da19076d697785c3a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.931119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43d2ea62-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'a6de9e5bfcabdfb5976d3fce9dbf4a64cb181cfffff8a529926a26a26c7d0a78'}]}, 'timestamp': '2025-11-23 09:39:10.932245', '_unique_id': '0805cc5e36b7403382226af8150f6426'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a0ad76c-b9c7-4e13-9a62-302295db2db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.934535', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43d35880-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'c99ae2a34b6194d52a1d5c67793e41b90e9a710d13dd85a6b3a83a27772f6d2b'}]}, 'timestamp': '2025-11-23 09:39:10.935157', '_unique_id': '55e577e029d343699be8a1e030358bc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.938 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07bb8dcd-d345-482c-bef4-ec63977ad005', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.937483', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43d3cb62-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'fb139db06d6eb05f9a6571c2afbf7a6c8d6d6c272bb21c04a3749b7227de3967'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.937483', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43d3e03e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'd54e0dac08c50ea554494ba5aea9891ab83def2a88301078b0bee464a0e2012f'}]}, 'timestamp': '2025-11-23 09:39:10.938547', '_unique_id': 'd1d52ea509d345aab6dfdda56d7730b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:39:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.031 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.032 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.032 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.032 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.529 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.532 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.532 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.533 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.565 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.566 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.709 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.710 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.710 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:39:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:11.711 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:39:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:39:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:39:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:39:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:39:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:39:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17223 "" "Go-http-client/1.1"
Nov 23 09:39:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:12.788 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:39:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:12.809 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:39:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:12.809 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:39:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:12.810 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:12.810 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:12 np0005532585.localdomain sshd[263400]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:39:13 np0005532585.localdomain sudo[263392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:39:13 np0005532585.localdomain sudo[263392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:39:13 np0005532585.localdomain sudo[263392]: pam_unix(sudo:session): session closed for user root
Nov 23 09:39:13 np0005532585.localdomain sudo[263411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:39:13 np0005532585.localdomain sudo[263411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:39:13 np0005532585.localdomain sudo[263411]: pam_unix(sudo:session): session closed for user root
Nov 23 09:39:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:13.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:13.718 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:14 np0005532585.localdomain sudo[263461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:39:14 np0005532585.localdomain sudo[263461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:39:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:39:14 np0005532585.localdomain sudo[263461]: pam_unix(sudo:session): session closed for user root
Nov 23 09:39:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:39:14 np0005532585.localdomain podman[263479]: 2025-11-23 09:39:14.478478446 +0000 UTC m=+0.085592262 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:39:14 np0005532585.localdomain sshd[263400]: Connection closed by authenticating user root 80.94.95.116 port 40146 [preauth]
Nov 23 09:39:14 np0005532585.localdomain podman[263480]: 2025-11-23 09:39:14.523271643 +0000 UTC m=+0.128485770 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:39:14 np0005532585.localdomain podman[263480]: 2025-11-23 09:39:14.540243939 +0000 UTC m=+0.145458136 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:39:14 np0005532585.localdomain podman[263479]: 2025-11-23 09:39:14.546328907 +0000 UTC m=+0.153442713 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 09:39:14 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:39:14 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:39:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:14.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:39:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:14.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:39:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23316 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D85E3B0000000001030307) 
Nov 23 09:39:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:16.566 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:16.568 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:16.569 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:16.569 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:16.604 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:16.605 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23317 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D862600000000001030307) 
Nov 23 09:39:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:39:18 np0005532585.localdomain podman[263522]: 2025-11-23 09:39:18.027052677 +0000 UTC m=+0.079651279 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:39:18 np0005532585.localdomain podman[263522]: 2025-11-23 09:39:18.035713205 +0000 UTC m=+0.088311797 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:39:18 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:39:18 np0005532585.localdomain sshd[263540]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:39:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12133 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D866200000000001030307) 
Nov 23 09:39:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23318 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D86A610000000001030307) 
Nov 23 09:39:19 np0005532585.localdomain sshd[263540]: Invalid user admin from 65.20.167.78 port 35681
Nov 23 09:39:20 np0005532585.localdomain sshd[263540]: Connection closed by invalid user admin 65.20.167.78 port 35681 [preauth]
Nov 23 09:39:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49026 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D86E200000000001030307) 
Nov 23 09:39:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:39:21 np0005532585.localdomain podman[263542]: 2025-11-23 09:39:21.006187331 +0000 UTC m=+0.058160563 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:39:21 np0005532585.localdomain podman[263542]: 2025-11-23 09:39:21.036670545 +0000 UTC m=+0.088643777 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:39:21 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.605 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.607 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.608 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.608 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.642 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.642 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:21 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:21.644 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:39:22 np0005532585.localdomain podman[263563]: 2025-11-23 09:39:22.026513111 +0000 UTC m=+0.084898121 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 23 09:39:22 np0005532585.localdomain podman[263563]: 2025-11-23 09:39:22.042885988 +0000 UTC m=+0.101270998 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:39:22 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:39:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23319 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D87A210000000001030307) 
Nov 23 09:39:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:39:23Z|00048|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 23 09:39:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:26.668 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:26.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:26.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5025 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:26.671 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:26.671 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:26 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:26.674 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:39:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:39:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23320 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D89A210000000001030307) 
Nov 23 09:39:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:31.675 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:31.678 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:31.678 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:31.679 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:31.712 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:31 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:31.713 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:39:35 np0005532585.localdomain podman[263581]: 2025-11-23 09:39:35.014708332 +0000 UTC m=+0.074726256 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 23 09:39:35 np0005532585.localdomain podman[263581]: 2025-11-23 09:39:35.025475846 +0000 UTC m=+0.085493770 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:39:35 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:39:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:36.715 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:36.718 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:36.718 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:36.719 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:36.755 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:36 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:36.757 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:39:41 np0005532585.localdomain podman[263600]: 2025-11-23 09:39:41.030929345 +0000 UTC m=+0.079763011 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:39:41 np0005532585.localdomain podman[263600]: 2025-11-23 09:39:41.042731991 +0000 UTC m=+0.091565657 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:39:41 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:39:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:41.758 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:41.760 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:41.760 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:41.761 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:41.790 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:41 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:41.791 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:39:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:39:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:39:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:39:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:39:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1"
Nov 23 09:39:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:39:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:39:45 np0005532585.localdomain podman[263624]: 2025-11-23 09:39:45.028961747 +0000 UTC m=+0.083850188 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:39:45 np0005532585.localdomain podman[263625]: 2025-11-23 09:39:45.077964824 +0000 UTC m=+0.130102719 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 23 09:39:45 np0005532585.localdomain podman[263624]: 2025-11-23 09:39:45.091763142 +0000 UTC m=+0.146651583 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:39:45 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:39:45 np0005532585.localdomain podman[263625]: 2025-11-23 09:39:45.114780604 +0000 UTC m=+0.166918469 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:39:45 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:39:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6694 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8D36B0000000001030307) 
Nov 23 09:39:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:46.792 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:46.794 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:46.794 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:46.794 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:46.821 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:46 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:46.822 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6695 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8D7610000000001030307) 
Nov 23 09:39:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23321 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8DA200000000001030307) 
Nov 23 09:39:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:39:49 np0005532585.localdomain podman[263669]: 2025-11-23 09:39:49.009798026 +0000 UTC m=+0.072580048 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:39:49 np0005532585.localdomain podman[263669]: 2025-11-23 09:39:49.020337282 +0000 UTC m=+0.083119304 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 23 09:39:49 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:39:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6696 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8DF600000000001030307) 
Nov 23 09:39:49 np0005532585.localdomain sshd[263687]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:39:49 np0005532585.localdomain sshd[263687]: Accepted publickey for zuul from 192.168.122.30 port 51714 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:39:49 np0005532585.localdomain systemd-logind[761]: New session 59 of user zuul.
Nov 23 09:39:49 np0005532585.localdomain systemd[1]: Started Session 59 of User zuul.
Nov 23 09:39:49 np0005532585.localdomain sshd[263687]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:39:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12134 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8E4200000000001030307) 
Nov 23 09:39:51 np0005532585.localdomain python3.9[263798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:39:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:51.823 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:39:52 np0005532585.localdomain podman[263820]: 2025-11-23 09:39:52.028256034 +0000 UTC m=+0.085108594 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:39:52 np0005532585.localdomain podman[263820]: 2025-11-23 09:39:52.063314672 +0000 UTC m=+0.120167262 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:39:52 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:39:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:39:52 np0005532585.localdomain systemd[1]: tmp-crun.fT4Klo.mount: Deactivated successfully.
Nov 23 09:39:52 np0005532585.localdomain podman[263844]: 2025-11-23 09:39:52.188534499 +0000 UTC m=+0.085347791 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:39:52 np0005532585.localdomain podman[263844]: 2025-11-23 09:39:52.204284793 +0000 UTC m=+0.101098085 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 23 09:39:52 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:39:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6697 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8EF210000000001030307) 
Nov 23 09:39:53 np0005532585.localdomain python3.9[263953]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:39:53 np0005532585.localdomain network[263970]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:39:53 np0005532585.localdomain network[263971]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:39:53 np0005532585.localdomain network[263972]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:39:54 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:39:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:56.826 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:56.828 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:39:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:56.828 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:39:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:56.829 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:56.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:39:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:39:56.875 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:39:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:39:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:40:00 np0005532585.localdomain sudo[264204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmpnejievhfpeymbeuxlydzxwfbvauje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890800.1361825-102-247516507348393/AnsiballZ_setup.py
Nov 23 09:40:00 np0005532585.localdomain sudo[264204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:00 np0005532585.localdomain python3.9[264206]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 09:40:01 np0005532585.localdomain sudo[264204]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:01 np0005532585.localdomain sudo[264267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdamdjzfxlkpzjaanjuzxnkuybedjyqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890800.1361825-102-247516507348393/AnsiballZ_dnf.py
Nov 23 09:40:01 np0005532585.localdomain sudo[264267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:01 np0005532585.localdomain python3.9[264269]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:40:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6698 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D910210000000001030307) 
Nov 23 09:40:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:01.876 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:01.879 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:01.879 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:01.879 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:01.909 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:01 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:01.910 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:02.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:02.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 09:40:04 np0005532585.localdomain sudo[264267]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:40:06 np0005532585.localdomain podman[264327]: 2025-11-23 09:40:06.030977991 +0000 UTC m=+0.088551122 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 09:40:06 np0005532585.localdomain podman[264327]: 2025-11-23 09:40:06.039354033 +0000 UTC m=+0.096927124 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:40:06 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:40:06 np0005532585.localdomain sudo[264399]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nntisyiuwhzsqpgzzzqwvjslclxlybqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890805.8257318-138-26347434391079/AnsiballZ_stat.py
Nov 23 09:40:06 np0005532585.localdomain sudo[264399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:06 np0005532585.localdomain python3.9[264401]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:40:06 np0005532585.localdomain sudo[264399]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.735 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.736 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.736 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.749 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.910 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.912 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.913 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.913 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.943 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:06 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:06.944 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:07 np0005532585.localdomain sudo[264509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajyjyeceypcicbndwnzmwjcxibbkvqgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890806.971308-168-115573724521954/AnsiballZ_command.py
Nov 23 09:40:07 np0005532585.localdomain sudo[264509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:07 np0005532585.localdomain python3.9[264511]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:40:07 np0005532585.localdomain sudo[264509]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:08 np0005532585.localdomain sudo[264620]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgnfxmwpjmmqxboiraakqkhdznjxlqwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890808.5655437-198-269804277817526/AnsiballZ_stat.py
Nov 23 09:40:08 np0005532585.localdomain sudo[264620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:09 np0005532585.localdomain python3.9[264622]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:40:09 np0005532585.localdomain sudo[264620]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:40:09.251 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:40:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:40:09.252 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:40:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:40:09.253 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.730 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.730 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.768 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.768 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.769 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.769 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:40:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:09.770 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:40:09 np0005532585.localdomain sudo[264734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltwlgnbtsdswjrtwvnhwzvpuytdwjbmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890809.4110181-231-199497974614057/AnsiballZ_lineinfile.py
Nov 23 09:40:09 np0005532585.localdomain sudo[264734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:10 np0005532585.localdomain python3.9[264754]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:10 np0005532585.localdomain sudo[264734]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.185 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.298 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.299 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.530 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.532 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12150MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.533 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.533 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.774 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.775 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.775 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:40:10 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:10.840 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:40:11 np0005532585.localdomain sudo[264884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brurptxjoliyexfjtmdpzygehatpulrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890810.4821515-259-215248114740434/AnsiballZ_systemd_service.py
Nov 23 09:40:11 np0005532585.localdomain sudo[264884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.306 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.314 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:40:11 np0005532585.localdomain podman[264886]: 2025-11-23 09:40:11.325362967 +0000 UTC m=+0.086062983 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:40:11 np0005532585.localdomain podman[264886]: 2025-11-23 09:40:11.333137941 +0000 UTC m=+0.093838017 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:40:11 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.371 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.373 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.373 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:40:11 np0005532585.localdomain python3.9[264887]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:40:11 np0005532585.localdomain sudo[264884]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:40:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:40:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:40:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:40:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:40:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17227 "" "Go-http-client/1.1"
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.944 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.948 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.948 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.949 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.972 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:11.973 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.355 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.356 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.356 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.356 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:40:12 np0005532585.localdomain sudo[265019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssagduneauszzdnimwktkmzlixvfhrdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890812.1328888-282-193201693724042/AnsiballZ_systemd_service.py
Nov 23 09:40:12 np0005532585.localdomain sudo[265019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:12 np0005532585.localdomain python3.9[265021]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.747 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.747 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.747 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:40:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:12.748 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:40:12 np0005532585.localdomain sudo[265019]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:13 np0005532585.localdomain sudo[265131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wycmiueaefhklkccseksxpxlsayhyqok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890813.411941-316-201710541662660/AnsiballZ_service_facts.py
Nov 23 09:40:13 np0005532585.localdomain sudo[265131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:13.840 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:40:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:13.862 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:40:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:13.862 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:40:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:13.863 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:13.864 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:13 np0005532585.localdomain python3.9[265133]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:40:14 np0005532585.localdomain network[265150]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:40:14 np0005532585.localdomain network[265151]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:40:14 np0005532585.localdomain network[265152]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:40:14 np0005532585.localdomain sudo[265158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:40:14 np0005532585.localdomain sudo[265158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:40:14 np0005532585.localdomain sudo[265158]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:14 np0005532585.localdomain sudo[265177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:40:14 np0005532585.localdomain sudo[265177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:40:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:14.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:14.737 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:40:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:40:15 np0005532585.localdomain podman[265242]: 2025-11-23 09:40:15.212851681 +0000 UTC m=+0.062557978 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 23 09:40:15 np0005532585.localdomain podman[265232]: 2025-11-23 09:40:15.279387784 +0000 UTC m=+0.142327585 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:40:15 np0005532585.localdomain podman[265242]: 2025-11-23 09:40:15.306809492 +0000 UTC m=+0.156515859 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_id=edpm, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 23 09:40:15 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:40:15 np0005532585.localdomain podman[265232]: 2025-11-23 09:40:15.332395122 +0000 UTC m=+0.195334933 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:40:15 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:40:15 np0005532585.localdomain sudo[265177]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:15.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:15.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:40:15 np0005532585.localdomain sudo[265302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:40:15 np0005532585.localdomain sudo[265302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:40:15 np0005532585.localdomain sudo[265302]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40350 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9489D0000000001030307) 
Nov 23 09:40:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:16.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:16.974 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:16.976 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:16.976 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:16.976 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:17 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:17.014 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:17 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:17.014 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40351 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D94CA00000000001030307) 
Nov 23 09:40:17 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:40:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6699 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D950200000000001030307) 
Nov 23 09:40:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:40:19 np0005532585.localdomain podman[265397]: 2025-11-23 09:40:19.140295565 +0000 UTC m=+0.075603207 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:40:19 np0005532585.localdomain podman[265397]: 2025-11-23 09:40:19.149211113 +0000 UTC m=+0.084518755 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 09:40:19 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:40:19 np0005532585.localdomain sudo[265131]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40352 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D954A00000000001030307) 
Nov 23 09:40:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23322 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D958200000000001030307) 
Nov 23 09:40:20 np0005532585.localdomain sudo[265531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smighwyyjlvykcbjiuckjvhugpqayvul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890820.0231216-345-201994050789519/AnsiballZ_file.py
Nov 23 09:40:20 np0005532585.localdomain sudo[265531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:20 np0005532585.localdomain python3.9[265533]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 09:40:20 np0005532585.localdomain sudo[265531]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:21 np0005532585.localdomain sudo[265641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-remzvpsyygkkwqygrdxnkwwezmazmezo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890820.8425436-369-160236867425880/AnsiballZ_modprobe.py
Nov 23 09:40:21 np0005532585.localdomain sudo[265641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:21 np0005532585.localdomain python3.9[265643]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 23 09:40:21 np0005532585.localdomain sudo[265641]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:22.015 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:22.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:22.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:22.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:22.042 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:22.042 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:40:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:40:22 np0005532585.localdomain sudo[265751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtrwolcluxbykddezeegpyjivqpyfdvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890821.667207-393-150442323574807/AnsiballZ_stat.py
Nov 23 09:40:22 np0005532585.localdomain sudo[265751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:23 np0005532585.localdomain systemd[1]: tmp-crun.oaA4JS.mount: Deactivated successfully.
Nov 23 09:40:23 np0005532585.localdomain podman[265753]: 2025-11-23 09:40:23.032957249 +0000 UTC m=+0.087137658 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:40:23 np0005532585.localdomain podman[265754]: 2025-11-23 09:40:23.018379522 +0000 UTC m=+0.071665063 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:40:23 np0005532585.localdomain podman[265754]: 2025-11-23 09:40:23.101424621 +0000 UTC m=+0.154710212 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:40:23 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:40:23 np0005532585.localdomain podman[265753]: 2025-11-23 09:40:23.121623853 +0000 UTC m=+0.175804322 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 23 09:40:23 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:40:23 np0005532585.localdomain python3.9[265755]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:23 np0005532585.localdomain sudo[265751]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:23 np0005532585.localdomain sudo[265846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htoeabnwtxqabbrnhnqpphonohifaihl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890821.667207-393-150442323574807/AnsiballZ_file.py
Nov 23 09:40:23 np0005532585.localdomain sudo[265846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40353 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D964600000000001030307) 
Nov 23 09:40:23 np0005532585.localdomain python3.9[265848]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:23 np0005532585.localdomain sudo[265846]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:24 np0005532585.localdomain sudo[265956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgcethpseiwhyrjnphmesfncjiccomyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890823.8985755-432-85603588997361/AnsiballZ_lineinfile.py
Nov 23 09:40:24 np0005532585.localdomain sudo[265956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:24 np0005532585.localdomain python3.9[265958]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:24 np0005532585.localdomain sudo[265956]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:24 np0005532585.localdomain sudo[266066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stoxpocmennmvaqrphhogvmcmkezjdpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890824.6781688-459-118462719654698/AnsiballZ_file.py
Nov 23 09:40:24 np0005532585.localdomain sudo[266066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:25 np0005532585.localdomain python3.9[266068]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:25 np0005532585.localdomain sudo[266066]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:25 np0005532585.localdomain sudo[266176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkokzmtjbzbmdwhtiybgeylrccbymzhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890825.4824922-486-68519042318986/AnsiballZ_stat.py
Nov 23 09:40:25 np0005532585.localdomain sudo[266176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:25 np0005532585.localdomain python3.9[266178]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:40:26 np0005532585.localdomain sudo[266176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:26 np0005532585.localdomain sudo[266288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiwzebctnzicocjvvxyscpmzklnalzyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890826.280065-513-198876456282255/AnsiballZ_stat.py
Nov 23 09:40:26 np0005532585.localdomain sudo[266288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:26 np0005532585.localdomain python3.9[266290]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:40:26 np0005532585.localdomain sudo[266288]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:27.043 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:27.045 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:27.045 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:27.046 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:27.086 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:27.087 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:27 np0005532585.localdomain sudo[266400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsglujredrthonlpziomvkumujsdwyjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890827.112934-540-64167473530899/AnsiballZ_command.py
Nov 23 09:40:27 np0005532585.localdomain sudo[266400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:27 np0005532585.localdomain python3.9[266402]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:40:27 np0005532585.localdomain sudo[266400]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:28 np0005532585.localdomain sudo[266511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chagtfxkqmihsvixtuqybttdkxeuklxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890827.9433372-570-101680330505727/AnsiballZ_replace.py
Nov 23 09:40:28 np0005532585.localdomain sudo[266511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:28 np0005532585.localdomain python3.9[266513]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:28 np0005532585.localdomain sudo[266511]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:29 np0005532585.localdomain sudo[266621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofgahziqaisfoqzmdkyyekkfodpcjpid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890828.8800433-597-55442716138045/AnsiballZ_lineinfile.py
Nov 23 09:40:29 np0005532585.localdomain sudo[266621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:29 np0005532585.localdomain python3.9[266623]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:29 np0005532585.localdomain sudo[266621]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:29 np0005532585.localdomain sudo[266731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvjotvkoopzfzoevedvhbsdjuuqkdmal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890829.5134397-597-203881516247804/AnsiballZ_lineinfile.py
Nov 23 09:40:29 np0005532585.localdomain sudo[266731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:40:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:40:29 np0005532585.localdomain python3.9[266733]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:30 np0005532585.localdomain sudo[266731]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:30 np0005532585.localdomain sudo[266841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifqtulucqoluxktcyomjntzxlysoshtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890830.1291845-597-240063944529245/AnsiballZ_lineinfile.py
Nov 23 09:40:30 np0005532585.localdomain sudo[266841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:30 np0005532585.localdomain python3.9[266843]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:30 np0005532585.localdomain sudo[266841]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:30 np0005532585.localdomain sudo[266951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtspqhlfmmwoikggvnndklwkblybpqip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890830.7170606-597-219261764333474/AnsiballZ_lineinfile.py
Nov 23 09:40:30 np0005532585.localdomain sudo[266951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:31 np0005532585.localdomain python3.9[266953]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:31 np0005532585.localdomain sudo[266951]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40354 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D984200000000001030307) 
Nov 23 09:40:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:32.087 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:32.089 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:32 np0005532585.localdomain sudo[267061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtuhsbwdspreeuzrdjynzituqwhmblvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890831.8350961-685-70535601047590/AnsiballZ_stat.py
Nov 23 09:40:32 np0005532585.localdomain sudo[267061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:32 np0005532585.localdomain python3.9[267063]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:40:32 np0005532585.localdomain sudo[267061]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:34 np0005532585.localdomain sudo[267173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufrmzbfwoppmlxsgwimbjyxjndarmkut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890833.8467793-714-61464569641580/AnsiballZ_file.py
Nov 23 09:40:34 np0005532585.localdomain sudo[267173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:34 np0005532585.localdomain python3.9[267175]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:34 np0005532585.localdomain sudo[267173]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:34 np0005532585.localdomain sudo[267283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yswnwmypjurqdoccxrflrabhsvpaxjyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890834.5705898-738-225147318614438/AnsiballZ_stat.py
Nov 23 09:40:34 np0005532585.localdomain sudo[267283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:35 np0005532585.localdomain python3.9[267285]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:35 np0005532585.localdomain sudo[267283]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:35 np0005532585.localdomain sudo[267340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwgwhjlecgrshbibwhmkbmnldglpanci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890834.5705898-738-225147318614438/AnsiballZ_file.py
Nov 23 09:40:35 np0005532585.localdomain sudo[267340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:35 np0005532585.localdomain python3.9[267342]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:35 np0005532585.localdomain sudo[267340]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:36 np0005532585.localdomain sudo[267450]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phkeojltiifgltwgyyeneqvlwyknsypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890835.9642267-738-195958481557408/AnsiballZ_stat.py
Nov 23 09:40:36 np0005532585.localdomain sudo[267450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:40:36 np0005532585.localdomain podman[267453]: 2025-11-23 09:40:36.388960791 +0000 UTC m=+0.115369501 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 23 09:40:36 np0005532585.localdomain podman[267453]: 2025-11-23 09:40:36.402433472 +0000 UTC m=+0.128842172 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:40:36 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:40:36 np0005532585.localdomain python3.9[267452]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:36 np0005532585.localdomain sudo[267450]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:36 np0005532585.localdomain sudo[267526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ektzefdxwmczzjrbsjgjueafzddzktcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890835.9642267-738-195958481557408/AnsiballZ_file.py
Nov 23 09:40:36 np0005532585.localdomain sudo[267526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:36 np0005532585.localdomain python3.9[267528]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:36 np0005532585.localdomain sudo[267526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:37.090 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:37.092 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:37 np0005532585.localdomain sudo[267636]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwcahdbnmvspwjjyesbrxdjamzkolbop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890837.511552-807-27193768323510/AnsiballZ_file.py
Nov 23 09:40:37 np0005532585.localdomain sudo[267636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:37 np0005532585.localdomain python3.9[267638]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:38 np0005532585.localdomain sudo[267636]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:39 np0005532585.localdomain sudo[267746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkmffbydhvqyjcwbdhvsdktukrufuvat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890838.8959873-832-30934314376294/AnsiballZ_stat.py
Nov 23 09:40:39 np0005532585.localdomain sudo[267746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:39 np0005532585.localdomain python3.9[267748]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:39 np0005532585.localdomain sudo[267746]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:39 np0005532585.localdomain sudo[267803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xprazrawdbypafbukmsfetpgdkplkixy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890838.8959873-832-30934314376294/AnsiballZ_file.py
Nov 23 09:40:39 np0005532585.localdomain sudo[267803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:39 np0005532585.localdomain python3.9[267805]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:39 np0005532585.localdomain sudo[267803]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:40 np0005532585.localdomain sudo[267913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bepegnvvnejgodjqxwsytxpgagnrpdfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890840.036978-867-119123267763900/AnsiballZ_stat.py
Nov 23 09:40:40 np0005532585.localdomain sudo[267913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:40 np0005532585.localdomain python3.9[267915]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:40 np0005532585.localdomain sudo[267913]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:40 np0005532585.localdomain sudo[267970]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjtfqrxnifdrlzvluzqzedmcgngcmtli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890840.036978-867-119123267763900/AnsiballZ_file.py
Nov 23 09:40:40 np0005532585.localdomain sudo[267970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:40 np0005532585.localdomain python3.9[267972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:40 np0005532585.localdomain sudo[267970]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:40:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:40:41 np0005532585.localdomain sudo[268080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyojhjeegyotwjaveyoulzphnulvooim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890841.266944-904-238066548157027/AnsiballZ_systemd.py
Nov 23 09:40:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:40:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:40:41 np0005532585.localdomain sudo[268080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:40:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:40:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1"
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: tmp-crun.OrlnuT.mount: Deactivated successfully.
Nov 23 09:40:42 np0005532585.localdomain podman[268082]: 2025-11-23 09:40:42.026499776 +0000 UTC m=+0.092512256 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:40:42 np0005532585.localdomain podman[268082]: 2025-11-23 09:40:42.036538919 +0000 UTC m=+0.102551399 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:40:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:42.093 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:42.094 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:42 np0005532585.localdomain python3.9[268083]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:40:42 np0005532585.localdomain systemd-rc-local-generator[268130]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:40:42 np0005532585.localdomain systemd-sysv-generator[268133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:42 np0005532585.localdomain sudo[268080]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:43 np0005532585.localdomain sudo[268250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpbhqbngyobusqlaxgbzrewodmhuyxyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890842.8716009-927-45678766834766/AnsiballZ_stat.py
Nov 23 09:40:43 np0005532585.localdomain sudo[268250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:43 np0005532585.localdomain python3.9[268252]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:43 np0005532585.localdomain sudo[268250]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:43 np0005532585.localdomain sudo[268307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxdgcoqicyqmqjhmmxxcfbxhinscnmzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890842.8716009-927-45678766834766/AnsiballZ_file.py
Nov 23 09:40:43 np0005532585.localdomain sudo[268307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:43 np0005532585.localdomain python3.9[268309]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:43 np0005532585.localdomain sudo[268307]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:44 np0005532585.localdomain sudo[268417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnsfsufycuzqmquwbopibshvraszzhll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890844.0033925-963-119977541249059/AnsiballZ_stat.py
Nov 23 09:40:44 np0005532585.localdomain sudo[268417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:44 np0005532585.localdomain python3.9[268419]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:44 np0005532585.localdomain sudo[268417]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:44 np0005532585.localdomain sudo[268474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqjkzpdvdlhzqkhsferytiluuhddfins ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890844.0033925-963-119977541249059/AnsiballZ_file.py
Nov 23 09:40:44 np0005532585.localdomain sudo[268474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:44 np0005532585.localdomain python3.9[268476]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:44 np0005532585.localdomain sudo[268474]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:45 np0005532585.localdomain sudo[268584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orvrdsfmusvnateelltlyyzracnajrsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890845.1868343-999-185970536039813/AnsiballZ_systemd.py
Nov 23 09:40:45 np0005532585.localdomain sudo[268584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:40:45 np0005532585.localdomain podman[268588]: 2025-11-23 09:40:45.533968407 +0000 UTC m=+0.066303616 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 09:40:45 np0005532585.localdomain podman[268588]: 2025-11-23 09:40:45.547318344 +0000 UTC m=+0.079653543 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, architecture=x86_64)
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: tmp-crun.Jh0ZBS.mount: Deactivated successfully.
Nov 23 09:40:45 np0005532585.localdomain podman[268587]: 2025-11-23 09:40:45.607051244 +0000 UTC m=+0.138069672 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:40:45 np0005532585.localdomain podman[268587]: 2025-11-23 09:40:45.645436294 +0000 UTC m=+0.176454732 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:40:45 np0005532585.localdomain python3.9[268586]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:40:45 np0005532585.localdomain systemd-sysv-generator[268661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:40:45 np0005532585.localdomain systemd-rc-local-generator[268655]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:45 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:40:46 np0005532585.localdomain systemd[1]: Starting Create netns directory...
Nov 23 09:40:46 np0005532585.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 09:40:46 np0005532585.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 09:40:46 np0005532585.localdomain systemd[1]: Finished Create netns directory.
Nov 23 09:40:46 np0005532585.localdomain sudo[268584]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27679 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9BDCB0000000001030307) 
Nov 23 09:40:46 np0005532585.localdomain sudo[268781]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qehywltwthzmikfxspjjxyqiqbjllqqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890846.6406796-1029-104645319197981/AnsiballZ_file.py
Nov 23 09:40:46 np0005532585.localdomain sudo[268781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:47.094 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:47.097 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:47 np0005532585.localdomain python3.9[268783]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:47 np0005532585.localdomain sudo[268781]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27680 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9C1E00000000001030307) 
Nov 23 09:40:47 np0005532585.localdomain sudo[268891]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lormbazklllwpuppvciluyvmvblcyxcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890847.3864791-1053-199906655866788/AnsiballZ_stat.py
Nov 23 09:40:47 np0005532585.localdomain sudo[268891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:47 np0005532585.localdomain python3.9[268893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:47 np0005532585.localdomain sudo[268891]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40355 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9C4210000000001030307) 
Nov 23 09:40:48 np0005532585.localdomain sudo[268948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evkzyoxyclqknnxvdudxddewrpjgolqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890847.3864791-1053-199906655866788/AnsiballZ_file.py
Nov 23 09:40:48 np0005532585.localdomain sudo[268948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:48 np0005532585.localdomain python3.9[268950]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:48 np0005532585.localdomain sudo[268948]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:49 np0005532585.localdomain sudo[269058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmkwrlddrmnhamwsqnlqehzkjfdxntcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890848.8744156-1095-165992635397132/AnsiballZ_file.py
Nov 23 09:40:49 np0005532585.localdomain sudo[269058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:40:49 np0005532585.localdomain podman[269061]: 2025-11-23 09:40:49.262814417 +0000 UTC m=+0.075077080 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 23 09:40:49 np0005532585.localdomain podman[269061]: 2025-11-23 09:40:49.294763046 +0000 UTC m=+0.107025749 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:40:49 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:40:49 np0005532585.localdomain python3.9[269060]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:40:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27681 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9C9E00000000001030307) 
Nov 23 09:40:49 np0005532585.localdomain sudo[269058]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:49 np0005532585.localdomain sudo[269187]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edglketqeihvhaxbwamzdztqfcoerygj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890849.6874547-1119-178700749902879/AnsiballZ_stat.py
Nov 23 09:40:49 np0005532585.localdomain sudo[269187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:50 np0005532585.localdomain python3.9[269189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:40:50 np0005532585.localdomain sudo[269187]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:50.333 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:40:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:50.353 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 23 09:40:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:50.354 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:40:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:50.354 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:40:50 np0005532585.localdomain sudo[269244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urtckdgecsoqbhopqfuekosnbpmdirbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890849.6874547-1119-178700749902879/AnsiballZ_file.py
Nov 23 09:40:50 np0005532585.localdomain sudo[269244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:50 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:50.407 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:40:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6700 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9CE200000000001030307) 
Nov 23 09:40:50 np0005532585.localdomain python3.9[269246]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.rf31qfvi recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:50 np0005532585.localdomain sudo[269244]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:52 np0005532585.localdomain sudo[269354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgtmgoondplqwvujiaiskkiuqepypryu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890851.797438-1155-237919563113632/AnsiballZ_file.py
Nov 23 09:40:52 np0005532585.localdomain sudo[269354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:52.097 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:40:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:52.098 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:52.099 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:40:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:52.099 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:52.100 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:40:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:52.104 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:52 np0005532585.localdomain python3.9[269356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:40:52 np0005532585.localdomain sudo[269354]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:52 np0005532585.localdomain sudo[269464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjzlbyhybvcvpcldepmqvshfxgsxqzpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890852.5617952-1179-215044854668046/AnsiballZ_stat.py
Nov 23 09:40:52 np0005532585.localdomain sudo[269464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:53 np0005532585.localdomain sudo[269464]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:53 np0005532585.localdomain sudo[269521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybcsjmzcyhsprbrrpgzhnuxmolgwyvhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890852.5617952-1179-215044854668046/AnsiballZ_file.py
Nov 23 09:40:53 np0005532585.localdomain sudo[269521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:40:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:40:53 np0005532585.localdomain podman[269525]: 2025-11-23 09:40:53.405765814 +0000 UTC m=+0.076471365 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:40:53 np0005532585.localdomain podman[269525]: 2025-11-23 09:40:53.413833147 +0000 UTC m=+0.084538648 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:40:53 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:40:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27682 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9D9A00000000001030307) 
Nov 23 09:40:53 np0005532585.localdomain systemd[1]: tmp-crun.2VjDmq.mount: Deactivated successfully.
Nov 23 09:40:53 np0005532585.localdomain podman[269524]: 2025-11-23 09:40:53.473980578 +0000 UTC m=+0.146814834 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:40:53 np0005532585.localdomain podman[269524]: 2025-11-23 09:40:53.487188082 +0000 UTC m=+0.160022368 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 09:40:53 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:40:53 np0005532585.localdomain sudo[269521]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:55 np0005532585.localdomain sudo[269674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgzgyourrbbswrbzjyaxjzyicdexpkux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890854.7184486-1221-281459737847166/AnsiballZ_container_config_data.py
Nov 23 09:40:55 np0005532585.localdomain sudo[269674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:55 np0005532585.localdomain python3.9[269676]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 23 09:40:55 np0005532585.localdomain sudo[269674]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:56 np0005532585.localdomain sudo[269784]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsgpnjseqjxpptakxbsdshlvlieppmlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890855.7246854-1248-52063236214935/AnsiballZ_container_config_hash.py
Nov 23 09:40:56 np0005532585.localdomain sudo[269784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:56 np0005532585.localdomain python3.9[269786]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:40:56 np0005532585.localdomain sudo[269784]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:40:57.102 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:40:57 np0005532585.localdomain sudo[269894]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpfxjbkfjhpkndwowmzvcxllqrljpayy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890856.8088832-1275-52639083906225/AnsiballZ_podman_container_info.py
Nov 23 09:40:57 np0005532585.localdomain sudo[269894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:40:57 np0005532585.localdomain python3.9[269896]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 09:40:57 np0005532585.localdomain sudo[269894]: pam_unix(sudo:session): session closed for user root
Nov 23 09:40:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:40:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:40:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:40:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:40:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:40:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:41:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27683 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9FA210000000001030307) 
Nov 23 09:41:01 np0005532585.localdomain sudo[270031]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acxcrhtmveiwxfhqhephlahaenrpgzlx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890861.4685798-1314-65430238543719/AnsiballZ_edpm_container_manage.py
Nov 23 09:41:01 np0005532585.localdomain sudo[270031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:02.104 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:02 np0005532585.localdomain python3[270033]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:41:02 np0005532585.localdomain python3[270033]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072",
                                                                    "Digest": "sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:11:34.680484424Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 249489385,
                                                                    "VirtualSize": 249489385,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:d9e3e9c6b6b086eeb756b403557bba77ecef73e97936fb3285a5484cd95a1b1a"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:57.320955568Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:11:34.679374753Z",
                                                                              "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:11:36.776164436Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 23 09:41:02 np0005532585.localdomain sudo[270031]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:03 np0005532585.localdomain sudo[270204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvxtccfpyyjdrcstlpcqaryfjwjwjoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890862.8008418-1338-229195270796063/AnsiballZ_stat.py
Nov 23 09:41:03 np0005532585.localdomain sudo[270204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:03 np0005532585.localdomain python3.9[270206]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:41:03 np0005532585.localdomain sudo[270204]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:03 np0005532585.localdomain sudo[270316]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcuxlmwitznsqcihwvvfxrhwsojtfelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890863.6235492-1365-139746441981109/AnsiballZ_file.py
Nov 23 09:41:03 np0005532585.localdomain sudo[270316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:04 np0005532585.localdomain python3.9[270318]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:04 np0005532585.localdomain sudo[270316]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:04 np0005532585.localdomain sudo[270371]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwwowqpyddqlssqcensbiaprowrsdpkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890863.6235492-1365-139746441981109/AnsiballZ_stat.py
Nov 23 09:41:04 np0005532585.localdomain sudo[270371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:04 np0005532585.localdomain python3.9[270373]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:41:04 np0005532585.localdomain sudo[270371]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:05 np0005532585.localdomain sudo[270480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixeusrtekierxyslfcxdcxuccsnocgcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890864.8811224-1365-159441197507467/AnsiballZ_copy.py
Nov 23 09:41:05 np0005532585.localdomain sudo[270480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:05 np0005532585.localdomain python3.9[270482]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890864.8811224-1365-159441197507467/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:05 np0005532585.localdomain sudo[270480]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:05 np0005532585.localdomain sudo[270535]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fckoatytqjchdouowiulhohhsfrfjvrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890864.8811224-1365-159441197507467/AnsiballZ_systemd.py
Nov 23 09:41:05 np0005532585.localdomain sudo[270535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:06 np0005532585.localdomain python3.9[270537]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:06 np0005532585.localdomain sudo[270535]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:41:07 np0005532585.localdomain podman[270557]: 2025-11-23 09:41:07.032130873 +0000 UTC m=+0.085324731 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:41:07 np0005532585.localdomain podman[270557]: 2025-11-23 09:41:07.07039541 +0000 UTC m=+0.123589228 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:41:07 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:41:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:07.108 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:07.109 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:07.109 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:41:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:07.110 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:07.110 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:07.113 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:08 np0005532585.localdomain python3.9[270666]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:41:08 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:08.739 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:09 np0005532585.localdomain sudo[270774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exihrzpzagrbitrbadglthluxaelxtrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890868.839902-1468-240835594316876/AnsiballZ_file.py
Nov 23 09:41:09 np0005532585.localdomain sudo[270774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:41:09.275 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:41:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:41:09.276 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:41:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:41:09.277 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:41:09 np0005532585.localdomain python3.9[270776]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:09 np0005532585.localdomain sudo[270774]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.803 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain sudo[270884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szphtrpihkmyfgnaxdflkxrewnqlzmwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890870.500251-1503-177762172564464/AnsiballZ_file.py
Nov 23 09:41:10 np0005532585.localdomain sudo[270884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.837 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69a42e57-cbe3-4dbb-a989-6f01bfe21027', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.804166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b4b1b3a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '27729339740b1614bf019b97411d824c12d181ab1f2a148f947bb085700d926d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.804166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b4b2e90-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '7080f87f7e3b005d2c3dcc72cc18e7ae63af121842224e8527e2a5a4adade0f5'}]}, 'timestamp': '2025-11-23 09:41:10.838656', '_unique_id': 'c8b7fae9de874bc68875ad1fc169c898'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b920102-564b-49c0-9dbf-36f79c02212b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:41:10.841796', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8b4f13ac-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.041101861, 'message_signature': '754ae71fc1d2a2523fee484737491ff3b1c0288be71bbc8c65055a2421812b62'}]}, 'timestamp': '2025-11-23 09:41:10.864271', '_unique_id': 'd326c4fa784648a2a6f22a6c521c1b45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.869 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4af77c4-34ef-47e5-8722-6b7418e9fa39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.866777', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b4ffd6c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '35677048132aa0473c69c486c860677a5145eb5a6754d2c2708506e1234cb20b'}]}, 'timestamp': '2025-11-23 09:41:10.870222', '_unique_id': '64edd949590e4062aec5f827e622c52a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34847494-690c-4a46-b699-3bb28627ca8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.873045', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b5080d4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '35722738a8de5b139adb88d9da1721f7c107c93289f813eebfb348f2e2d5abc6'}]}, 'timestamp': '2025-11-23 09:41:10.873551', '_unique_id': 'ee35aaa543614047b263a468f924c9b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41694fbe-b1ec-4732-98cc-6a3f3fde6bde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.875662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b52da32-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': 'fd96601f110761630c227e77a6e96883e21e20096b34da70119aacc7774ba706'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.875662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b52ec66-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': '105264140730d331b7ecfb51296b64af06736a77cda364c1c11206e13ccfb91e'}]}, 'timestamp': '2025-11-23 09:41:10.889380', '_unique_id': '9e23569aa48c4d8f92eded3277abc4cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 9228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f16e5c8-a0c3-45de-ad7f-8b51347b701f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9228, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.892010', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b53659c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'cd2dd67fadf94244822d369c52cf0080317f4868504f05138dcd99de1fb12e11'}]}, 'timestamp': '2025-11-23 09:41:10.892513', '_unique_id': '3f5c5aa5ba224260bce5f32998932e95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30bdcf20-89f2-45ad-8e2d-f2a4c6cf8187', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.894682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b53cd5c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '3adccb94deaea36080d0626f91cedfa046265fa86724e5fb943340d5942e6cec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.894682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b53dd9c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'f9e42b64f192843f73968b156340be646b055452dce93ccd1cdb293bacfb24df'}]}, 'timestamp': '2025-11-23 09:41:10.895552', '_unique_id': '89e1ba0fc4364254b2d13ac073c81cd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd1dcd36-4abc-45fc-ae1f-08593586da00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.897719', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b54441c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '678ab4852825478dc8c6750141725c7e2217129e7dcd2c13a2a0764c481e823e'}]}, 'timestamp': '2025-11-23 09:41:10.898206', '_unique_id': '09a52185a282430ca803480108af7451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de51a33e-8fb3-45bc-be92-b28ac3c048a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.900300', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b54a77c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'd29a153a363918c67208feeb3738bbd86711620d6804eaca83505559c44f135c'}]}, 'timestamp': '2025-11-23 09:41:10.900748', '_unique_id': 'ae07d79e33fb4c369d212723a45e4236'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c25df58-7efb-4208-ac87-a62a170bddc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.902808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b550aa0-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '6a180e936486056eff0087d7e56bbacf30310f5c07eccde36babb6f31f172b77'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.902808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b551a9a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'e29b0066423dbd5090236f0d93308369302f8cdf2ac9acf722601c85ddaca93a'}]}, 'timestamp': '2025-11-23 09:41:10.903667', '_unique_id': 'c4dd42ccc2e84b2f90ac7233b4121b29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fbd2269-a2f9-48a7-b5fc-0b1f808e787c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.905809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b55802a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'fa1543d688da777754f71b3bc208ad4694d72feba6967227c15f25f93029ea81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.905809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b558ff2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'fab7b7abc384e028dfef55e8dbea63b401e81be11a2b0e324c6db87ef9506dc3'}]}, 'timestamp': '2025-11-23 09:41:10.906668', '_unique_id': '4722f4ee58cf4295804d79951d748ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.908 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 55830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e946d2dd-7d10-4a3f-a0b8-c5a4b3d59e70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55830000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:41:10.908966', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8b55fa0a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.041101861, 'message_signature': '1d7d176bdf1bcccab2105beecee5573bd9464c85a5723cad6bba37e4c5f47924'}]}, 'timestamp': '2025-11-23 09:41:10.909399', '_unique_id': '51249c4142ac4a76832eacfb2bbc7f5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92e9ab2-9784-4f9a-83ad-654a66498228', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.911480', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b565c84-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '6f38f75e3b3bded32d12e5b1d2f7eb7c57a42d141a01f727fed04112804c7c06'}]}, 'timestamp': '2025-11-23 09:41:10.911973', '_unique_id': '8052cc825554485c944e2456480063c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48cca1e8-caa1-47b5-82d2-e163e8c1b65e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.914630', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b56d7c2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '22bc33d64d239c809b6965b71d21f3c8bad7d123cf200b05ed907ac4cac56733'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.914630', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b56ef00-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'a58358bbb4b21f009f9f289f286b2edb2018f6c0bd68d5289cb70566f4d11502'}]}, 'timestamp': '2025-11-23 09:41:10.915672', '_unique_id': '1febe2876caf448aa549a52a3eaf49e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf45ed6e-8ebd-4c17-93f4-9036587d60cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.918087', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b575ecc-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '69762c25056fdb9eda583e52e5b2c34feab4d9ab466a9b20f8031cb7c3f2b9c7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.918087', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b577402-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'db60160a3c2ade5ffd1a17be18c5a0b654e748827349fa1f01cc656efe263b31'}]}, 'timestamp': '2025-11-23 09:41:10.919194', '_unique_id': '37f512c592ce4f5fa4c1d8fdd66412a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a72c3632-6fa1-42c1-9d31-5c392667f291', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.921348', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b57dab4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'a723af9022739f1c749722f01071a0003b66d62f44881d3b62108d78691b069d'}]}, 'timestamp': '2025-11-23 09:41:10.921635', '_unique_id': '737078cbb55140a7bacff86a17e1a9fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab6b8275-20e1-4ea2-8174-160f29c2a3e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.922941', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b5818da-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'ed7c4ff51ffa1df761635fa3b675bca36151ddd458b08e5225a5b1afe47302bf'}]}, 'timestamp': '2025-11-23 09:41:10.923225', '_unique_id': 'd7c0cc9d13be45b48037731325fa5e2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '486baf48-8973-419f-a172-c46937319fc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.924515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b58562e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': 'c3b4d0188b8719e2ab060a961ac31e49ad43a0e50df07d23f7b686dbf94fadad'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.924515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b5860a6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': 'c920b7ef858bd9fdb2c0b05a03f3cb4803582be841f75f924a232a5679d09eaa'}]}, 'timestamp': '2025-11-23 09:41:10.925046', '_unique_id': 'ba375ad520a44965ae41869a97bb2924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fb913b5-58aa-4fe4-a8ca-710a43e58e4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.926515', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b58a462-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '1cb20d14cf93b2111eb96e1ec338b4026118389d2128ddb31d7af63ff8ab7724'}]}, 'timestamp': '2025-11-23 09:41:10.926796', '_unique_id': '3a3d2f396d5e442193555cd27a6688d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15c4a913-1b0b-4154-a29b-9ab5df69ab09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.928234', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b58e7c4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '47b8cb313d901cb71d4f3447791078aaa0990ed13a6aab37bc19561bc60293de'}]}, 'timestamp': '2025-11-23 09:41:10.928525', '_unique_id': '43de97dda29e47e1a9292a1102d4fc65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fe4b43c-d01f-414b-a2c5-95ce83daac61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.929856', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b592824-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': '064bd7bdc9a699defa3c1fd2076ba025f9dc62588aa2034ea463c98beba329d8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.929856', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b5931f2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': '9196f3ce4b0ced2e7a2c46afb64019eadff96d4698479ca8c59edd572dee726a'}]}, 'timestamp': '2025-11-23 09:41:10.930404', '_unique_id': '1cf92fa132164437820a0eca47ec5e0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:41:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:41:11 np0005532585.localdomain python3.9[270886]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 09:41:11 np0005532585.localdomain sudo[270884]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:11 np0005532585.localdomain sudo[270994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvhqmwpwgstcuosagbppplgkcakgvqfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890871.222496-1527-273869471213108/AnsiballZ_modprobe.py
Nov 23 09:41:11 np0005532585.localdomain sudo[270994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:11 np0005532585.localdomain python3.9[270996]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 23 09:41:11 np0005532585.localdomain sudo[270994]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:11.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:11.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:41:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:11.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:41:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:41:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:41:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:41:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:41:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:41:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1"
Nov 23 09:41:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:12.111 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:12 np0005532585.localdomain sudo[271104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzcqynybbhpwshvwxitnpbynwtwgnhob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890871.9271379-1552-232838427487658/AnsiballZ_stat.py
Nov 23 09:41:12 np0005532585.localdomain sudo[271104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:41:12 np0005532585.localdomain systemd[1]: tmp-crun.LzQELM.mount: Deactivated successfully.
Nov 23 09:41:12 np0005532585.localdomain podman[271107]: 2025-11-23 09:41:12.353799303 +0000 UTC m=+0.083180953 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:41:12 np0005532585.localdomain podman[271107]: 2025-11-23 09:41:12.38851593 +0000 UTC m=+0.117897540 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:41:12 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:41:12 np0005532585.localdomain python3.9[271106]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:41:12 np0005532585.localdomain sudo[271104]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:12 np0005532585.localdomain sudo[271184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xveibzcdkgqprqoosihqelwqozsyiqnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890871.9271379-1552-232838427487658/AnsiballZ_file.py
Nov 23 09:41:12 np0005532585.localdomain sudo[271184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:12.791 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:41:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:12.792 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:41:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:12.792 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:41:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:12.793 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:41:12 np0005532585.localdomain python3.9[271186]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:12 np0005532585.localdomain sudo[271184]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:13 np0005532585.localdomain sudo[271294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itskzqbylbytdjpujwzftvqepfphefkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890873.243265-1590-22492456422690/AnsiballZ_lineinfile.py
Nov 23 09:41:13 np0005532585.localdomain sudo[271294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:13 np0005532585.localdomain python3.9[271296]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:13 np0005532585.localdomain sudo[271294]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.813 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.837 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.838 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.838 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.839 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.839 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.839 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.858 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.858 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.859 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.859 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:41:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:13.860 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.265 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.319 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.320 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:41:14 np0005532585.localdomain sudo[271426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uipyiwimcevljrnchnbvhhqgqeoyaqly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890874.09211-1617-87302145158521/AnsiballZ_dnf.py
Nov 23 09:41:14 np0005532585.localdomain sudo[271426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.553 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.554 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12124MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.555 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.555 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:41:14 np0005532585.localdomain python3.9[271428]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.686 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.687 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.687 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.703 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.759 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.760 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.778 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.817 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: HW_CPU_X86_F16C,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:41:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:14.864 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:41:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:15.279 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:41:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:15.286 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:41:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:15.304 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:41:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:15.307 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:41:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:15.307 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:41:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:41:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:41:16 np0005532585.localdomain podman[271453]: 2025-11-23 09:41:16.016363849 +0000 UTC m=+0.076213105 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 23 09:41:16 np0005532585.localdomain systemd[1]: tmp-crun.GPml6p.mount: Deactivated successfully.
Nov 23 09:41:16 np0005532585.localdomain podman[271453]: 2025-11-23 09:41:16.082510099 +0000 UTC m=+0.142359365 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:41:16 np0005532585.localdomain sudo[271482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:41:16 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:41:16 np0005532585.localdomain sudo[271482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:41:16 np0005532585.localdomain sudo[271482]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:16 np0005532585.localdomain podman[271454]: 2025-11-23 09:41:16.090058685 +0000 UTC m=+0.147296459 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Nov 23 09:41:16 np0005532585.localdomain podman[271454]: 2025-11-23 09:41:16.174326662 +0000 UTC m=+0.231564426 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7)
Nov 23 09:41:16 np0005532585.localdomain sudo[271512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:41:16 np0005532585.localdomain sudo[271512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:41:16 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:41:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:16.303 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:16.304 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:16.304 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:41:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6841 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA32FB0000000001030307) 
Nov 23 09:41:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:16.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:41:16 np0005532585.localdomain sudo[271512]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:17 np0005532585.localdomain sudo[271563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:41:17 np0005532585.localdomain sudo[271563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:41:17 np0005532585.localdomain sudo[271563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:17 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:17.113 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:17 np0005532585.localdomain sudo[271581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:41:17 np0005532585.localdomain sudo[271581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:41:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6842 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA37200000000001030307) 
Nov 23 09:41:17 np0005532585.localdomain sudo[271581]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:17 np0005532585.localdomain sudo[271426]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27684 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA3A210000000001030307) 
Nov 23 09:41:18 np0005532585.localdomain python3.9[271725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 09:41:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6843 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA3F210000000001030307) 
Nov 23 09:41:19 np0005532585.localdomain sudo[271837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlamilhzaubffyoviyvnnsofblwuuymz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890879.1966398-1669-26642445992149/AnsiballZ_file.py
Nov 23 09:41:19 np0005532585.localdomain sudo[271837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:41:19 np0005532585.localdomain podman[271840]: 2025-11-23 09:41:19.60274657 +0000 UTC m=+0.071196959 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:41:19 np0005532585.localdomain podman[271840]: 2025-11-23 09:41:19.635039021 +0000 UTC m=+0.103489429 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 23 09:41:19 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:41:19 np0005532585.localdomain python3.9[271839]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:19 np0005532585.localdomain sudo[271837]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40356 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA42200000000001030307) 
Nov 23 09:41:20 np0005532585.localdomain sudo[271965]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmjybbzitszasnwclbmtlzheybdfxeff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890880.3760128-1703-109073261072019/AnsiballZ_systemd_service.py
Nov 23 09:41:20 np0005532585.localdomain sudo[271965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:20 np0005532585.localdomain sudo[271968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:41:20 np0005532585.localdomain sudo[271968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:41:20 np0005532585.localdomain python3.9[271967]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:41:20 np0005532585.localdomain sudo[271968]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:20 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:41:21 np0005532585.localdomain systemd-rc-local-generator[272010]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:41:21 np0005532585.localdomain systemd-sysv-generator[272015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:21 np0005532585.localdomain sudo[271965]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:22.115 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:22.120 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:22 np0005532585.localdomain python3.9[272130]: ansible-ansible.builtin.service_facts Invoked
Nov 23 09:41:22 np0005532585.localdomain network[272147]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 09:41:22 np0005532585.localdomain network[272148]: 'network-scripts' will be removed from distribution in near future.
Nov 23 09:41:22 np0005532585.localdomain network[272149]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 09:41:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6844 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA4EE00000000001030307) 
Nov 23 09:41:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:41:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:41:23 np0005532585.localdomain systemd[1]: tmp-crun.G4rFQF.mount: Deactivated successfully.
Nov 23 09:41:23 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:41:23 np0005532585.localdomain podman[272198]: 2025-11-23 09:41:23.633490405 +0000 UTC m=+0.104385997 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:41:23 np0005532585.localdomain podman[272185]: 2025-11-23 09:41:23.590996995 +0000 UTC m=+0.118678335 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:41:23 np0005532585.localdomain podman[272198]: 2025-11-23 09:41:23.648276157 +0000 UTC m=+0.119171789 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 09:41:23 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:41:23 np0005532585.localdomain podman[272185]: 2025-11-23 09:41:23.676390757 +0000 UTC m=+0.204072107 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:41:23 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:41:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:27.117 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:27 np0005532585.localdomain sudo[272423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpdltcgznlqsfjbddzbwspsdppatofck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890887.5502515-1759-229644675634592/AnsiballZ_systemd_service.py
Nov 23 09:41:27 np0005532585.localdomain sudo[272423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:28 np0005532585.localdomain python3.9[272425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:28 np0005532585.localdomain sudo[272423]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:28 np0005532585.localdomain sudo[272534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iowxndfjsxopfufgwyopwjyuvlhafenl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890888.29563-1759-3879935685729/AnsiballZ_systemd_service.py
Nov 23 09:41:28 np0005532585.localdomain sudo[272534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:28 np0005532585.localdomain python3.9[272536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:28 np0005532585.localdomain sudo[272534]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:29 np0005532585.localdomain sudo[272645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmjofwpohbumvjcpjyoxfmtywmbpgqik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890889.0739384-1759-219026940338136/AnsiballZ_systemd_service.py
Nov 23 09:41:29 np0005532585.localdomain sudo[272645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:29 np0005532585.localdomain python3.9[272647]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:29 np0005532585.localdomain sudo[272645]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:41:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:41:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:41:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:41:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:41:30 np0005532585.localdomain sudo[272756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyjqdquizmxbyndypekefqrjmmpgdlta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890889.8428273-1759-143519390225829/AnsiballZ_systemd_service.py
Nov 23 09:41:30 np0005532585.localdomain sudo[272756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:30 np0005532585.localdomain python3.9[272758]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:30 np0005532585.localdomain sudo[272756]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:30 np0005532585.localdomain sudo[272867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rinudmgookbrbmczxehxtordasqesaqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890890.6151206-1759-263560800338687/AnsiballZ_systemd_service.py
Nov 23 09:41:30 np0005532585.localdomain sudo[272867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:31 np0005532585.localdomain python3.9[272869]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:31 np0005532585.localdomain sudo[272867]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:31 np0005532585.localdomain sudo[272978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaojuuvgrkgpchyuobgqoefaiuzjwmsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890891.317375-1759-159485624218939/AnsiballZ_systemd_service.py
Nov 23 09:41:31 np0005532585.localdomain sudo[272978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:31 np0005532585.localdomain python3.9[272980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:31 np0005532585.localdomain sudo[272978]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6845 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA70210000000001030307) 
Nov 23 09:41:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:32.120 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:32 np0005532585.localdomain sudo[273089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmfekriuosduyncwbrtjrbwxgxnkblyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890892.0808675-1759-118536697841654/AnsiballZ_systemd_service.py
Nov 23 09:41:32 np0005532585.localdomain sudo[273089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:32 np0005532585.localdomain python3.9[273091]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:32 np0005532585.localdomain sudo[273089]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:33 np0005532585.localdomain sudo[273200]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozhqkccmvwotmkbfldsdlpyskhhdnzai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890892.830777-1759-131693576163595/AnsiballZ_systemd_service.py
Nov 23 09:41:33 np0005532585.localdomain sudo[273200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:33 np0005532585.localdomain python3.9[273202]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:41:33 np0005532585.localdomain sudo[273200]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:36 np0005532585.localdomain sudo[273311]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kupjdngiiubgbsbmazwsyovkuahwubau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890896.7120578-1936-168454713591689/AnsiballZ_file.py
Nov 23 09:41:36 np0005532585.localdomain sudo[273311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:37.122 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:37 np0005532585.localdomain python3.9[273313]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:37 np0005532585.localdomain sudo[273311]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:37 np0005532585.localdomain sudo[273421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjcvztycundcqxcamaldmqdbzqpeewry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890897.2841957-1936-258645318471815/AnsiballZ_file.py
Nov 23 09:41:37 np0005532585.localdomain sudo[273421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:41:37 np0005532585.localdomain podman[273423]: 2025-11-23 09:41:37.6431132 +0000 UTC m=+0.074904065 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:41:37 np0005532585.localdomain podman[273423]: 2025-11-23 09:41:37.654239939 +0000 UTC m=+0.086030804 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:41:37 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:41:37 np0005532585.localdomain python3.9[273424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:37 np0005532585.localdomain sudo[273421]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:39 np0005532585.localdomain sudo[273553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eothhyinihzxrbrpixlsvnakpqarjlli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890897.8896463-1936-10143968829092/AnsiballZ_file.py
Nov 23 09:41:39 np0005532585.localdomain sudo[273553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:39 np0005532585.localdomain python3.9[273555]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:39 np0005532585.localdomain sudo[273553]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:40 np0005532585.localdomain sudo[273663]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmzgopxaevxmkxigjddubefdlxzmqpks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890900.0176995-1936-195345443126877/AnsiballZ_file.py
Nov 23 09:41:40 np0005532585.localdomain sudo[273663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:40 np0005532585.localdomain python3.9[273665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:40 np0005532585.localdomain sudo[273663]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:40 np0005532585.localdomain sudo[273773]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwowtqddhsfurdoyojfiuvlrmpgwzhkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890900.6177368-1936-243115422216454/AnsiballZ_file.py
Nov 23 09:41:40 np0005532585.localdomain sudo[273773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:41 np0005532585.localdomain python3.9[273775]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:41 np0005532585.localdomain sudo[273773]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:41:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:41:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:41:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:41:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:41:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1"
Nov 23 09:41:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:42.126 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:42.127 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:42.128 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:41:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:42.128 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:42.144 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:42.145 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:42 np0005532585.localdomain sudo[273883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icguglubyffrdquipcuosyhyrzawsqsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890902.1884222-1936-181970367053162/AnsiballZ_file.py
Nov 23 09:41:42 np0005532585.localdomain sudo[273883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:41:42 np0005532585.localdomain podman[273886]: 2025-11-23 09:41:42.585963817 +0000 UTC m=+0.085894098 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:41:42 np0005532585.localdomain podman[273886]: 2025-11-23 09:41:42.593532594 +0000 UTC m=+0.093462925 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:41:42 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:41:42 np0005532585.localdomain python3.9[273885]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:42 np0005532585.localdomain sudo[273883]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:43 np0005532585.localdomain sudo[274014]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkabcrqwclzqhnspbvjynxnamgibxxvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890902.8227217-1936-251493041564647/AnsiballZ_file.py
Nov 23 09:41:43 np0005532585.localdomain sudo[274014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:43 np0005532585.localdomain python3.9[274016]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:43 np0005532585.localdomain sudo[274014]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:43 np0005532585.localdomain sudo[274124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdecxcyuvjfgtglqtypybvpkjkvfwkzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890903.4763446-1936-32132469967705/AnsiballZ_file.py
Nov 23 09:41:43 np0005532585.localdomain sudo[274124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:43 np0005532585.localdomain python3.9[274126]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:43 np0005532585.localdomain sudo[274124]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:44 np0005532585.localdomain sudo[274234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppyrdadfmdcyitcljyfjxsfxklkzhqly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890904.1384811-2108-157557525307152/AnsiballZ_file.py
Nov 23 09:41:44 np0005532585.localdomain sudo[274234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:44 np0005532585.localdomain python3.9[274236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:44 np0005532585.localdomain sudo[274234]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:44 np0005532585.localdomain sudo[274344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itlbwkrqocynjhjthlqrhwjgpizjvaxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890904.7384236-2108-205714486396835/AnsiballZ_file.py
Nov 23 09:41:44 np0005532585.localdomain sudo[274344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:45 np0005532585.localdomain python3.9[274346]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:45 np0005532585.localdomain sudo[274344]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:45 np0005532585.localdomain sudo[274454]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwffoyyvalejsmsnmzcrmhjnnkxhocay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890905.3242712-2108-54103062664623/AnsiballZ_file.py
Nov 23 09:41:45 np0005532585.localdomain sudo[274454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:45 np0005532585.localdomain python3.9[274456]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:45 np0005532585.localdomain sudo[274454]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:46 np0005532585.localdomain sudo[274564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muruxxahiizmqfpgakzulolfcdhqpenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890905.9128664-2108-144138125999845/AnsiballZ_file.py
Nov 23 09:41:46 np0005532585.localdomain sudo[274564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:41:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:41:46 np0005532585.localdomain podman[274566]: 2025-11-23 09:41:46.290276119 +0000 UTC m=+0.090284187 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:41:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3795 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAA82B0000000001030307) 
Nov 23 09:41:46 np0005532585.localdomain podman[274568]: 2025-11-23 09:41:46.262865531 +0000 UTC m=+0.061689801 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7)
Nov 23 09:41:46 np0005532585.localdomain podman[274568]: 2025-11-23 09:41:46.384037713 +0000 UTC m=+0.182862023 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350)
Nov 23 09:41:46 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:41:46 np0005532585.localdomain podman[274566]: 2025-11-23 09:41:46.399231618 +0000 UTC m=+0.199239676 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 09:41:46 np0005532585.localdomain python3.9[274567]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:46 np0005532585.localdomain sudo[274564]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:46 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:41:46 np0005532585.localdomain sudo[274717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikrxscnorjixmsxyciazuyqovjhnwqab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890906.529707-2108-89708355107836/AnsiballZ_file.py
Nov 23 09:41:46 np0005532585.localdomain sudo[274717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:47 np0005532585.localdomain python3.9[274719]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:47 np0005532585.localdomain sudo[274717]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:47.146 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:47.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3796 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAAC210000000001030307) 
Nov 23 09:41:47 np0005532585.localdomain sudo[274827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfalpwtssnvxbhosvvmqkcbhgijvxrpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890907.1526654-2108-255948470241399/AnsiballZ_file.py
Nov 23 09:41:47 np0005532585.localdomain sudo[274827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:47 np0005532585.localdomain python3.9[274829]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:47 np0005532585.localdomain sudo[274827]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:48 np0005532585.localdomain sudo[274937]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lffjgoxdarcysqsemihcoukmyjkfwfgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890907.779214-2108-168976837659612/AnsiballZ_file.py
Nov 23 09:41:48 np0005532585.localdomain sudo[274937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:48 np0005532585.localdomain python3.9[274939]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:48 np0005532585.localdomain sudo[274937]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6846 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAB0200000000001030307) 
Nov 23 09:41:48 np0005532585.localdomain sudo[275047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lncocvxitulcvknqsaaigtrphhgmkhrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890908.3872495-2108-84927487509447/AnsiballZ_file.py
Nov 23 09:41:48 np0005532585.localdomain sudo[275047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:48 np0005532585.localdomain python3.9[275049]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:41:48 np0005532585.localdomain sudo[275047]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3797 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAB4210000000001030307) 
Nov 23 09:41:49 np0005532585.localdomain sudo[275157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qearubqlepqcrrdxzrakledyxiwziqtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890909.2493267-2282-15058219930763/AnsiballZ_command.py
Nov 23 09:41:49 np0005532585.localdomain sudo[275157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:49 np0005532585.localdomain python3.9[275159]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:41:49 np0005532585.localdomain sudo[275157]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:49 np0005532585.localdomain podman[275162]: 2025-11-23 09:41:49.851595827 +0000 UTC m=+0.076953639 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:41:49 np0005532585.localdomain podman[275162]: 2025-11-23 09:41:49.862339584 +0000 UTC m=+0.087697456 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent)
Nov 23 09:41:49 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:41:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27685 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAB8200000000001030307) 
Nov 23 09:41:50 np0005532585.localdomain python3.9[275287]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 09:41:51 np0005532585.localdomain sudo[275395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyxfmwhmiaqklqzhpiglenllswopyfui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890910.9372098-2335-82654355236724/AnsiballZ_systemd_service.py
Nov 23 09:41:51 np0005532585.localdomain sudo[275395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:51 np0005532585.localdomain python3.9[275397]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:41:51 np0005532585.localdomain systemd-rc-local-generator[275425]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:41:51 np0005532585.localdomain systemd-sysv-generator[275428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:41:51 np0005532585.localdomain sudo[275395]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:52.151 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4993-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:52.153 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:52.154 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:41:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:52.154 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:52.179 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:52.180 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:52 np0005532585.localdomain sudo[275541]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qegithwtyszmxtggeksjtfvvlhsjwnhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890912.2066607-2359-159555828000571/AnsiballZ_command.py
Nov 23 09:41:52 np0005532585.localdomain sudo[275541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:52 np0005532585.localdomain python3.9[275543]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:52 np0005532585.localdomain sudo[275541]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:53 np0005532585.localdomain sudo[275652]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swrhqaqxnoohjvnqfjxorwjamrmkdeea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890912.828139-2359-83282645294828/AnsiballZ_command.py
Nov 23 09:41:53 np0005532585.localdomain sudo[275652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:53 np0005532585.localdomain python3.9[275654]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:53 np0005532585.localdomain sudo[275652]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3798 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAC3E10000000001030307) 
Nov 23 09:41:53 np0005532585.localdomain sudo[275763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuupeeknlofwxwuupcmuulsfmtivowsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890913.4657419-2359-68610587151630/AnsiballZ_command.py
Nov 23 09:41:53 np0005532585.localdomain sudo[275763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:41:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:41:53 np0005532585.localdomain systemd[1]: tmp-crun.ULYjwZ.mount: Deactivated successfully.
Nov 23 09:41:53 np0005532585.localdomain podman[275766]: 2025-11-23 09:41:53.944918121 +0000 UTC m=+0.139918929 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 23 09:41:53 np0005532585.localdomain podman[275766]: 2025-11-23 09:41:53.955202623 +0000 UTC m=+0.150203421 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:41:53 np0005532585.localdomain python3.9[275765]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:53 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:41:53 np0005532585.localdomain sudo[275763]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:54 np0005532585.localdomain podman[275767]: 2025-11-23 09:41:53.908068288 +0000 UTC m=+0.103222651 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:41:54 np0005532585.localdomain podman[275767]: 2025-11-23 09:41:54.043181546 +0000 UTC m=+0.238335919 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:41:54 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:41:54 np0005532585.localdomain sudo[275915]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlaachrcdyezlhtvbvomrmpmlenegyll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890914.1102724-2359-231581878981309/AnsiballZ_command.py
Nov 23 09:41:54 np0005532585.localdomain sudo[275915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:54 np0005532585.localdomain python3.9[275917]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:54 np0005532585.localdomain sudo[275915]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:55 np0005532585.localdomain sudo[276026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjkiustbijgzzcaefmcxsnwvvnnoiioh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890915.2640338-2359-216132972787139/AnsiballZ_command.py
Nov 23 09:41:55 np0005532585.localdomain sudo[276026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:55 np0005532585.localdomain python3.9[276028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:55 np0005532585.localdomain sudo[276026]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:56 np0005532585.localdomain sudo[276137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cknuktsagudtetzmpqjjabbnadneihsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890915.929216-2359-105629577499174/AnsiballZ_command.py
Nov 23 09:41:56 np0005532585.localdomain sudo[276137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:56 np0005532585.localdomain python3.9[276139]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:56 np0005532585.localdomain sudo[276137]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:56 np0005532585.localdomain sudo[276248]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paolyrwyjmpyvzgkmzdqpzxxcxrbfsyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890916.5707128-2359-277981723630145/AnsiballZ_command.py
Nov 23 09:41:56 np0005532585.localdomain sudo[276248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:57 np0005532585.localdomain python3.9[276250]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:57 np0005532585.localdomain sudo[276248]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:57.181 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:57.182 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:41:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:57.183 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:41:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:57.183 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:57.228 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:41:57 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:41:57.228 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:41:58 np0005532585.localdomain sudo[276359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laujcjwuhzogasllbbdybmbcqpcklqhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890917.1539423-2359-181442414587121/AnsiballZ_command.py
Nov 23 09:41:58 np0005532585.localdomain sudo[276359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:41:58 np0005532585.localdomain python3.9[276361]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:41:58 np0005532585.localdomain sudo[276359]: pam_unix(sudo:session): session closed for user root
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:41:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:41:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:42:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:42:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3799 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAE4200000000001030307) 
Nov 23 09:42:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:02.230 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:02.231 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:02.232 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:02.232 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:02.296 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:02 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:02.296 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:03 np0005532585.localdomain sudo[276470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxgcrmfkqddojoyybvluefcerkzczjnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890923.2829082-2566-254706544818838/AnsiballZ_file.py
Nov 23 09:42:03 np0005532585.localdomain sudo[276470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:03 np0005532585.localdomain python3.9[276472]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:03 np0005532585.localdomain sudo[276470]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:04 np0005532585.localdomain sudo[276580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lizsdldszjgyjajbncsoykzkevmdkgcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890923.9057324-2566-259846404146547/AnsiballZ_file.py
Nov 23 09:42:04 np0005532585.localdomain sudo[276580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:04 np0005532585.localdomain python3.9[276582]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:04 np0005532585.localdomain sudo[276580]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:42:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:42:04 np0005532585.localdomain sudo[276690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewwwygmnsotnbqkoievthxmhxuxnvoaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890924.548429-2566-72015565980974/AnsiballZ_file.py
Nov 23 09:42:04 np0005532585.localdomain sudo[276690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:05 np0005532585.localdomain python3.9[276692]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:05 np0005532585.localdomain sudo[276690]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:05 np0005532585.localdomain sudo[276800]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agpjhseyszeahiyanwaqxfvewtacpkum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890925.3071249-2632-60662499364985/AnsiballZ_file.py
Nov 23 09:42:05 np0005532585.localdomain sudo[276800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:05 np0005532585.localdomain python3.9[276802]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:05 np0005532585.localdomain sudo[276800]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:06 np0005532585.localdomain sudo[276910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifbrmuuqilchrjogpehpibbkkikuhkip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890925.9438796-2632-213941782096447/AnsiballZ_file.py
Nov 23 09:42:06 np0005532585.localdomain sudo[276910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:06 np0005532585.localdomain python3.9[276912]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:06 np0005532585.localdomain sudo[276910]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:06 np0005532585.localdomain sudo[277020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itwqlbgxummfecqdvckhfsnewmgsaxqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890926.537285-2632-226717569954963/AnsiballZ_file.py
Nov 23 09:42:06 np0005532585.localdomain sudo[277020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:06 np0005532585.localdomain python3.9[277022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:07 np0005532585.localdomain sudo[277020]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:07.298 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:07.300 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:07.300 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:07.300 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:07.328 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:07 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:07.329 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:07 np0005532585.localdomain sudo[277130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjhpzldcqomfcsachxkfzzfpgqresjso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890927.1129298-2632-64203160946748/AnsiballZ_file.py
Nov 23 09:42:07 np0005532585.localdomain sudo[277130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:07 np0005532585.localdomain python3.9[277132]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:07 np0005532585.localdomain sudo[277130]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:42:08 np0005532585.localdomain sudo[277246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xckurhkdkevwsdsigfuamovxknyfcgzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890927.7322037-2632-36037905617586/AnsiballZ_file.py
Nov 23 09:42:08 np0005532585.localdomain sudo[277246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:08 np0005532585.localdomain systemd[1]: tmp-crun.JGx2rD.mount: Deactivated successfully.
Nov 23 09:42:08 np0005532585.localdomain podman[277223]: 2025-11-23 09:42:08.056218871 +0000 UTC m=+0.101924474 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 09:42:08 np0005532585.localdomain podman[277223]: 2025-11-23 09:42:08.070271307 +0000 UTC m=+0.115976920 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:42:08 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:42:08 np0005532585.localdomain python3.9[277253]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:08 np0005532585.localdomain sudo[277246]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:08 np0005532585.localdomain sudo[277370]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkuetbafhnwxpwcivdhxznfzxexsxdyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890928.5944226-2632-128168556817983/AnsiballZ_file.py
Nov 23 09:42:08 np0005532585.localdomain sudo[277370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:09 np0005532585.localdomain python3.9[277372]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:09 np0005532585.localdomain sudo[277370]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:42:09.276 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:42:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:42:09.277 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:42:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:42:09.278 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:42:09 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:09.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:10 np0005532585.localdomain sudo[277480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkhqnvdzuohcftupdifxrdnyjyqojezk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890929.1884565-2632-57563913055186/AnsiballZ_file.py
Nov 23 09:42:10 np0005532585.localdomain sudo[277480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:10 np0005532585.localdomain python3.9[277482]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:10 np0005532585.localdomain sudo[277480]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:11 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:11.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:42:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:42:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:42:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:42:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:42:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1"
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.329 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.331 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.331 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.331 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.369 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.370 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:42:12 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:12.719 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:42:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:42:13 np0005532585.localdomain podman[277500]: 2025-11-23 09:42:13.036647947 +0000 UTC m=+0.087441155 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.039 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.040 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.040 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.040 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:42:13 np0005532585.localdomain podman[277500]: 2025-11-23 09:42:13.049162074 +0000 UTC m=+0.099955272 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:42:13 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.827 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.843 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.844 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.845 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.845 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.865 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.865 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.866 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.866 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:42:13 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:13.866 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.333 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.416 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.416 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.616 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.619 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12125MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.619 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.620 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.683 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.684 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.684 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:42:14 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:14.728 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:42:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:15.187 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:42:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:15.194 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:42:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:15.208 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:42:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:15.210 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:42:15 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:15.211 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:42:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:16.207 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:16.207 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16619 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB1D5B0000000001030307) 
Nov 23 09:42:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:16.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:16 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:16.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:42:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:42:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:42:17 np0005532585.localdomain podman[277568]: 2025-11-23 09:42:17.021676142 +0000 UTC m=+0.073391541 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_id=edpm)
Nov 23 09:42:17 np0005532585.localdomain podman[277568]: 2025-11-23 09:42:17.037516934 +0000 UTC m=+0.089232333 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 23 09:42:17 np0005532585.localdomain podman[277567]: 2025-11-23 09:42:17.073163905 +0000 UTC m=+0.126808694 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:42:17 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:42:17 np0005532585.localdomain podman[277567]: 2025-11-23 09:42:17.168464009 +0000 UTC m=+0.222108778 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 23 09:42:17 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:42:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16620 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB21610000000001030307) 
Nov 23 09:42:17 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:17.372 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:17 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:17.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:17 np0005532585.localdomain sudo[277703]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grnbaqgifjkjltidrqlxarccewfxsvgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890937.51906-2957-260665995927345/AnsiballZ_getent.py
Nov 23 09:42:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3800 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB24200000000001030307) 
Nov 23 09:42:18 np0005532585.localdomain sudo[277703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:18 np0005532585.localdomain python3.9[277705]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 23 09:42:18 np0005532585.localdomain sudo[277703]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:18 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:18.712 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:42:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16621 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB29610000000001030307) 
Nov 23 09:42:19 np0005532585.localdomain sshd[277724]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:42:19 np0005532585.localdomain sshd[277724]: Accepted publickey for zuul from 192.168.122.30 port 36696 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:42:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:42:19 np0005532585.localdomain systemd-logind[761]: New session 60 of user zuul.
Nov 23 09:42:19 np0005532585.localdomain systemd[1]: Started Session 60 of User zuul.
Nov 23 09:42:19 np0005532585.localdomain sshd[277724]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:42:20 np0005532585.localdomain podman[277727]: 2025-11-23 09:42:20.023542462 +0000 UTC m=+0.087220258 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:42:20 np0005532585.localdomain podman[277727]: 2025-11-23 09:42:20.031179335 +0000 UTC m=+0.094857131 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:42:20 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:42:20 np0005532585.localdomain sshd[277733]: Received disconnect from 192.168.122.30 port 36696:11: disconnected by user
Nov 23 09:42:20 np0005532585.localdomain sshd[277733]: Disconnected from user zuul 192.168.122.30 port 36696
Nov 23 09:42:20 np0005532585.localdomain sshd[277724]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:42:20 np0005532585.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Nov 23 09:42:20 np0005532585.localdomain systemd-logind[761]: Session 60 logged out. Waiting for processes to exit.
Nov 23 09:42:20 np0005532585.localdomain systemd-logind[761]: Removed session 60.
Nov 23 09:42:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6847 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB2E220000000001030307) 
Nov 23 09:42:20 np0005532585.localdomain python3.9[277851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:21 np0005532585.localdomain sudo[277938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:42:21 np0005532585.localdomain sudo[277938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:42:21 np0005532585.localdomain sudo[277938]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:21 np0005532585.localdomain sudo[277956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:42:21 np0005532585.localdomain sudo[277956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:42:21 np0005532585.localdomain python3.9[277937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890940.2437236-3039-273523693651941/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:21 np0005532585.localdomain python3.9[278096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:21 np0005532585.localdomain sudo[277956]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:22 np0005532585.localdomain python3.9[278167]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:22.375 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:22.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:22.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:22.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:22.408 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:22 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:22.409 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:22 np0005532585.localdomain python3.9[278275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:23 np0005532585.localdomain python3.9[278361]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890942.4376733-3039-234281955856509/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16622 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB39200000000001030307) 
Nov 23 09:42:24 np0005532585.localdomain sshd[278433]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:42:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:42:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:42:25 np0005532585.localdomain podman[278473]: 2025-11-23 09:42:25.02861655 +0000 UTC m=+0.078505872 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:42:25 np0005532585.localdomain podman[278473]: 2025-11-23 09:42:25.038495663 +0000 UTC m=+0.088385005 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:42:25 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:42:25 np0005532585.localdomain python3.9[278471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:25 np0005532585.localdomain podman[278472]: 2025-11-23 09:42:25.092112834 +0000 UTC m=+0.142170151 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:42:25 np0005532585.localdomain sudo[278504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:42:25 np0005532585.localdomain sudo[278504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:42:25 np0005532585.localdomain sudo[278504]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:25 np0005532585.localdomain sshd[278433]: Invalid user support from 78.128.112.74 port 41940
Nov 23 09:42:25 np0005532585.localdomain podman[278472]: 2025-11-23 09:42:25.113484793 +0000 UTC m=+0.163542160 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:42:25 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:42:25 np0005532585.localdomain sshd[278433]: Connection closed by invalid user support 78.128.112.74 port 41940 [preauth]
Nov 23 09:42:25 np0005532585.localdomain python3.9[278613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890944.2910163-3039-96479055692771/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=c48862f04c3bb6bb101bc9efe68e434d3f83ed7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:26 np0005532585.localdomain python3.9[278721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:27 np0005532585.localdomain python3.9[278807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890945.7078178-3039-133532006022135/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:27.410 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:27.411 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:27.411 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:27.412 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:27.446 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:27 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:27.447 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:27 np0005532585.localdomain python3.9[278915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:28 np0005532585.localdomain python3.9[279001]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890947.465684-3039-234719898134482/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:29 np0005532585.localdomain sudo[279109]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckugdeijpefidymvihrliluizgeakfcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890948.7601745-3288-68378586048471/AnsiballZ_file.py
Nov 23 09:42:29 np0005532585.localdomain sudo[279109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:29 np0005532585.localdomain python3.9[279111]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:42:29 np0005532585.localdomain sudo[279109]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:29 np0005532585.localdomain sudo[279219]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfwzwjeqxqzjeraqolvhmqqqnvdmhywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890949.4616432-3311-154745796859640/AnsiballZ_copy.py
Nov 23 09:42:29 np0005532585.localdomain sudo[279219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:29 np0005532585.localdomain python3.9[279221]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:42:29 np0005532585.localdomain sudo[279219]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:42:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:42:30 np0005532585.localdomain sudo[279329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgllssasuoktowgdypthskxgdjtvygrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890950.2143419-3335-44952957254076/AnsiballZ_stat.py
Nov 23 09:42:30 np0005532585.localdomain sudo[279329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:30 np0005532585.localdomain python3.9[279331]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:30 np0005532585.localdomain sudo[279329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:31 np0005532585.localdomain sudo[279441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbvgciceuojlgvjfaenmurxdlofpoler ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890950.9647124-3362-120475527822773/AnsiballZ_file.py
Nov 23 09:42:31 np0005532585.localdomain sudo[279441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:31 np0005532585.localdomain python3.9[279443]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:42:31 np0005532585.localdomain sudo[279441]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16623 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB5A200000000001030307) 
Nov 23 09:42:32 np0005532585.localdomain python3.9[279551]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:32.448 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:32.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:32.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:32.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:32.480 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:32 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:32.482 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:33 np0005532585.localdomain python3.9[279661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:33 np0005532585.localdomain python3.9[279716]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:34 np0005532585.localdomain python3.9[279824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 09:42:34 np0005532585.localdomain python3.9[279879]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 09:42:35 np0005532585.localdomain sudo[279987]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqmgejetuqhnlwsffnrlypnbzkhquolz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890954.8732328-3491-30462084860349/AnsiballZ_container_config_data.py
Nov 23 09:42:35 np0005532585.localdomain sudo[279987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:35 np0005532585.localdomain python3.9[279989]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 23 09:42:35 np0005532585.localdomain sudo[279987]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:35 np0005532585.localdomain sudo[280097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjksztmvzvewyzafyypjwbvvxygnhyeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890955.6560616-3518-45752384918318/AnsiballZ_container_config_hash.py
Nov 23 09:42:35 np0005532585.localdomain sudo[280097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:36 np0005532585.localdomain python3.9[280099]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:42:36 np0005532585.localdomain sudo[280097]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:36 np0005532585.localdomain sudo[280207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-degmfubivbvdapfogbzyiznqzrusntan ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890956.6588867-3548-71614967770254/AnsiballZ_edpm_container_manage.py
Nov 23 09:42:36 np0005532585.localdomain sudo[280207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:37 np0005532585.localdomain python3[280209]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:42:37 np0005532585.localdomain python3[280209]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",
                                                                    "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:33:31.011385583Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211770748,
                                                                    "VirtualSize": 1211770748,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",
                                                                              "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",
                                                                              "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:19.349843192Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:59.347040136Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:02.744397841Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:22.134493628Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:23.375712978Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:39.628890315Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:54.615675342Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:59.80799783Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:31:53.511902352Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:30.270978852Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:30.703308349Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:31.008952634Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:31.009001445Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:37.508054621Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 23 09:42:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:37.483 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:37.484 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:37.485 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:37.485 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:37.517 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:37 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:37.517 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:37 np0005532585.localdomain sudo[280207]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:38 np0005532585.localdomain sudo[280383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqetihciubcwnlkyabgohdczhbvjbtrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890957.837838-3572-267352279173435/AnsiballZ_stat.py
Nov 23 09:42:38 np0005532585.localdomain sudo[280383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:42:38 np0005532585.localdomain podman[280386]: 2025-11-23 09:42:38.252557712 +0000 UTC m=+0.096423149 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:42:38 np0005532585.localdomain podman[280386]: 2025-11-23 09:42:38.285834298 +0000 UTC m=+0.129699765 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 23 09:42:38 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:42:38 np0005532585.localdomain python3.9[280385]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:38 np0005532585.localdomain sudo[280383]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:39 np0005532585.localdomain sudo[280514]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkrvysrrnfigyyguyfohwbrgkpxmswhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890959.5361216-3608-675639590930/AnsiballZ_container_config_data.py
Nov 23 09:42:39 np0005532585.localdomain sudo[280514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:40 np0005532585.localdomain python3.9[280516]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 23 09:42:40 np0005532585.localdomain sudo[280514]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:40 np0005532585.localdomain sudo[280624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpnebxfgrspgldleryrqdwodzfuadgby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890960.3767278-3635-105733973140898/AnsiballZ_container_config_hash.py
Nov 23 09:42:40 np0005532585.localdomain sudo[280624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:40 np0005532585.localdomain python3.9[280626]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 09:42:40 np0005532585.localdomain sudo[280624]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:42:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:42:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:42:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:42:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:42:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1"
Nov 23 09:42:42 np0005532585.localdomain sudo[280734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upebkeshkkugdckydntzhidpltdolyoy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763890961.8947158-3665-11824867476398/AnsiballZ_edpm_container_manage.py
Nov 23 09:42:42 np0005532585.localdomain sudo[280734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:42 np0005532585.localdomain python3[280736]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 09:42:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:42.518 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:42.520 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:42.520 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:42.521 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:42.581 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:42 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:42.582 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:42 np0005532585.localdomain python3[280736]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",
                                                                    "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-21T06:33:31.011385583Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251118",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211770748,
                                                                    "VirtualSize": 1211770748,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",
                                                                              "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",
                                                                              "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",
                                                                              "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",
                                                                              "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251118",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795434035Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:49.795512415Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251118\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-18T01:56:52.547242013Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947310748Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947327778Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947358359Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.947372589Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94738527Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:01.94739397Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:02.324930938Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:36.349393468Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:39.924297673Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.346524368Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:40.740246279Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.615556259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:41.938464234Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.336857046Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:42.748608395Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.15486404Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.54961678Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:43.896073438Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.223450469Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.48737197Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:44.877325535Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.314037239Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:45.685882303Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:46.113418417Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:49.690122703Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.044711291Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:50.455299838Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:52.102345748Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173930666Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.173989308Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174007728Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:54.174022289Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:10:55.520348349Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:19.349843192Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:13:59.347040136Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:14:02.744397841Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:22.134493628Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:23.375712978Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:19:39.628890315Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:54.615675342Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:20:59.80799783Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:31:53.511902352Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:7b76510d5d5adf2ccf627d29bb9dae76",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:30.270978852Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:30.703308349Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:31.008952634Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:31.009001445Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-21T06:33:37.508054621Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"7b76510d5d5adf2ccf627d29bb9dae76\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 23 09:42:42 np0005532585.localdomain sudo[280734]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:43 np0005532585.localdomain sudo[280907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txlnpdorwpwaheciutwwfqcwrqdhsofv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890963.0418324-3689-30254509744109/AnsiballZ_stat.py
Nov 23 09:42:43 np0005532585.localdomain sudo[280907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:42:43 np0005532585.localdomain podman[280910]: 2025-11-23 09:42:43.473424827 +0000 UTC m=+0.081006751 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:42:43 np0005532585.localdomain podman[280910]: 2025-11-23 09:42:43.483205377 +0000 UTC m=+0.090787301 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:42:43 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:42:43 np0005532585.localdomain python3.9[280909]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:43 np0005532585.localdomain sudo[280907]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:44 np0005532585.localdomain sudo[281043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lipwwsaypygycorplutbixbiqccnqbyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890963.8961117-3716-230328604756186/AnsiballZ_file.py
Nov 23 09:42:44 np0005532585.localdomain sudo[281043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:44 np0005532585.localdomain python3.9[281045]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:42:44 np0005532585.localdomain sudo[281043]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:44 np0005532585.localdomain sudo[281152]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvkqozdvpeuodlwippuawvlzygcjxjtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890964.4477117-3716-179323596515358/AnsiballZ_copy.py
Nov 23 09:42:44 np0005532585.localdomain sudo[281152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:45 np0005532585.localdomain python3.9[281154]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890964.4477117-3716-179323596515358/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:42:45 np0005532585.localdomain sudo[281152]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:45 np0005532585.localdomain sudo[281207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgsxjvwophaidnfdlkiqubkxvebcquez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890964.4477117-3716-179323596515358/AnsiballZ_systemd.py
Nov 23 09:42:45 np0005532585.localdomain sudo[281207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:45 np0005532585.localdomain python3.9[281209]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:42:45 np0005532585.localdomain sudo[281207]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35664 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB928B0000000001030307) 
Nov 23 09:42:47 np0005532585.localdomain python3.9[281319]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35665 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB96A10000000001030307) 
Nov 23 09:42:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:47.583 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:47.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:47.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:47.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:47.622 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:47 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:47.623 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:42:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:42:48 np0005532585.localdomain python3.9[281427]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:48 np0005532585.localdomain systemd[1]: tmp-crun.IFA7lD.mount: Deactivated successfully.
Nov 23 09:42:48 np0005532585.localdomain podman[281428]: 2025-11-23 09:42:48.012826211 +0000 UTC m=+0.074982870 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 09:42:48 np0005532585.localdomain podman[281428]: 2025-11-23 09:42:48.072395181 +0000 UTC m=+0.134551820 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 09:42:48 np0005532585.localdomain podman[281429]: 2025-11-23 09:42:48.080494787 +0000 UTC m=+0.139312590 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 23 09:42:48 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:42:48 np0005532585.localdomain podman[281429]: 2025-11-23 09:42:48.102359531 +0000 UTC m=+0.161177404 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9)
Nov 23 09:42:48 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:42:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16624 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB9A200000000001030307) 
Nov 23 09:42:49 np0005532585.localdomain python3.9[281580]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 09:42:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35666 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB9EA00000000001030307) 
Nov 23 09:42:50 np0005532585.localdomain sudo[281688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryzcvjggrdabbzqzoqaeqbcifwibnfmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890969.6392303-3884-203768741041005/AnsiballZ_podman_container.py
Nov 23 09:42:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:42:50 np0005532585.localdomain sudo[281688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:50 np0005532585.localdomain podman[281690]: 2025-11-23 09:42:50.291675623 +0000 UTC m=+0.082401536 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 23 09:42:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3801 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DBA2210000000001030307) 
Nov 23 09:42:50 np0005532585.localdomain podman[281690]: 2025-11-23 09:42:50.32436562 +0000 UTC m=+0.115091533 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:42:50 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:42:50 np0005532585.localdomain python3.9[281691]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 23 09:42:50 np0005532585.localdomain sudo[281688]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:50 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation.
Nov 23 09:42:50 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 09:42:50 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:42:50 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:42:51 np0005532585.localdomain sudo[281844]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhgqpedgpcdtzofupmrsqorzxrnrcuvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890970.9280038-3908-202882572937140/AnsiballZ_systemd.py
Nov 23 09:42:51 np0005532585.localdomain sudo[281844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:51 np0005532585.localdomain python3.9[281846]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 09:42:51 np0005532585.localdomain systemd[1]: Stopping nova_compute container...
Nov 23 09:42:51 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:51.642 230604 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Nov 23 09:42:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:52.624 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:52.662 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:42:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:52.662 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:42:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:52.663 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:52.664 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:42:52 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:52.665 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:42:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35667 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DBAE600000000001030307) 
Nov 23 09:42:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:42:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:42:55 np0005532585.localdomain podman[281862]: 2025-11-23 09:42:55.278119541 +0000 UTC m=+0.086285689 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:42:55 np0005532585.localdomain podman[281862]: 2025-11-23 09:42:55.291334139 +0000 UTC m=+0.099500317 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:42:55 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:42:55 np0005532585.localdomain systemd[1]: tmp-crun.4WIm1L.mount: Deactivated successfully.
Nov 23 09:42:55 np0005532585.localdomain podman[281863]: 2025-11-23 09:42:55.343210886 +0000 UTC m=+0.147604615 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:42:55 np0005532585.localdomain podman[281863]: 2025-11-23 09:42:55.357351244 +0000 UTC m=+0.161744963 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:42:55 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:42:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:56.347 230604 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 23 09:42:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:56.349 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:42:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:56.350 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:42:56 np0005532585.localdomain nova_compute[230600]: 2025-11-23 09:42:56.350 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:42:56 np0005532585.localdomain virtqemud[203731]: End of file while reading data: Input/output error
Nov 23 09:42:56 np0005532585.localdomain systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Deactivated successfully.
Nov 23 09:42:56 np0005532585.localdomain systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Consumed 20.521s CPU time.
Nov 23 09:42:56 np0005532585.localdomain podman[281850]: 2025-11-23 09:42:56.838290981 +0000 UTC m=+5.263057655 container died 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 23 09:42:56 np0005532585.localdomain systemd[1]: tmp-crun.yxR82I.mount: Deactivated successfully.
Nov 23 09:42:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295-userdata-shm.mount: Deactivated successfully.
Nov 23 09:42:56 np0005532585.localdomain podman[281850]: 2025-11-23 09:42:56.995648824 +0000 UTC m=+5.420415448 container cleanup 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 09:42:56 np0005532585.localdomain podman[281850]: nova_compute
Nov 23 09:42:57 np0005532585.localdomain podman[281935]: error opening file `/run/crun/2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295/status`: No such file or directory
Nov 23 09:42:57 np0005532585.localdomain podman[281922]: 2025-11-23 09:42:57.102317028 +0000 UTC m=+0.067078759 container cleanup 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute)
Nov 23 09:42:57 np0005532585.localdomain podman[281922]: nova_compute
Nov 23 09:42:57 np0005532585.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 23 09:42:57 np0005532585.localdomain systemd[1]: Stopped nova_compute container.
Nov 23 09:42:57 np0005532585.localdomain systemd[1]: Starting nova_compute container...
Nov 23 09:42:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:42:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:57 np0005532585.localdomain podman[281937]: 2025-11-23 09:42:57.256918963 +0000 UTC m=+0.123121637 container init 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:42:57 np0005532585.localdomain podman[281937]: 2025-11-23 09:42:57.264016799 +0000 UTC m=+0.130219463 container start 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:42:57 np0005532585.localdomain podman[281937]: nova_compute
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + sudo -E kolla_set_configs
Nov 23 09:42:57 np0005532585.localdomain systemd[1]: Started nova_compute container.
Nov 23 09:42:57 np0005532585.localdomain sudo[281844]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Validating config file
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying service configuration files
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /etc/ceph
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Creating directory /etc/ceph
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Writing out command to execute
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: ++ cat /run_command
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + CMD=nova-compute
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + ARGS=
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + sudo kolla_copy_cacerts
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + [[ ! -n '' ]]
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + . kolla_extend_start
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: Running command: 'nova-compute'
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + echo 'Running command: '\''nova-compute'\'''
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + umask 0022
Nov 23 09:42:57 np0005532585.localdomain nova_compute[281952]: + exec nova-compute
Nov 23 09:42:57 np0005532585.localdomain sudo[282071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqsuvriqmkwsbdpsvetnvfrbychdcsgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763890977.5523667-3935-34549627978628/AnsiballZ_podman_container.py
Nov 23 09:42:57 np0005532585.localdomain sudo[282071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 23 09:42:58 np0005532585.localdomain python3.9[282073]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 23 09:42:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope.
Nov 23 09:42:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:42:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 09:42:58 np0005532585.localdomain podman[282101]: 2025-11-23 09:42:58.339834181 +0000 UTC m=+0.128157017 container init 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 23 09:42:58 np0005532585.localdomain podman[282101]: 2025-11-23 09:42:58.353835096 +0000 UTC m=+0.142157922 container start 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:42:58 np0005532585.localdomain python3.9[282073]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Applying nova statedir ownership
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d already 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/console.log
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/f8def1b80727f8e5cc38a877010a5f81bbb3086d
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-f8def1b80727f8e5cc38a877010a5f81bbb3086d
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/4143dbbec5b08621aa3c8eb364f8a7d3e97604e18b7ed41c4bab0da11ed561fd
Nov 23 09:42:58 np0005532585.localdomain nova_compute_init[282121]: INFO:nova_statedir:Nova statedir ownership complete
Nov 23 09:42:58 np0005532585.localdomain systemd[1]: libpod-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully.
Nov 23 09:42:58 np0005532585.localdomain podman[282122]: 2025-11-23 09:42:58.431749148 +0000 UTC m=+0.052242859 container died 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 09:42:58 np0005532585.localdomain podman[282135]: 2025-11-23 09:42:58.498551106 +0000 UTC m=+0.063978600 container cleanup 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:42:58 np0005532585.localdomain systemd[1]: libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully.
Nov 23 09:42:58 np0005532585.localdomain sudo[282071]: pam_unix(sudo:session): session closed for user root
Nov 23 09:42:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully.
Nov 23 09:42:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428-userdata-shm.mount: Deactivated successfully.
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.064 281956 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.065 281956 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.065 281956 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.065 281956 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.182 281956 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.204 281956 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.205 281956 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 23 09:42:59 np0005532585.localdomain sshd[263687]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:42:59 np0005532585.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Nov 23 09:42:59 np0005532585.localdomain systemd[1]: session-59.scope: Consumed 1min 28.823s CPU time.
Nov 23 09:42:59 np0005532585.localdomain systemd-logind[761]: Session 59 logged out. Waiting for processes to exit.
Nov 23 09:42:59 np0005532585.localdomain systemd-logind[761]: Removed session 59.
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.632 281956 INFO nova.virt.driver [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.731 281956 INFO nova.compute.provider_config [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.737 281956 DEBUG oslo_concurrency.lockutils [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.737 281956 DEBUG oslo_concurrency.lockutils [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_concurrency.lockutils [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console_host                   = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] host                           = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.787 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.787 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.788 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.788 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.790 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.790 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.790 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.819 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.819 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.820 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.820 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.821 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.821 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.821 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.822 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.822 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.822 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.825 281956 WARNING oslo_config.cfg [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: and ``live_migration_inbound_addr`` respectively.
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: ).  Its value may be silently ignored in the future.
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_secret_uuid        = 46550e70-79cb-5f55-bf6d-1204b97e083b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.885 281956 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.898 281956 INFO nova.virt.node [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.899 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.899 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.900 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.900 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.910 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7faa639ce9d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.913 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7faa639ce9d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.914 281956 INFO nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Connection event '1' reason 'None'
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.919 281956 INFO nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host capabilities <capabilities>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <host>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <uuid>43895caf-e6c2-47af-84a5-6194e901da5c</uuid>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <cpu>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <arch>x86_64</arch>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model>EPYC-Rome-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <vendor>AMD</vendor>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <microcode version='16777317'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <signature family='23' model='49' stepping='0'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='x2apic'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='tsc-deadline'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='osxsave'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='hypervisor'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='tsc_adjust'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='spec-ctrl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='stibp'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='arch-capabilities'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='cmp_legacy'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='topoext'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='virt-ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='lbrv'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='tsc-scale'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='vmcb-clean'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='pause-filter'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='pfthreshold'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='svme-addr-chk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='rdctl-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='skip-l1dfl-vmentry'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='mds-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature name='pschange-mc-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <pages unit='KiB' size='4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <pages unit='KiB' size='2048'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <pages unit='KiB' size='1048576'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </cpu>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <power_management>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <suspend_mem/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <suspend_disk/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <suspend_hybrid/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </power_management>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <iommu support='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <migration_features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <live/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <uri_transports>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <uri_transport>tcp</uri_transport>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <uri_transport>rdma</uri_transport>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </uri_transports>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </migration_features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <topology>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <cells num='1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <cell id='0'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           <memory unit='KiB'>16116612</memory>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           <pages unit='KiB' size='2048'>0</pages>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           <distances>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <sibling id='0' value='10'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           </distances>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           <cpus num='8'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:           </cpus>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         </cell>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </cells>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </topology>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <cache>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </cache>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <secmodel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model>selinux</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <doi>0</doi>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </secmodel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <secmodel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model>dac</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <doi>0</doi>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </secmodel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </host>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <guest>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <os_type>hvm</os_type>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <arch name='i686'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <wordsize>32</wordsize>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <domain type='qemu'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <domain type='kvm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </arch>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <pae/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <nonpae/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <acpi default='on' toggle='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <apic default='on' toggle='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <cpuselection/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <deviceboot/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <disksnapshot default='on' toggle='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <externalSnapshot/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </guest>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <guest>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <os_type>hvm</os_type>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <arch name='x86_64'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <wordsize>64</wordsize>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <domain type='qemu'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <domain type='kvm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </arch>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <acpi default='on' toggle='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <apic default='on' toggle='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <cpuselection/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <deviceboot/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <disksnapshot default='on' toggle='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <externalSnapshot/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </guest>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: </capabilities>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.925 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.928 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: <domainCapabilities>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <domain>kvm</domain>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <arch>i686</arch>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <vcpu max='1024'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <iothreads supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <os supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <enum name='firmware'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <loader supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>rom</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>pflash</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='readonly'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>yes</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='secure'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </loader>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <cpu>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='maximum' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='maximumMigratable'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='host-model' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <vendor>AMD</vendor>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='x2apic'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='stibp'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='succor'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ibrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lbrv'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='custom' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Dhyana-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-128'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-256'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-512'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <memoryBacking supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <enum name='sourceType'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <value>file</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <value>anonymous</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <value>memfd</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </memoryBacking>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <disk supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='diskDevice'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>disk</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>cdrom</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>floppy</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>lun</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>fdc</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>sata</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <graphics supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>vnc</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>egl-headless</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </graphics>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <video supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='modelType'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>vga</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>cirrus</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>none</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>bochs</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>ramfb</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <hostdev supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='mode'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>subsystem</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='startupPolicy'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>mandatory</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>requisite</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>optional</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='subsysType'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>pci</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='capsType'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='pciBackend'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </hostdev>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <rng supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>random</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>egd</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <filesystem supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='driverType'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>path</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>handle</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>virtiofs</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </filesystem>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <tpm supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>tpm-tis</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>tpm-crb</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>emulator</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>external</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='backendVersion'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>2.0</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </tpm>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <redirdev supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </redirdev>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <channel supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </channel>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <crypto supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='model'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>qemu</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </crypto>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <interface supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='backendType'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>passt</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </interface>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <panic supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>isa</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>hyperv</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </panic>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <console supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>null</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>vc</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>dev</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>file</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>pipe</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>stdio</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>udp</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>tcp</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>qemu-vdagent</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </console>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <gic supported='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <genid supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <backingStoreInput supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <backup supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <async-teardown supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <ps2 supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <sev supported='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <sgx supported='no'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <hyperv supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='features'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>relaxed</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>vapic</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>spinlocks</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>vpindex</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>runtime</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>synic</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>stimer</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>reset</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>vendor_id</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>frequencies</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>reenlightenment</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>tlbflush</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>ipi</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>avic</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>emsr_bitmap</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>xmm_input</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <defaults>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <spinlocks>4095</spinlocks>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <stimer_direct>on</stimer_direct>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </defaults>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </hyperv>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <launchSecurity supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='sectype'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>tdx</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </launchSecurity>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: </domainCapabilities>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.931 281956 DEBUG nova.virt.libvirt.volume.mount [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.936 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]: <domainCapabilities>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <domain>kvm</domain>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <arch>i686</arch>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <vcpu max='240'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <iothreads supported='yes'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <os supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <enum name='firmware'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <loader supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:42:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>rom</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>pflash</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='readonly'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>yes</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='secure'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </loader>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:   <cpu>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='maximum' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <enum name='maximumMigratable'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='host-model' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <vendor>AMD</vendor>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='x2apic'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:42:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='stibp'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='succor'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ibrs'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lbrv'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:     <mode name='custom' supported='yes'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:42:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake'>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:42:59 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Dhyana-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-128'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-256'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-512'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <memoryBacking supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <enum name='sourceType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>file</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>anonymous</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>memfd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </memoryBacking>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <disk supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='diskDevice'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>disk</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>cdrom</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>floppy</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>lun</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ide</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>fdc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>sata</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <graphics supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vnc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>egl-headless</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </graphics>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <video supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='modelType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vga</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>cirrus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>none</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>bochs</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ramfb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <hostdev supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='mode'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>subsystem</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='startupPolicy'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>mandatory</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>requisite</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>optional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='subsysType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pci</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='capsType'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='pciBackend'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </hostdev>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <rng supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>random</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>egd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <filesystem supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='driverType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>path</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>handle</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtiofs</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </filesystem>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <tpm supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tpm-tis</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tpm-crb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>emulator</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>external</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendVersion'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>2.0</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </tpm>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <redirdev supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </redirdev>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <channel supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </channel>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <crypto supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>qemu</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </crypto>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <interface supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>passt</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </interface>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <panic supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>isa</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>hyperv</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </panic>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <console supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>null</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dev</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>file</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pipe</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>stdio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>udp</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tcp</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>qemu-vdagent</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </console>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <gic supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <genid supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <backingStoreInput supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <backup supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <async-teardown supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <ps2 supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <sev supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <sgx supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <hyperv supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='features'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>relaxed</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vapic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>spinlocks</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vpindex</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>runtime</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>synic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>stimer</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>reset</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vendor_id</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>frequencies</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>reenlightenment</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tlbflush</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ipi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>avic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>emsr_bitmap</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>xmm_input</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <defaults>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <spinlocks>4095</spinlocks>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <stimer_direct>on</stimer_direct>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </defaults>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </hyperv>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <launchSecurity supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='sectype'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tdx</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </launchSecurity>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: </domainCapabilities>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.978 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:42:59.983 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: <domainCapabilities>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <domain>kvm</domain>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <arch>x86_64</arch>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <vcpu max='1024'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <iothreads supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <os supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <enum name='firmware'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>efi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <loader supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>rom</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pflash</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='readonly'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>yes</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='secure'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>yes</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </loader>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <cpu>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='maximum' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='maximumMigratable'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='host-model' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <vendor>AMD</vendor>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='x2apic'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='stibp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ssbd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='succor'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lbrv'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='custom' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Dhyana-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-128'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-256'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-512'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <memoryBacking supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <enum name='sourceType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>file</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>anonymous</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>memfd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </memoryBacking>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <disk supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='diskDevice'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>disk</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>cdrom</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>floppy</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>lun</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>fdc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>sata</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <graphics supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vnc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>egl-headless</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </graphics>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <video supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='modelType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vga</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>cirrus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>none</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>bochs</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ramfb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <hostdev supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='mode'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>subsystem</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='startupPolicy'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>mandatory</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>requisite</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>optional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='subsysType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pci</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='capsType'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='pciBackend'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </hostdev>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <rng supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>random</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>egd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <filesystem supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='driverType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>path</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>handle</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtiofs</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </filesystem>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <tpm supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tpm-tis</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tpm-crb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>emulator</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>external</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendVersion'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>2.0</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </tpm>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <redirdev supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </redirdev>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <channel supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </channel>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <crypto supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>qemu</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </crypto>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <interface supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>passt</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </interface>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <panic supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>isa</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>hyperv</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </panic>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <console supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>null</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dev</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>file</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pipe</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>stdio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>udp</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tcp</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>qemu-vdagent</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </console>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <gic supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <genid supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <backingStoreInput supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <backup supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <async-teardown supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <ps2 supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <sev supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <sgx supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <hyperv supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='features'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>relaxed</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vapic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>spinlocks</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vpindex</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>runtime</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>synic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>stimer</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>reset</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vendor_id</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>frequencies</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>reenlightenment</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tlbflush</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ipi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>avic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>emsr_bitmap</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>xmm_input</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <defaults>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <spinlocks>4095</spinlocks>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <stimer_direct>on</stimer_direct>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </defaults>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </hyperv>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <launchSecurity supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='sectype'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tdx</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </launchSecurity>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: </domainCapabilities>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.044 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: <domainCapabilities>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <path>/usr/libexec/qemu-kvm</path>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <domain>kvm</domain>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <arch>x86_64</arch>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <vcpu max='240'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <iothreads supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <os supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <enum name='firmware'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <loader supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>rom</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pflash</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='readonly'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>yes</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='secure'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>no</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </loader>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <cpu>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='host-passthrough' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='hostPassthroughMigratable'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='maximum' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='maximumMigratable'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>on</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>off</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='host-model' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <vendor>AMD</vendor>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='x2apic'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-deadline'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='hypervisor'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc_adjust'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='spec-ctrl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='stibp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ssbd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='cmp_legacy'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='overflow-recov'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='succor'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='amd-ssbd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='virt-ssbd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lbrv'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='tsc-scale'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='vmcb-clean'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pause-filter'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='pfthreshold'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='svme-addr-chk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <feature policy='disable' name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <mode name='custom' supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Broadwell-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cascadelake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Cooperlake-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Denverton-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Dhyana-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Genoa-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='auto-ibrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Milan-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amd-psfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='no-nested-data-bp'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='null-sel-clr-base'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='stibp-always-on'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-Rome-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='EPYC-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='GraniteRapids-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-128'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-256'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx10-512'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='prefetchiti'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Haswell-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-noTSX'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v6'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Icelake-Server-v7'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='IvyBridge-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='KnightsMill-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4fmaps'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-4vnniw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512er'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512pf'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G4-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Opteron_G5-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fma4'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tbm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xop'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SapphireRapids-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='amx-tile'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-bf16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-fp16'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512-vpopcntdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bitalg'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vbmi2'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrc'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fzrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='la57'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='taa-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='tsx-ldtrk'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xfd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='SierraForest-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ifma'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-ne-convert'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx-vnni-int8'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='bus-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cmpccxadd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fbsdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='fsrs'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ibrs-all'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mcdt-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pbrsb-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='psdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='sbdr-ssdp-no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='serialize'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vaes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='vpclmulqdq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Client-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='hle'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='rtm'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Skylake-Server-v5'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512bw'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512cd'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512dq'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512f'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='avx512vl'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='invpcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pcid'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='pku'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='mpx'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v2'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v3'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='core-capability'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='split-lock-detect'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='Snowridge-v4'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='cldemote'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='erms'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='gfni'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdir64b'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='movdiri'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='xsaves'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='athlon-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='core2duo-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='coreduo-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='n270-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='ss'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <blockers model='phenom-v1'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnow'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <feature name='3dnowext'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </blockers>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </mode>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <memoryBacking supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <enum name='sourceType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>file</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>anonymous</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <value>memfd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </memoryBacking>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <disk supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='diskDevice'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>disk</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>cdrom</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>floppy</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>lun</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ide</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>fdc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>sata</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <graphics supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vnc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>egl-headless</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </graphics>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <video supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='modelType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vga</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>cirrus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>none</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>bochs</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ramfb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <hostdev supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='mode'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>subsystem</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='startupPolicy'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>mandatory</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>requisite</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>optional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='subsysType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pci</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>scsi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='capsType'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='pciBackend'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </hostdev>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <rng supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtio-non-transitional</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>random</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>egd</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <filesystem supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='driverType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>path</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>handle</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>virtiofs</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </filesystem>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <tpm supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tpm-tis</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tpm-crb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>emulator</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>external</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendVersion'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>2.0</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </tpm>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <redirdev supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='bus'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>usb</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </redirdev>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <channel supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </channel>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <crypto supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>qemu</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendModel'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>builtin</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </crypto>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <interface supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='backendType'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>default</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>passt</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </interface>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <panic supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='model'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>isa</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>hyperv</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </panic>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <console supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='type'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>null</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vc</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pty</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dev</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>file</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>pipe</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>stdio</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>udp</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tcp</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>unix</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>qemu-vdagent</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>dbus</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </console>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <gic supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <genid supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <backingStoreInput supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <backup supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <async-teardown supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <ps2 supported='yes'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <sev supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <sgx supported='no'/>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <hyperv supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='features'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>relaxed</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vapic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>spinlocks</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vpindex</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>runtime</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>synic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>stimer</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>reset</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>vendor_id</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>frequencies</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>reenlightenment</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tlbflush</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>ipi</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>avic</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>emsr_bitmap</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>xmm_input</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <defaults>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <spinlocks>4095</spinlocks>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <stimer_direct>on</stimer_direct>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <tlbflush_direct>off</tlbflush_direct>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <tlbflush_extended>off</tlbflush_extended>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </defaults>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </hyperv>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     <launchSecurity supported='yes'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       <enum name='sectype'>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:         <value>tdx</value>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:       </enum>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:     </launchSecurity>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: </domainCapabilities>
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.100 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.100 281956 INFO nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Secure Boot support detected
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.103 281956 INFO nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.103 281956 INFO nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.113 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.165 281956 INFO nova.virt.node [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.185 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Verified node dae70d62-10f4-474c-9782-8c926a3641d5 matches my host np0005532585.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.221 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.226 281956 DEBUG nova.virt.libvirt.vif [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005532585.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T08:25:43Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.226 281956 DEBUG nova.network.os_vif_util [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.227 281956 DEBUG nova.network.os_vif_util [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.227 281956 DEBUG os_vif [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.256 281956 DEBUG ovsdbapp.backend.ovs_idl [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.258 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.262 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.273 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.273 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.274 281956 INFO oslo.privsep.daemon [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpkwoipk_5/privsep.sock']
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.909 281956 INFO oslo.privsep.daemon [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.793 282209 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.796 282209 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.798 282209 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 23 09:43:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:00.798 282209 INFO oslo.privsep.daemon [-] privsep daemon running as pid 282209
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.173 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.174 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.175 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.176 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.177 281956 INFO os_vif [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.178 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.182 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.182 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.332 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.332 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.333 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.333 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.334 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:43:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35668 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DBCE210000000001030307) 
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.794 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.854 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:43:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:01.855 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.054 281956 WARNING nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.055 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12139MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.056 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.056 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.211 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.212 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.212 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.262 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.283 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.284 281956 DEBUG nova.compute.provider_tree [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.299 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.326 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.367 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.696 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.815 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.821 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.821 281956 INFO nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] kernel doesn't support AMD SEV
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.823 281956 DEBUG nova.compute.provider_tree [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.823 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.846 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.873 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.874 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.874 281956 DEBUG nova.service [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.916 281956 DEBUG nova.service [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 23 09:43:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:02.917 281956 DEBUG nova.servicegroup.drivers.db [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] DB_Driver: join new ServiceGroup member np0005532585.localdomain to the compute group, service = <Service: host=np0005532585.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 23 09:43:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:05.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:07.697 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:43:09 np0005532585.localdomain podman[282257]: 2025-11-23 09:43:09.023694281 +0000 UTC m=+0.078549543 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:43:09 np0005532585.localdomain podman[282257]: 2025-11-23 09:43:09.059395074 +0000 UTC m=+0.114250336 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:43:09 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:43:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:09.278 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:09.279 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:10.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.805 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4361a4ac-d1f6-4808-ab2b-4597871f5210', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.806740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2d2f1c6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'a269661f0ee2858a54e0ecbf8ebc0b6d90876bb1274e52f94f0dc3a9b6774648'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.806740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2d306a2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '4a000ddad1e31ce03a61bda0127cbe512825519da9ec936d7fd3ba08a4dbfd13'}]}, 'timestamp': '2025-11-23 09:43:10.847203', '_unique_id': '93806cc4d323418da332008aae43b952'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ce40652-eb4e-4642-a9ed-696d8f1a6da0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.850752', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2d5a6b4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': 'c01688a41477b1e96ff18ebe34cfa5835994b204ec1b291b52d9ffe7ad9b8162'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.850752', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2d5b802-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': 'be1bf89ce744fa444230ea5a61209d1541c77989c16a95979e45cc4ec73df90f'}]}, 'timestamp': '2025-11-23 09:43:10.864796', '_unique_id': '4d7fc9067855439e8a34e8b8f428c882'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5323a5f8-32da-45b9-aea8-92cb3bc66b7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.867198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2d6272e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'cdab8db7192a6e45e1c4ba9b6a17d83bb4aa195272a798eca95fbefeff365a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.867198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2d6375a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'b6f3c09d9025af838fa4cc1729c28f2c48b61ac5ed9049177a04dc67065296eb'}]}, 'timestamp': '2025-11-23 09:43:10.868092', '_unique_id': '2fb610ffd94040339e486f7e02fefc17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '272dce7d-8c2e-4708-8e41-1af8e80374e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:43:10.870277', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd2d9f156-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.069545347, 'message_signature': '494d1bb3299b32c549bc1b7410190ceda1000d5969e7bffce4397d35c2cf4612'}]}, 'timestamp': '2025-11-23 09:43:10.892489', '_unique_id': '867b7e2eeefe4a9997295558410dea44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90a23bf2-ddac-4254-abbe-2f22e4d1a6af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.894807', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dad60c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '313a0f7138823e7fd065b1224341842ee42ca6543e2f1384f4461aa89ed1643f'}]}, 'timestamp': '2025-11-23 09:43:10.898361', '_unique_id': 'da0e8d81d13d40e7833f15ab6c17c145'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44a9fec5-3d73-4e83-b7c7-bec1174bcc7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.903743', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dbbeb4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '7d7f2cdcce7f17e31be264720b0b22afa331205aecbe8628d1da339502acd227'}]}, 'timestamp': '2025-11-23 09:43:10.904377', '_unique_id': 'bc37623302a54bfdbdccac1f76ddb6ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20595a1c-f0ec-4137-895d-102d7e4086b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.906699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2dc2f34-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '291a74733e966047c4bae18d39f4dd21c92d92872bd022bc04cb6933fb0dcbb9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.906699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2dc3f42-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'ef66d200807c30836e79e3eca407f6147c535ce19285bdce44eed4ace0b467e8'}]}, 'timestamp': '2025-11-23 09:43:10.907572', '_unique_id': 'e3786650428f4b7cb972fb06608b98f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f50885d6-8ae2-4ec9-97a6-61893f31a241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.909740', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dca61c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '3496f07dc720f06479f3c3d06dd6fe9501190f49447482adc6954ccf8e848624'}]}, 'timestamp': '2025-11-23 09:43:10.910237', '_unique_id': '398163b1296546d88f0abc3c79f60500'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eeeaf314-8038-485a-968a-51e26589aabf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.912314', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dd092c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '16b5e4eb609294ac5c3c39f440ffaed9dcd395d7c883215ad23a8cac993635f3'}]}, 'timestamp': '2025-11-23 09:43:10.912770', '_unique_id': 'c54b75cf934648da9490e1e363d8048c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97cdaf7a-631d-4de3-940f-d033a0f16b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.915028', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dd7344-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '76de5e55d3b3427913a93fc019c6bd46be519df415c92aeb1f00f81aa83565cf'}]}, 'timestamp': '2025-11-23 09:43:10.915485', '_unique_id': '40eca1e0810248c0984cbba9d6480ba9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '237cb085-3816-4101-b36b-706ec51dfde9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.917603', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2ddd7ee-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': 'd012e57d5372642e75ec7578658c74393de784723bc671270d29504af0fe0e90'}]}, 'timestamp': '2025-11-23 09:43:10.918093', '_unique_id': '5262ff0f656b42bea01b0105444073eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 9228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca8633a6-99c2-4a2c-a176-a5ba6dd83f36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9228, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.920416', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2de45c6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': 'c4eb61756bc0e4fda8ad358fa7eac580a38f8e97befb0e00df8b12b789d6f051'}]}, 'timestamp': '2025-11-23 09:43:10.920875', '_unique_id': 'fe3794814c7247b38340bd8cad77a393'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7751415b-e7af-4579-85cd-ef43e85c8de6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.922982', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dea9f8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '3101fcb5f752857e882f2d31eb6f2f18e9acd20e6bbd42dd8cc70115486ab9d3'}]}, 'timestamp': '2025-11-23 09:43:10.923438', '_unique_id': 'aafeda26c2ef4e8fae29711450922fbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc00876-a49a-4221-aec8-1606a1d5891d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.925679', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2df144c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': '92a3f9a0a2533f8f43d3e597fa72f12aab4a7ec68b7bab28857149f2f23798da'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.925679', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2df2464-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': '1dbe35117817771416e3219350573833d95ca127e9c9a306bf8df05b15009f8b'}]}, 'timestamp': '2025-11-23 09:43:10.926548', '_unique_id': 'b3d3e100ca7941f18b944b3115c0053d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e81be2e8-871a-44d0-a2e4-47e41b763b49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.928689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2df8a58-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'b97fe498954d841c1f993b8fb20c9f0e1d18f34e293b93f02cb60851a351dac0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.928689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2df9a66-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'e7d14cd4daa47b2abe0f5476cb8bffad0888ec6f54a42b1819bf40603ccb71dd'}]}, 'timestamp': '2025-11-23 09:43:10.929566', '_unique_id': '3095d66a4e51402b883fbf8f0d722d8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '669abfae-1a43-4114-bcbe-aa7bfdd9cedc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.931660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2dffcb8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '2220bd0d26ccb22ad510ef1a94f8d67e9b7e341488af15fe9428de7b9394d18c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.931660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2e00e10-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'd943988a2479e0e01eb96ff0a8d10cf838ad0d62bf09ec02dbf7b424da3b5a67'}]}, 'timestamp': '2025-11-23 09:43:10.932524', '_unique_id': '22093e80f134447aaff8984f3c86b0aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f65cc3b-bf46-44db-8edd-577eadee27e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.934617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2e0703a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '2b841e6bba26f330c6360cb810b3c85ef55d99d08dacc21f363d9a39c927683b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.934617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2e0814c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '0adc945062d8d5ed83099f1fb10c5af702046d78ed8f97672772b7ec40a7a42f'}]}, 'timestamp': '2025-11-23 09:43:10.935476', '_unique_id': '3c4d4e67581644a68d42d8aca6bc0448'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4fb7379-5f59-4ab1-ac1c-01d96d862f79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.937582', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2e0e100-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': 'f1fae702f9f8258a5c5463b89bddec1adc9ba8b816bcfe74064a84c7631ac91b'}]}, 'timestamp': '2025-11-23 09:43:10.937869', '_unique_id': 'c8254b63173a497d8d0913201c07264e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.939 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3d7e32f-a1cb-42e5-b896-a81a66713352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.939312', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2e1248a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '41d0cf7b07a3a96b0a5f47f98fc7ea611da666956314e033dd7093e2267f5e6e'}]}, 'timestamp': '2025-11-23 09:43:10.939599', '_unique_id': 'af4bef20eb644841ae5f6c5f652029ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 56830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49dafec8-6a67-4859-aee0-235dfcdcc8b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56830000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:43:10.941060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd2e16a3a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.069545347, 'message_signature': '80ef20397d813304505d806fefe6f7598282fbf459f8c598e72fc273c6e9d78c'}]}, 'timestamp': '2025-11-23 09:43:10.941376', '_unique_id': 'a86bceec592541d9b03d3aad745ca634'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.942 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0462217f-29ee-4ef2-88b8-668963483d1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.942787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2e1acf2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': '0342ff63d73a69c3e8201d32315202878cfabe7d974ab68cf706350b3f24211a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.942787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2e1b6fc-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': 'ca8359ca883818e3440fc68c4dd49f36ac9ec4e4d427dfbc09c5fd093824ac8a'}]}, 'timestamp': '2025-11-23 09:43:10.943330', '_unique_id': '3f4aa4aff7c144cf93817fe09bd28c12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:43:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:43:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:43:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:43:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:43:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1"
Nov 23 09:43:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:43:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1"
Nov 23 09:43:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:12.700 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:43:14 np0005532585.localdomain systemd[1]: tmp-crun.ZcvpVX.mount: Deactivated successfully.
Nov 23 09:43:14 np0005532585.localdomain podman[282275]: 2025-11-23 09:43:14.013460124 +0000 UTC m=+0.067216544 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:43:14 np0005532585.localdomain podman[282275]: 2025-11-23 09:43:14.021550421 +0000 UTC m=+0.075306871 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:43:14 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:43:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:15.265 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61215 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC07BB0000000001030307) 
Nov 23 09:43:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61216 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC0BE00000000001030307) 
Nov 23 09:43:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:17.702 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35669 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC0E200000000001030307) 
Nov 23 09:43:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:43:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:43:19 np0005532585.localdomain podman[282298]: 2025-11-23 09:43:19.0263548 +0000 UTC m=+0.081836617 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true)
Nov 23 09:43:19 np0005532585.localdomain podman[282299]: 2025-11-23 09:43:19.060989249 +0000 UTC m=+0.118596414 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:43:19 np0005532585.localdomain podman[282298]: 2025-11-23 09:43:19.06447521 +0000 UTC m=+0.119956997 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:43:19 np0005532585.localdomain podman[282299]: 2025-11-23 09:43:19.075002583 +0000 UTC m=+0.132609848 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 23 09:43:19 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:43:19 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:43:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61217 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC13E00000000001030307) 
Nov 23 09:43:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:20.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16625 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC18200000000001030307) 
Nov 23 09:43:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:43:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:21.019 281956 DEBUG nova.compute.manager [None req-3fb1259b-0712-46f4-b2a6-c039d9b9743a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:21.025 281956 INFO nova.compute.manager [None req-3fb1259b-0712-46f4-b2a6-c039d9b9743a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Retrieving diagnostics
Nov 23 09:43:21 np0005532585.localdomain systemd[1]: tmp-crun.0nbALV.mount: Deactivated successfully.
Nov 23 09:43:21 np0005532585.localdomain podman[282342]: 2025-11-23 09:43:21.039504513 +0000 UTC m=+0.096502153 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:43:21 np0005532585.localdomain podman[282342]: 2025-11-23 09:43:21.045217135 +0000 UTC m=+0.102214775 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:43:21 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:43:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:22.705 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61218 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC23A00000000001030307) 
Nov 23 09:43:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:25.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:25 np0005532585.localdomain sudo[282360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:43:25 np0005532585.localdomain sudo[282360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:43:25 np0005532585.localdomain sudo[282360]: pam_unix(sudo:session): session closed for user root
Nov 23 09:43:25 np0005532585.localdomain sudo[282378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:43:25 np0005532585.localdomain sudo[282378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:43:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:43:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:43:25 np0005532585.localdomain systemd[1]: tmp-crun.p0IRj8.mount: Deactivated successfully.
Nov 23 09:43:25 np0005532585.localdomain podman[282396]: 2025-11-23 09:43:25.527170475 +0000 UTC m=+0.094022504 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:43:25 np0005532585.localdomain podman[282397]: 2025-11-23 09:43:25.572959588 +0000 UTC m=+0.137732771 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:43:25 np0005532585.localdomain podman[282397]: 2025-11-23 09:43:25.585269398 +0000 UTC m=+0.150042661 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:43:25 np0005532585.localdomain podman[282396]: 2025-11-23 09:43:25.592758046 +0000 UTC m=+0.159610125 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:43:25 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:43:25 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:43:26 np0005532585.localdomain sudo[282378]: pam_unix(sudo:session): session closed for user root
Nov 23 09:43:26 np0005532585.localdomain sudo[282469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:43:26 np0005532585.localdomain sudo[282469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:43:26 np0005532585.localdomain sudo[282469]: pam_unix(sudo:session): session closed for user root
Nov 23 09:43:26 np0005532585.localdomain sudo[282487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 09:43:26 np0005532585.localdomain sudo[282487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:43:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:26.950 281956 DEBUG oslo_concurrency.lockutils [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:26.950 281956 DEBUG oslo_concurrency.lockutils [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:26.951 281956 DEBUG nova.compute.manager [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:26.958 281956 DEBUG nova.compute.manager [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Nov 23 09:43:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:26.963 281956 DEBUG nova.objects.instance [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'flavor' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:43:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:27.010 281956 DEBUG nova.virt.libvirt.driver [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 2025-11-23 09:43:27.094063579 +0000 UTC m=+0.082529200 container create f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:43:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b.scope.
Nov 23 09:43:27 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 2025-11-23 09:43:27.060789703 +0000 UTC m=+0.049255354 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 2025-11-23 09:43:27.165317359 +0000 UTC m=+0.153782970 container init f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, distribution-scope=public, release=553, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 2025-11-23 09:43:27.173569241 +0000 UTC m=+0.162034862 container start f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc.)
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 2025-11-23 09:43:27.173806178 +0000 UTC m=+0.162271829 container attach f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:43:27 np0005532585.localdomain interesting_bell[282563]: 167 167
Nov 23 09:43:27 np0005532585.localdomain systemd[1]: libpod-f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b.scope: Deactivated successfully.
Nov 23 09:43:27 np0005532585.localdomain podman[282547]: 2025-11-23 09:43:27.178830088 +0000 UTC m=+0.167295699 container died f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, version=7, release=553, GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55)
Nov 23 09:43:27 np0005532585.localdomain podman[282568]: 2025-11-23 09:43:27.279965817 +0000 UTC m=+0.089483120 container remove f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, vcs-type=git)
Nov 23 09:43:27 np0005532585.localdomain systemd[1]: libpod-conmon-f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b.scope: Deactivated successfully.
Nov 23 09:43:27 np0005532585.localdomain podman[282588]: 
Nov 23 09:43:27 np0005532585.localdomain podman[282588]: 2025-11-23 09:43:27.476879175 +0000 UTC m=+0.070956343 container create c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Nov 23 09:43:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope.
Nov 23 09:43:27 np0005532585.localdomain podman[282588]: 2025-11-23 09:43:27.440760679 +0000 UTC m=+0.034837907 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:43:27 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:43:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 09:43:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 09:43:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:43:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 09:43:27 np0005532585.localdomain podman[282588]: 2025-11-23 09:43:27.557606395 +0000 UTC m=+0.151683533 container init c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:43:27 np0005532585.localdomain podman[282588]: 2025-11-23 09:43:27.566592461 +0000 UTC m=+0.160669589 container start c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, name=rhceph, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph)
Nov 23 09:43:27 np0005532585.localdomain podman[282588]: 2025-11-23 09:43:27.566718445 +0000 UTC m=+0.160795573 container attach c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-type=git, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:43:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:27.745 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: tmp-crun.wL4Dfp.mount: Deactivated successfully.
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-824838b0f865a8cf79c7f6d113ee63e6ec310f4f437e24dd14bd36cff738817e-merged.mount: Deactivated successfully.
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]: [
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:     {
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "available": false,
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "ceph_device": false,
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "lsm_data": {},
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "lvs": [],
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "path": "/dev/sr0",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "rejected_reasons": [
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "Has a FileSystem",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "Insufficient space (<5GB)"
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         ],
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         "sys_api": {
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "actuators": null,
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "device_nodes": "sr0",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "human_readable_size": "482.00 KB",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "id_bus": "ata",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "model": "QEMU DVD-ROM",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "nr_requests": "2",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "partitions": {},
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "path": "/dev/sr0",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "removable": "1",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "rev": "2.5+",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "ro": "0",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "rotational": "1",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "sas_address": "",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "sas_device_handle": "",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "scheduler_mode": "mq-deadline",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "sectors": 0,
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "sectorsize": "2048",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "size": 493568.0,
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "support_discard": "0",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "type": "disk",
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:             "vendor": "QEMU"
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:         }
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]:     }
Nov 23 09:43:28 np0005532585.localdomain reverent_lovelace[282603]: ]
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: libpod-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope: Deactivated successfully.
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: libpod-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope: Consumed 1.111s CPU time.
Nov 23 09:43:28 np0005532585.localdomain podman[282588]: 2025-11-23 09:43:28.638953714 +0000 UTC m=+1.233030882 container died c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.33.12, name=rhceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph)
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: tmp-crun.rtO6lp.mount: Deactivated successfully.
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820-merged.mount: Deactivated successfully.
Nov 23 09:43:28 np0005532585.localdomain podman[284596]: 2025-11-23 09:43:28.726258534 +0000 UTC m=+0.076785787 container remove c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, io.buildah.version=1.33.12, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Nov 23 09:43:28 np0005532585.localdomain systemd[1]: libpod-conmon-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope: Deactivated successfully.
Nov 23 09:43:28 np0005532585.localdomain sudo[282487]: pam_unix(sudo:session): session closed for user root
Nov 23 09:43:29 np0005532585.localdomain sudo[284609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:43:29 np0005532585.localdomain sudo[284609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:43:29 np0005532585.localdomain sudo[284609]: pam_unix(sudo:session): session closed for user root
Nov 23 09:43:29 np0005532585.localdomain kernel: device tapd3912d14-a3 left promiscuous mode
Nov 23 09:43:29 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891009.5415] device (tapd3912d14-a3): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00049|binding|INFO|Releasing lport d3912d14-a3e0-4df9-b811-f3bd90f44559 from this chassis (sb_readonly=0)
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00050|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 down in Southbound
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00051|binding|INFO|Removing iface tapd3912d14-a3 ovn-installed in OVS
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.559 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 23 09:43:29 np0005532585.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 58.825s CPU time.
Nov 23 09:43:29 np0005532585.localdomain systemd-machined[84275]: Machine qemu-1-instance-00000002 terminated.
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.570 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.572 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:aa:3b 192.168.0.77'], port_security=['fa:16:3e:cf:aa:3b 192.168.0.77'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.77/24', 'neutron:device_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005532585.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '1915d3e5d4254231a0517e2dcf35848f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4fe931b0-155c-49e2-b5a5-44d74fa72e9e 6afcee2e-50ee-4b3c-9d1f-24ea7a5a850b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ca41f3-3e94-4959-b1b5-1e81bd2c9bc1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d3912d14-a3e0-4df9-b811-f3bd90f44559) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.573 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d3912d14-a3e0-4df9-b811-f3bd90f44559 in datapath bcac49fc-c589-475a-91a8-00a0ba9c2b33 unbound from our chassis
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.574 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcac49fc-c589-475a-91a8-00a0ba9c2b33, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-c237ed-0
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-49b8a0-0
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00054|ovn_bfd|INFO|Disabled BFD on interface ovn-b7d5b3-0
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.578 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00055|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.587 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa74992-c96b-4a6b-a9d0-f2713d799b6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.588 160439 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 namespace which is not needed anymore
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.616 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:29Z|00056|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.624 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain systemd[1]: libpod-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799.scope: Deactivated successfully.
Nov 23 09:43:29 np0005532585.localdomain podman[284660]: 2025-11-23 09:43:29.756156139 +0000 UTC m=+0.061684268 container died 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 09:43:29 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891009.7709] manager: (tapd3912d14-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/15)
Nov 23 09:43:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799-userdata-shm.mount: Deactivated successfully.
Nov 23 09:43:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-086324888e3aef5fa52615eb760fec66e6f8cc0743a0416438a6f968d544d4e3-merged.mount: Deactivated successfully.
Nov 23 09:43:29 np0005532585.localdomain podman[284660]: 2025-11-23 09:43:29.895769959 +0000 UTC m=+0.201297978 container cleanup 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 09:43:29 np0005532585.localdomain podman[284674]: 2025-11-23 09:43:29.90837378 +0000 UTC m=+0.140128827 container cleanup 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 09:43:29 np0005532585.localdomain systemd[1]: libpod-conmon-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799.scope: Deactivated successfully.
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.919 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.946 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.947 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.948 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:43:29 np0005532585.localdomain podman[284702]: 2025-11-23 09:43:29.984372131 +0000 UTC m=+0.058138956 container remove 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:43:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.993 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e176f012-1286-4a0e-80f6-5a1e970efbe1]: (4, ('Sun Nov 23 09:43:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 (6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799)\n6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799\nSun Nov 23 09:43:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 (6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799)\n6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.995 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[594b1121-f757-4c13-ae31-1ce59239cc7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:29.996 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcac49fc-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:43:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:29.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:29 np0005532585.localdomain kernel: device tapbcac49fc-c0 left promiscuous mode
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.011 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.015 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[faf44cef-00e9-46d1-82dd-49279abfae56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.030 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[36c0fc65-c876-41c4-93fa-f1ce7c339b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.031 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[47f6f33d-f049-4c6d-b2e9-4ab181c93a42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.031 281956 INFO nova.virt.libvirt.driver [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance shutdown successfully after 3 seconds.
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.039 281956 INFO nova.virt.libvirt.driver [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance destroyed successfully.
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.039 281956 DEBUG nova.objects.instance [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'numa_topology' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.044 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c49a33-6f16-454b-acfc-708faa0fb3cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628638, 'reachable_time': 28165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284722, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.054 281956 DEBUG nova.compute.manager [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.058 160573 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.058 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d37ffa-cdeb-432d-8530-503ee71aea89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.149 281956 DEBUG oslo_concurrency.lockutils [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.151 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.151 281956 INFO nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] During sync_power_state the instance has a pending task (powering-off). Skip.
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.152 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.228 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:43:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:30.229 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.246 281956 DEBUG nova.compute.manager [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-unplugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.247 281956 DEBUG oslo_concurrency.lockutils [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.247 281956 DEBUG oslo_concurrency.lockutils [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.248 281956 DEBUG oslo_concurrency.lockutils [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.249 281956 DEBUG nova.compute.manager [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-unplugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.249 281956 WARNING nova.compute.manager [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-unplugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state stopped and task_state None.
Nov 23 09:43:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:30.271 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:30 np0005532585.localdomain systemd[1]: run-netns-ovnmeta\x2dbcac49fc\x2dc589\x2d475a\x2d91a8\x2d00a0ba9c2b33.mount: Deactivated successfully.
Nov 23 09:43:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61219 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC44200000000001030307) 
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.307 281956 DEBUG nova.compute.manager [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.308 281956 DEBUG oslo_concurrency.lockutils [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.309 281956 DEBUG oslo_concurrency.lockutils [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.309 281956 DEBUG oslo_concurrency.lockutils [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.310 281956 DEBUG nova.compute.manager [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.311 281956 WARNING nova.compute.manager [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state stopped and task_state None.
Nov 23 09:43:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:32.792 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.164 281956 DEBUG nova.compute.manager [None req-d184e85c-1645-4f7d-8584-cc69d8ebefd8 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server [None req-d184e85c-1645-4f7d-8584-cc69d8ebefd8 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     raise self.value
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     raise self.value
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 23 09:43:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server 
Nov 23 09:43:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:43:33.231 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:43:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:35.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:37.830 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:38 np0005532585.localdomain sshd[284725]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:43:39 np0005532585.localdomain sshd[284725]: Received disconnect from 207.154.194.2 port 33364:11: Bye Bye [preauth]
Nov 23 09:43:39 np0005532585.localdomain sshd[284725]: Disconnected from authenticating user root 207.154.194.2 port 33364 [preauth]
Nov 23 09:43:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:43:39 np0005532585.localdomain systemd[1]: tmp-crun.tRgasO.mount: Deactivated successfully.
Nov 23 09:43:39 np0005532585.localdomain podman[284727]: 2025-11-23 09:43:39.324158978 +0000 UTC m=+0.100223221 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:43:39 np0005532585.localdomain podman[284727]: 2025-11-23 09:43:39.334460715 +0000 UTC m=+0.110524948 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:43:39 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:43:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:40.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:43:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:43:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:43:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146336 "" "Go-http-client/1.1"
Nov 23 09:43:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:43:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16751 "" "Go-http-client/1.1"
Nov 23 09:43:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:42.875 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:44.794 281956 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763891009.784839, 355032bc-9946-4f6d-817c-2bfc8694d41d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:43:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:44.795 281956 INFO nova.compute.manager [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] VM Stopped (Lifecycle Event)
Nov 23 09:43:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:44.823 281956 DEBUG nova.compute.manager [None req-f4f1784e-c02a-443d-89f6-786bfa75742a - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:44.827 281956 DEBUG nova.compute.manager [None req-f4f1784e-c02a-443d-89f6-786bfa75742a - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:43:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:43:45 np0005532585.localdomain podman[284748]: 2025-11-23 09:43:45.031651992 +0000 UTC m=+0.086176726 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:43:45 np0005532585.localdomain podman[284748]: 2025-11-23 09:43:45.041246626 +0000 UTC m=+0.095771340 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:43:45 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:43:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:45.277 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23202 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC7CEB0000000001030307) 
Nov 23 09:43:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23203 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC80E00000000001030307) 
Nov 23 09:43:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:47.907 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61220 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC84210000000001030307) 
Nov 23 09:43:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23204 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC88E00000000001030307) 
Nov 23 09:43:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:43:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:43:50 np0005532585.localdomain systemd[1]: tmp-crun.LYLlw0.mount: Deactivated successfully.
Nov 23 09:43:50 np0005532585.localdomain podman[284771]: 2025-11-23 09:43:50.046364655 +0000 UTC m=+0.101909583 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:43:50 np0005532585.localdomain podman[284771]: 2025-11-23 09:43:50.123423411 +0000 UTC m=+0.178968419 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:43:50 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:43:50 np0005532585.localdomain podman[284772]: 2025-11-23 09:43:50.149221209 +0000 UTC m=+0.201858985 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:43:50 np0005532585.localdomain podman[284772]: 2025-11-23 09:43:50.165132613 +0000 UTC m=+0.217770379 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 09:43:50 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:43:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35670 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC8C200000000001030307) 
Nov 23 09:43:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:50.279 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:51 np0005532585.localdomain systemd[1]: tmp-crun.Ga6aAJ.mount: Deactivated successfully.
Nov 23 09:43:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:43:52 np0005532585.localdomain podman[284817]: 2025-11-23 09:43:52.024706493 +0000 UTC m=+0.078380548 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:43:52 np0005532585.localdomain podman[284817]: 2025-11-23 09:43:52.058292439 +0000 UTC m=+0.111966554 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:43:52 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:43:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:52.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.083 281956 DEBUG nova.compute.manager [None req-404a0de6-47df-4c5b-8b06-aa6d86925514 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server [None req-404a0de6-47df-4c5b-8b06-aa6d86925514 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     raise self.value
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     self.force_reraise()
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     raise self.value
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Nov 23 09:43:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server 
Nov 23 09:43:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23205 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC98A00000000001030307) 
Nov 23 09:43:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:55.280 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:43:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:43:56 np0005532585.localdomain podman[284835]: 2025-11-23 09:43:56.037568411 +0000 UTC m=+0.081827957 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:43:56 np0005532585.localdomain podman[284835]: 2025-11-23 09:43:56.081135263 +0000 UTC m=+0.125394809 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:43:56 np0005532585.localdomain systemd[1]: tmp-crun.f3qqTu.mount: Deactivated successfully.
Nov 23 09:43:56 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:43:56 np0005532585.localdomain podman[284834]: 2025-11-23 09:43:56.100679903 +0000 UTC m=+0.146853810 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:43:56 np0005532585.localdomain podman[284834]: 2025-11-23 09:43:56.111885819 +0000 UTC m=+0.158059736 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:43:56 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:43:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:57.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:43:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:59.253 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:43:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:59.253 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:43:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:59.254 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:43:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:43:59.254 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:43:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:43:59Z|00057|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 23 09:43:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:43:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:43:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:43:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:43:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:43:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.065 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.066 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.066 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.066 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.138 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'flavor' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.176 281956 DEBUG oslo_concurrency.lockutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.283 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.914 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.944 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.944 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.945 281956 DEBUG oslo_concurrency.lockutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.945 281956 DEBUG nova.network.neutron [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.945 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.948 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.948 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.949 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.949 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.950 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.951 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.951 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.951 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.980 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.981 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.981 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.981 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:44:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:00.982 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.415 281956 DEBUG nova.network.neutron [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.432 281956 DEBUG oslo_concurrency.lockutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.457 281956 INFO nova.virt.libvirt.driver [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance destroyed successfully.
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.458 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'numa_topology' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23206 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCB8200000000001030307) 
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.466 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.472 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'resources' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.498 281956 DEBUG nova.virt.libvirt.vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T09:43:30Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.499 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.499 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.500 281956 DEBUG os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.502 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.502 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3912d14-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.556 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.556 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.556 281956 INFO os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.559 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.559 281956 INFO nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] UEFI support detected
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.565 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Start _get_guest_xml network_info=[{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=be9a09b1-b916-4d06-9bcd-d8b8afdf9284,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'image_id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}], 'ephemerals': [{'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vdb', 'size': 1, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.568 281956 WARNING nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.571 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.571 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.573 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.573 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.574 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.574 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T08:24:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='8c32de12-b44b-4285-8afc-2a1d7f236d32',id=2,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=be9a09b1-b916-4d06-9bcd-d8b8afdf9284,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.574 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.577 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.577 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.592 281956 DEBUG nova.privsep.utils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.592 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.761 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.763 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12558MB free_disk=41.83708190917969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.763 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.763 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.837 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.838 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.838 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:44:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:01.876 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.056 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.059 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.337 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.343 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.357 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.498 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.499 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.511 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.513 281956 DEBUG nova.virt.libvirt.vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256
='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T09:43:30Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.513 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.515 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.517 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'pci_devices' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.540 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] End _get_guest_xml xml=<domain type="kvm">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <uuid>355032bc-9946-4f6d-817c-2bfc8694d41d</uuid>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <name>instance-00000002</name>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <memory>524288</memory>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <vcpu>1</vcpu>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <metadata>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:name>test</nova:name>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:creationTime>2025-11-23 09:44:01</nova:creationTime>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:flavor name="m1.small">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:memory>512</nova:memory>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:disk>1</nova:disk>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:swap>0</nova:swap>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:ephemeral>1</nova:ephemeral>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:vcpus>1</nova:vcpus>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </nova:flavor>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:owner>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:user uuid="7e40ee99e6034be7be796ae12095c154">admin</nova:user>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:project uuid="1915d3e5d4254231a0517e2dcf35848f">admin</nova:project>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </nova:owner>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:root type="image" uuid="be9a09b1-b916-4d06-9bcd-d8b8afdf9284"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <nova:ports>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <nova:port uuid="d3912d14-a3e0-4df9-b811-f3bd90f44559">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:           <nova:ip type="fixed" address="192.168.0.77" ipVersion="4"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         </nova:port>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </nova:ports>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </nova:instance>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </metadata>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <sysinfo type="smbios">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <system>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <entry name="manufacturer">RDO</entry>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <entry name="product">OpenStack Compute</entry>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <entry name="serial">355032bc-9946-4f6d-817c-2bfc8694d41d</entry>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <entry name="uuid">355032bc-9946-4f6d-817c-2bfc8694d41d</entry>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <entry name="family">Virtual Machine</entry>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </system>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </sysinfo>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <os>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <boot dev="hd"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <smbios mode="sysinfo"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <acpi/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <apic/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <clock offset="utc">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <timer name="pit" tickpolicy="delay"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <timer name="hpet" present="no"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </clock>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <cpu mode="host-model" match="exact">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <topology sockets="1" cores="1" threads="1"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <disk type="network" device="disk">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <driver type="raw" cache="none"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <source protocol="rbd" name="vms/355032bc-9946-4f6d-817c-2bfc8694d41d_disk">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.103" port="6789"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.105" port="6789"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.104" port="6789"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </source>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <auth username="openstack">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <secret type="ceph" uuid="46550e70-79cb-5f55-bf6d-1204b97e083b"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </auth>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <target dev="vda" bus="virtio"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <disk type="network" device="disk">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <driver type="raw" cache="none"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <source protocol="rbd" name="vms/355032bc-9946-4f6d-817c-2bfc8694d41d_disk.eph0">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.103" port="6789"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.105" port="6789"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.104" port="6789"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </source>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <auth username="openstack">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:         <secret type="ceph" uuid="46550e70-79cb-5f55-bf6d-1204b97e083b"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       </auth>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <target dev="vdb" bus="virtio"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <interface type="ethernet">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <mac address="fa:16:3e:cf:aa:3b"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <model type="virtio"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <driver name="vhost" rx_queue_size="512"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <mtu size="1292"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <target dev="tapd3912d14-a3"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </interface>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <serial type="pty">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <log file="/var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/console.log" append="off"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </serial>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <video>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <model type="virtio"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <input type="tablet" bus="usb"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <input type="keyboard" bus="usb"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <rng model="virtio">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <backend model="random">/dev/urandom</backend>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <controller type="usb" index="0"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     <memballoon model="virtio">
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:       <stats period="10"/>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:     </memballoon>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: </domain>
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.543 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.543 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.545 281956 DEBUG nova.virt.libvirt.vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T09:43:30Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.546 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.547 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.547 281956 DEBUG os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.549 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.550 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.554 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.555 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.595 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.598 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.603 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.604 281956 INFO os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')
Nov 23 09:44:02 np0005532585.localdomain systemd[1]: Started libvirt secret daemon.
Nov 23 09:44:02 np0005532585.localdomain kernel: device tapd3912d14-a3 entered promiscuous mode
Nov 23 09:44:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891042.7148] manager: (tapd3912d14-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/16)
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00058|binding|INFO|Claiming lport d3912d14-a3e0-4df9-b811-f3bd90f44559 for this chassis.
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00059|binding|INFO|d3912d14-a3e0-4df9-b811-f3bd90f44559: Claiming fa:16:3e:cf:aa:3b 192.168.0.77
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.718 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain systemd-udevd[284995]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.728 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891042.7442] device (tapd3912d14-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 09:44:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891042.7452] device (tapd3912d14-a3): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-c237ed-0
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-49b8a0-0
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00062|ovn_bfd|INFO|Enabled BFD on interface ovn-b7d5b3-0
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.754 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:aa:3b 192.168.0.77'], port_security=['fa:16:3e:cf:aa:3b 192.168.0.77'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.77/24', 'neutron:device_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '1915d3e5d4254231a0517e2dcf35848f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '4fe931b0-155c-49e2-b5a5-44d74fa72e9e 6afcee2e-50ee-4b3c-9d1f-24ea7a5a850b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ca41f3-3e94-4959-b1b5-1e81bd2c9bc1, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d3912d14-a3e0-4df9-b811-f3bd90f44559) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.756 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d3912d14-a3e0-4df9-b811-f3bd90f44559 in datapath bcac49fc-c589-475a-91a8-00a0ba9c2b33 bound to our chassis
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.758 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bcac49fc-c589-475a-91a8-00a0ba9c2b33
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.762 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.768 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.772 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[89b4bc42-32af-4e20-99c9-fce330d9c74c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.773 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbcac49fc-c1 in ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.776 160542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbcac49fc-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.776 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d043fc-51df-4719-85bb-ec34cce5872e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.779 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[95cccfcf-2ea0-492b-8aec-0fa00525775d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00063|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 up in Southbound
Nov 23 09:44:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:02Z|00064|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 ovn-installed in OVS
Nov 23 09:44:02 np0005532585.localdomain systemd-machined[84275]: New machine qemu-2-instance-00000002.
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.789 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[e78fff9a-5566-47f7-b09f-cd7fcc162f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.792 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.821 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[082065d0-ee28-4f90-8d12-9838ee506690]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.849 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.853 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.858 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[78abc1ad-e494-4f82-b974-4ef0b83d13f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891042.8651] manager: (tapbcac49fc-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/17)
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.863 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[336c127b-fbb3-4e8b-9a3e-0cc831c49941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.902 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[bc800149-cc74-4ac1-b303-ee061ad4ee5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.906 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[934fe60d-bab2-4e06-a032-557f4081d776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891042.9295] device (tapbcac49fc-c0): carrier: link connected
Nov 23 09:44:02 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c1: link becomes ready
Nov 23 09:44:02 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c0: link becomes ready
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.935 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[4c493eaf-ade0-4737-b409-5ad4d361e3c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.955 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[293fbf38-0cc7-4810-8b30-05eb8731c6ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcac49fc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:b2:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1098603, 'reachable_time': 42896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285040, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.973 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4c5024f5-b189-4646-b3e0-9f213e408f12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:b28b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1098603, 'tstamp': 1098603}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285049, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:02.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:02.994 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[66efae39-f2b5-4c63-8701-aceae1a65e66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcac49fc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:b2:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1098603, 'reachable_time': 42896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285053, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.032 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c3681f4e-b86a-48c8-a278-9ed4f10c10c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.052 281956 DEBUG nova.compute.manager [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.053 281956 DEBUG oslo_concurrency.lockutils [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.053 281956 DEBUG oslo_concurrency.lockutils [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.054 281956 DEBUG oslo_concurrency.lockutils [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.055 281956 DEBUG nova.compute.manager [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.056 281956 WARNING nova.compute.manager [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state stopped and task_state powering-on.
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.113 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[75b9b2d7-fa30-4304-8642-05dcf7c3cbb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.115 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcac49fc-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.116 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.117 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcac49fc-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:03 np0005532585.localdomain kernel: device tapbcac49fc-c0 entered promiscuous mode
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.124 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbcac49fc-c0, col_values=(('external_ids', {'iface-id': '98ef2da5-f5cb-44e8-a4b2-f6178c6c8332'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:03Z|00065|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.134 160439 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.136 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e2f16c-580d-48ca-a98c-3862f2f96fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.139 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.141 160439 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: global
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     log         /dev/log local0 debug
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     log-tag     haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     user        root
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     group       root
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     maxconn     1024
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     pidfile     /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     daemon
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: defaults
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     log global
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     mode http
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     option httplog
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     option dontlognull
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     option http-server-close
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     option forwardfor
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     retries                 3
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout http-request    30s
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout connect         30s
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout client          32s
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout server          32s
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout http-keep-alive 30s
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: listen listener
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     bind 169.254.169.254:80
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     server metadata /var/lib/neutron/metadata_proxy
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:     http-request add-header X-OVN-Network-ID bcac49fc-c589-475a-91a8-00a0ba9c2b33
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 23 09:44:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:03.142 160439 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'env', 'PROCESS_TAG=haproxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.220 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891043.191537, 355032bc-9946-4f6d-817c-2bfc8694d41d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.221 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] VM Resumed (Lifecycle Event)
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.228 281956 DEBUG nova.compute.manager [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.243 281956 INFO nova.virt.libvirt.driver [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance rebooted successfully.
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.245 281956 DEBUG nova.compute.manager [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.253 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.260 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.281 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.282 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891043.1933622, 355032bc-9946-4f6d-817c-2bfc8694d41d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.282 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] VM Started (Lifecycle Event)
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.297 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.301 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:44:03 np0005532585.localdomain podman[285109]: 
Nov 23 09:44:03 np0005532585.localdomain podman[285109]: 2025-11-23 09:44:03.595124683 +0000 UTC m=+0.077458149 container create 91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 09:44:03 np0005532585.localdomain systemd[1]: Started libpod-conmon-91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9.scope.
Nov 23 09:44:03 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:44:03 np0005532585.localdomain podman[285109]: 2025-11-23 09:44:03.561140984 +0000 UTC m=+0.043474440 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:44:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c2424dde758dbd432a1bbe6bb85652b170f03c84b9e9dce074322bc539b635/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:44:03 np0005532585.localdomain podman[285109]: 2025-11-23 09:44:03.707616982 +0000 UTC m=+0.189950458 container init 91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 09:44:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:03Z|00066|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.708 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:03 np0005532585.localdomain podman[285109]: 2025-11-23 09:44:03.71638044 +0000 UTC m=+0.198713916 container start 91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:44:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:03.716 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:03Z|00067|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:44:03 np0005532585.localdomain neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285123]: [NOTICE]   (285127) : New worker (285129) forked
Nov 23 09:44:03 np0005532585.localdomain neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285123]: [NOTICE]   (285127) : Loading success.
Nov 23 09:44:03 np0005532585.localdomain snmpd[67457]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Nov 23 09:44:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:04Z|00068|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:44:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:04.521 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:05.160 281956 DEBUG nova.compute.manager [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:44:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:05.161 281956 DEBUG oslo_concurrency.lockutils [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:44:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:05.161 281956 DEBUG oslo_concurrency.lockutils [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:44:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:05.162 281956 DEBUG oslo_concurrency.lockutils [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:44:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:05.162 281956 DEBUG nova.compute.manager [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:44:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:05.163 281956 WARNING nova.compute.manager [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state active and task_state None.
Nov 23 09:44:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:07.644 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:07.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:44:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:44:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:09.281 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:44:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:44:10 np0005532585.localdomain systemd[1]: tmp-crun.Djnpsp.mount: Deactivated successfully.
Nov 23 09:44:10 np0005532585.localdomain podman[285138]: 2025-11-23 09:44:10.067814036 +0000 UTC m=+0.116531891 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 23 09:44:10 np0005532585.localdomain podman[285138]: 2025-11-23 09:44:10.083216423 +0000 UTC m=+0.131934268 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:44:10 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:44:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:44:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:44:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:44:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 23 09:44:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:44:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17216 "" "Go-http-client/1.1"
Nov 23 09:44:12 np0005532585.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=89.248.163.200 DST=38.102.83.198 LEN=40 TOS=0x08 PREC=0x40 TTL=245 ID=57327 PROTO=TCP SPT=48651 DPT=9090 SEQ=642428843 ACK=0 WINDOW=1024 RES=0x00 SYN URGP=0 
Nov 23 09:44:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:12.680 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:12.995 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:44:16 np0005532585.localdomain systemd[1]: tmp-crun.fp7U7E.mount: Deactivated successfully.
Nov 23 09:44:16 np0005532585.localdomain podman[285155]: 2025-11-23 09:44:16.047669772 +0000 UTC m=+0.097963826 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:44:16 np0005532585.localdomain podman[285155]: 2025-11-23 09:44:16.05857975 +0000 UTC m=+0.108873764 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:44:16 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:44:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36853 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCF21B0000000001030307) 
Nov 23 09:44:17 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:17Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:aa:3b 192.168.0.77
Nov 23 09:44:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36854 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCF6200000000001030307) 
Nov 23 09:44:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:17.708 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23207 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCF8200000000001030307) 
Nov 23 09:44:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:17.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36855 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCFE200000000001030307) 
Nov 23 09:44:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61221 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD02200000000001030307) 
Nov 23 09:44:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:44:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:44:21 np0005532585.localdomain podman[285179]: 2025-11-23 09:44:21.027221626 +0000 UTC m=+0.082932779 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:44:21 np0005532585.localdomain podman[285179]: 2025-11-23 09:44:21.090153875 +0000 UTC m=+0.145865028 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:44:21 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:44:21 np0005532585.localdomain podman[285180]: 2025-11-23 09:44:21.096004787 +0000 UTC m=+0.149527063 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 23 09:44:21 np0005532585.localdomain podman[285180]: 2025-11-23 09:44:21.176405956 +0000 UTC m=+0.229928212 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Nov 23 09:44:21 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:22.695 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:22.698 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:22 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:22.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:44:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:23.001 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:23 np0005532585.localdomain podman[285223]: 2025-11-23 09:44:23.036911689 +0000 UTC m=+0.086961784 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 23 09:44:23 np0005532585.localdomain podman[285223]: 2025-11-23 09:44:23.041633675 +0000 UTC m=+0.091683720 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:44:23 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:44:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:23.157 281956 DEBUG nova.compute.manager [None req-66e56ac4-2fc6-426f-bfb2-48c090186a1a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:44:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:23.163 281956 INFO nova.compute.manager [None req-66e56ac4-2fc6-426f-bfb2-48c090186a1a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Retrieving diagnostics
Nov 23 09:44:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36856 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD0DE00000000001030307) 
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.007 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.007 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3097534
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42604 [23/Nov/2025:09:44:22.694] listener listener/metadata 0/0/0/1313/1313 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.026 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.027 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42608 [23/Nov/2025:09:44:24.025] listener listener/metadata 0/0/0/32/32 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.058 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 0.0310731
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.072 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.074 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.096 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.097 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0235972
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42612 [23/Nov/2025:09:44:24.072] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.104 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.105 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.123 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42620 [23/Nov/2025:09:44:24.104] listener listener/metadata 0/0/0/19/19 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.123 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0181267
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.131 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.132 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.150 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42622 [23/Nov/2025:09:44:24.130] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.151 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.0192010
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.158 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.159 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.179 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42626 [23/Nov/2025:09:44:24.158] listener listener/metadata 0/0/0/22/22 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.180 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 148 time: 0.0209279
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.187 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.188 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.203 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42634 [23/Nov/2025:09:44:24.187] listener listener/metadata 0/0/0/17/17 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.204 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.0159786
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.211 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.212 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.225 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.226 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0140369
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42644 [23/Nov/2025:09:44:24.210] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.232 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.233 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.250 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42648 [23/Nov/2025:09:44:24.232] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.250 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0171552
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.256 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.257 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42658 [23/Nov/2025:09:44:24.255] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.275 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0183260
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.292 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.293 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.307 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42664 [23/Nov/2025:09:44:24.292] listener listener/metadata 0/0/0/16/16 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.308 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.0153022
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.312 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.313 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.332 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42678 [23/Nov/2025:09:44:24.311] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.332 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0197785
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.337 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.338 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.353 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42688 [23/Nov/2025:09:44:24.337] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.353 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.0151033
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.358 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.359 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.373 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42694 [23/Nov/2025:09:44:24.358] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.373 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0145631
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.380 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.380 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.397 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42702 [23/Nov/2025:09:44:24.379] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.398 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0175052
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.404 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.405 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Accept: */*
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Connection: close
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Content-Type: text/plain
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: Host: 169.254.169.254
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: User-Agent: curl/7.84.0
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.424 160537 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Nov 23 09:44:24 np0005532585.localdomain haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42710 [23/Nov/2025:09:44:24.404] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Nov 23 09:44:24 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:24.425 160537 INFO eventlet.wsgi.server [-] 192.168.0.77,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0197296
Nov 23 09:44:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:26.724 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:44:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:26.725 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:44:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:26.753 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:44:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:44:27 np0005532585.localdomain podman[285242]: 2025-11-23 09:44:27.034069387 +0000 UTC m=+0.087213272 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true)
Nov 23 09:44:27 np0005532585.localdomain podman[285243]: 2025-11-23 09:44:27.086994976 +0000 UTC m=+0.136377424 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:44:27 np0005532585.localdomain podman[285243]: 2025-11-23 09:44:27.099487033 +0000 UTC m=+0.148869471 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:44:27 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:44:27 np0005532585.localdomain podman[285242]: 2025-11-23 09:44:27.150934656 +0000 UTC m=+0.204078581 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:44:27 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:44:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:27.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:28.030 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:29 np0005532585.localdomain sudo[285284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:44:29 np0005532585.localdomain sudo[285284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:44:29 np0005532585.localdomain sudo[285284]: pam_unix(sudo:session): session closed for user root
Nov 23 09:44:29 np0005532585.localdomain sudo[285302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:44:29 np0005532585.localdomain sudo[285302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:44:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:44:30 np0005532585.localdomain sudo[285302]: pam_unix(sudo:session): session closed for user root
Nov 23 09:44:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36857 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD2E200000000001030307) 
Nov 23 09:44:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:44:31.728 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:44:32 np0005532585.localdomain sudo[285352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:44:32 np0005532585.localdomain sudo[285352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:44:32 np0005532585.localdomain sudo[285352]: pam_unix(sudo:session): session closed for user root
Nov 23 09:44:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:32.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:32 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:44:32Z|00069|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 23 09:44:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:33.032 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:37.825 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:38.035 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:39 np0005532585.localdomain sshd[285371]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:44:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:44:41 np0005532585.localdomain podman[285372]: 2025-11-23 09:44:41.049153493 +0000 UTC m=+0.095418846 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:44:41 np0005532585.localdomain podman[285372]: 2025-11-23 09:44:41.084628382 +0000 UTC m=+0.130893685 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:44:41 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:44:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:44:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:44:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:44:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 23 09:44:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:44:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1"
Nov 23 09:44:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:42.865 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:43.037 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44461 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD674B0000000001030307) 
Nov 23 09:44:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:44:47 np0005532585.localdomain sshd[285407]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:44:47 np0005532585.localdomain podman[285391]: 2025-11-23 09:44:47.031704911 +0000 UTC m=+0.084748136 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:44:47 np0005532585.localdomain podman[285391]: 2025-11-23 09:44:47.040440331 +0000 UTC m=+0.093483516 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:44:47 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:44:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44462 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD6B600000000001030307) 
Nov 23 09:44:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:47.914 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:48.041 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:48 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36858 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD6E200000000001030307) 
Nov 23 09:44:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44463 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD73600000000001030307) 
Nov 23 09:44:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23208 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD76200000000001030307) 
Nov 23 09:44:50 np0005532585.localdomain sshd[285371]: error: kex_exchange_identification: read: Connection timed out
Nov 23 09:44:50 np0005532585.localdomain sshd[285371]: banner exchange: Connection from 115.190.191.2 port 64048: Connection timed out
Nov 23 09:44:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:44:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:44:52 np0005532585.localdomain systemd[1]: tmp-crun.TN81PO.mount: Deactivated successfully.
Nov 23 09:44:52 np0005532585.localdomain podman[285415]: 2025-11-23 09:44:52.038307363 +0000 UTC m=+0.090611297 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:44:52 np0005532585.localdomain sshd[285407]: Connection closed by 14.103.158.69 port 59974 [preauth]
Nov 23 09:44:52 np0005532585.localdomain podman[285416]: 2025-11-23 09:44:52.118341512 +0000 UTC m=+0.165500206 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter)
Nov 23 09:44:52 np0005532585.localdomain podman[285415]: 2025-11-23 09:44:52.127783945 +0000 UTC m=+0.180087899 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 09:44:52 np0005532585.localdomain podman[285416]: 2025-11-23 09:44:52.139364223 +0000 UTC m=+0.186522977 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 09:44:52 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:44:52 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:44:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:52.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:53.043 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44464 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD83200000000001030307) 
Nov 23 09:44:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:44:54 np0005532585.localdomain podman[285458]: 2025-11-23 09:44:54.031990181 +0000 UTC m=+0.083723134 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:44:54 np0005532585.localdomain podman[285458]: 2025-11-23 09:44:54.036471269 +0000 UTC m=+0.088204252 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:44:54 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:44:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:44:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:44:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:57.950 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:44:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:44:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:44:58.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:44:58 np0005532585.localdomain podman[285477]: 2025-11-23 09:44:58.0585496 +0000 UTC m=+0.090766702 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:44:58 np0005532585.localdomain podman[285477]: 2025-11-23 09:44:58.070535681 +0000 UTC m=+0.102752813 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:44:58 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:44:58 np0005532585.localdomain podman[285478]: 2025-11-23 09:44:58.153656696 +0000 UTC m=+0.181428681 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:44:58 np0005532585.localdomain podman[285478]: 2025-11-23 09:44:58.161829439 +0000 UTC m=+0.189601404 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:44:58 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:44:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:44:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:45:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44465 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDA4200000000001030307) 
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.456 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.457 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.480 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.481 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.481 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.841 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.842 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.842 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.843 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:45:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:02.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.364 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.383 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.383 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.384 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.385 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.385 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.385 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.386 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.386 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.386 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.387 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.409 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.410 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.410 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.411 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.411 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.876 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.947 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:45:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:03.948 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.110 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.111 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12293MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.111 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.111 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.223 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.223 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.223 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.275 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.736 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.744 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.772 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.812 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:45:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:04.813 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:45:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:08.030 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:08.053 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:45:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:45:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:45:09.281 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:45:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:45:09.282 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.804 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.805 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.832 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 327680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.833 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4871c7c-50eb-4c4c-be0f-6bfac2029e1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 327680, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.806022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a577bac-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'dd482b2c9a80b9016108197aeb6a2b7715f5728866168c3c71bc7dd462bc3bbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.806022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a57970e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '528b90a3aeb0f09ef62fb5aac2a52fba5f8f1512480bbd0dc18ac9c13e5b400c'}]}, 'timestamp': '2025-11-23 09:45:10.834182', '_unique_id': '65ec9eaaf2f049bf951f05392df43bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.837 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.859 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ec032e4-2736-404a-a661-571dfcb99c60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:45:10.837474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1a5b7c98-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.036686784, 'message_signature': '5df2bd9226d2a7e8dfa19fc5bd8c17b21ab1ad7568a801c6c8a51ff7977e7c8e'}]}, 'timestamp': '2025-11-23 09:45:10.859740', '_unique_id': '0ef07482614449e38546e7c23fe5b945'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1287506-c58d-4074-a375-222647e095d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.862254', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a5c6a7c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '0c574ec46ea776c379d32928c9ff9bde0c4765dc4ce5c1b6c2620a43fda1cbfa'}]}, 'timestamp': '2025-11-23 09:45:10.865813', '_unique_id': '28196d48e5c24b40ab18dfa03cf70789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f76681f8-15d7-4e8d-976e-db150cc21209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.868094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a5e8960-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': '4c83a78632da016e93f51277e01aaf9e3c474750f48f617b9da0e21e8f2ca054'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.868094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a5e9e50-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': '24cb8d35065e72678870fde0b515b7c548b26b3718043599d94a1d86ac7e9d94'}]}, 'timestamp': '2025-11-23 09:45:10.880431', '_unique_id': 'fad264f531ba48f28b07f5042a0e0e10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bea516e-70a7-4893-a45a-64206a73f479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.882857', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a5f1b3c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': 'c157ef704fdf06bbb33cbb66615d5e4511163a51a9ecb9e1377fa0837d6113a1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.882857', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a5f2e9c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': 'c9b06920a7cd1123b1a2a0f1334d307ad9ff4f6598a7763c3f5a26a85310b37d'}]}, 'timestamp': '2025-11-23 09:45:10.883957', '_unique_id': '9c5c319d3e294e97b02b1ebeee5612be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f624aedd-674a-4926-a4ee-7fe4bff047b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.886300', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a5f9ce2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': '25dbf1e1d678c9bfec0cb1bd811809477565e26cc0f55f2d2ad5054bd58303cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.886300', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a5fadea-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': 'c7774c1516c1aa5c8f81dc2d39e2834d32e7ae7b3f87d40f7fd1b7c62990ebee'}]}, 'timestamp': '2025-11-23 09:45:10.887165', '_unique_id': 'e4a94aaf372341ef8fccfc6c95d93c72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd58534dd-deb8-40a2-99d3-f5536007f3bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.889295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a6012da-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '2382eece137fbe9521b2ec6f51b3c6b53c82fb4c75addeeb5c03f07dca2cc302'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.889295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a60240a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '2ca35d18d5ffc4129347a713f1f90166a01b2193fe6a6fbbfc9995f88f72fb22'}]}, 'timestamp': '2025-11-23 09:45:10.890193', '_unique_id': '4831ffafb50442a5a98399c72a6ee99a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb5a7104-a2ca-46bb-80d8-670567bda1dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.892297', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a6086fc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'd2a7e1b90ba3c2800a5ae350622d70cb2db0b23c4bd6f1a20abfde6f1aa1ccf7'}]}, 'timestamp': '2025-11-23 09:45:10.892751', '_unique_id': '34aeb942c9114131bfa9a7fed4f5a221'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 11650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a6b5c5b-aca4-466e-9d37-af948fe859f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11650000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:45:10.894843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1a60eb56-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.036686784, 'message_signature': '0e6846a1045bc1e1a8f6974e7defd811ad874ed3b68daefacdba0bbb6a0781a0'}]}, 'timestamp': '2025-11-23 09:45:10.895301', '_unique_id': '6741cb3511ae4d699728c69c5a2057d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbcf95b5-82c1-4008-8a8c-8bee0e6dd711', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.897412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a614e5c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'dc22cd0829d05ab9fab22269b5d7ca044198ee44e9e04681cded026b5b071c48'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.897412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a615f14-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '9f87002a9a21bd5dfd2af6e0a1eefb1e7de2cb25f1f15df4f8dd2a45cbb990c1'}]}, 'timestamp': '2025-11-23 09:45:10.898249', '_unique_id': '23db7d68b2dd4fab90bc48c3cb55fdd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1112cd82-d488-4ba6-b473-8165056feffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.900330', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a61c062-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'c0995ffe9c937d8612e0cf95251f449e1958142fe326dddaf7f21376d73da26b'}]}, 'timestamp': '2025-11-23 09:45:10.900765', '_unique_id': 'aa4ef75d9a8d4300a417f6fe53245af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2acd2947-5dfb-4d8a-a89e-88d03461753a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.902963', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a622782-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '99fc1663dc105c1f380fe31d68c7d61bee8df41ec8a2fe391bf9da917d62ac2c'}]}, 'timestamp': '2025-11-23 09:45:10.903410', '_unique_id': '95cab1b6ca82441d9a5f93a5eebe00d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 36 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb877ac6-849a-4e73-a458-ab451260f7b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 36, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.905461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a6288b2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'baf5cf10d4c7fc3f6da6b0c8a7e8fb0bc1c8101c1f53548ca05a15c9ac4271a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.905461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a629960-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '4ee382b31bf1372ff0f96916e715374082f06fbce1f51fd262d4d3100c686ddd'}]}, 'timestamp': '2025-11-23 09:45:10.906333', '_unique_id': '4f6b8e38162946dabf87eaa7968fc360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bb6578b-d7e6-46d4-b439-8607defed2fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.908773', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a630d78-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'e7b9f5a79908938ae98e86d2054866ea0e2a484d0bba91b477a139d6abe0f5d2'}]}, 'timestamp': '2025-11-23 09:45:10.909369', '_unique_id': '06303a20846746bbb389bfd70b8a0363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e09aa88-c596-41c8-9a58-e7619d436061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.911544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a637664-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'ddc99b179786aebef0e14c533d18cab531e1150132a1f217f4ad392f68ca2b85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.911544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a63871c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '677cef864b73832a1db8de8ee5e831cf8d5b2b41c4fd3ec01cbf96685a662051'}]}, 'timestamp': '2025-11-23 09:45:10.912380', '_unique_id': 'dfe46d621799418a9fe7850deccc1ff2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 950875641 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a740f129-5510-44b4-b42f-e9bc2dcd01b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 950875641, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.914603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a63edc4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '183bfd62c4918910fb44e8924f3c177c7c23c6beb9d5354ad5e540c33b1c87d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.914603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a63fe68-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '2ccac08766caebc06699a756186a1b612f39d3186fd5c60b922d539933a0b062'}]}, 'timestamp': '2025-11-23 09:45:10.915433', '_unique_id': '0127799b9a8041a88923d1e082f290dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc717cf6-d5d6-4d68-9289-f419a73c957c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.917907', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a646cf4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '631a03efd47083efb7805d6d56359d7d26ca80226c22033ddcd23aee7e952fbb'}]}, 'timestamp': '2025-11-23 09:45:10.918222', '_unique_id': '5bf14446c50c462d91b4dfcddd6a2910'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fbd3a55-71f3-430d-b7da-bda4e09e31ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.919607', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a64b6dc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '0649a0306addb19b77c319338e06e341fb5a4ca3c1c0b5a6eedef1953300245b'}]}, 'timestamp': '2025-11-23 09:45:10.920113', '_unique_id': '2ffafb4dfd514d849250b01d1d2eff7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c54c07a7-4eb4-4c48-83df-d6a8ae359172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.921968', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a650b1e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '04b6e8a2792fea08522442d1721c3686fb8a24bbc51c4b4f01a06395b3bb4204'}]}, 'timestamp': '2025-11-23 09:45:10.922269', '_unique_id': '7d34fdfbfad044058919db3b3b909f85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '432e8f63-9537-4c4a-8375-946e363d5aec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.923620', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a654cb4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '4607573db8cca7788b48b414d0b6b2413bea2bc47373193c65b0882c17543c6d'}]}, 'timestamp': '2025-11-23 09:45:10.923970', '_unique_id': '2e1b17fb6a2f437aac31406e774e1c1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c5a1cfd-0e97-43cb-9111-e66d275d66f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.925618', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a659994-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'a903ff11750fe9b9e90bd2c06d8d3587a392fa3b923ffba0eb0246035f392f4d'}]}, 'timestamp': '2025-11-23 09:45:10.925937', '_unique_id': 'cf7455c4264444ca9430a1961dd6b439'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:45:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:45:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:45:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:45:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:45:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 23 09:45:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:45:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:45:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17235 "" "Go-http-client/1.1"
Nov 23 09:45:12 np0005532585.localdomain systemd[1]: tmp-crun.TbRK9q.mount: Deactivated successfully.
Nov 23 09:45:12 np0005532585.localdomain podman[285564]: 2025-11-23 09:45:12.034019052 +0000 UTC m=+0.087426328 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Nov 23 09:45:12 np0005532585.localdomain podman[285564]: 2025-11-23 09:45:12.048395437 +0000 UTC m=+0.101802753 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:45:12 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:45:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:13.056 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:13.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:13.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:13.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:13.072 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:13.072 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59928 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDDC7B0000000001030307) 
Nov 23 09:45:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59929 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDE0A10000000001030307) 
Nov 23 09:45:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:45:18 np0005532585.localdomain podman[285583]: 2025-11-23 09:45:18.034072802 +0000 UTC m=+0.078943977 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:45:18 np0005532585.localdomain podman[285583]: 2025-11-23 09:45:18.042220925 +0000 UTC m=+0.087092110 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:45:18 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:45:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:18.073 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:18.075 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:18.075 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:18.076 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:18.107 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:18.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44466 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDE4200000000001030307) 
Nov 23 09:45:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59930 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDE8A00000000001030307) 
Nov 23 09:45:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36859 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDEC210000000001030307) 
Nov 23 09:45:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:45:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:45:23 np0005532585.localdomain systemd[1]: tmp-crun.XyNBx3.mount: Deactivated successfully.
Nov 23 09:45:23 np0005532585.localdomain podman[285608]: 2025-11-23 09:45:23.022350217 +0000 UTC m=+0.077695467 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:45:23 np0005532585.localdomain podman[285608]: 2025-11-23 09:45:23.101304492 +0000 UTC m=+0.156649802 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:45:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:23.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:23.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:23 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:45:23 np0005532585.localdomain podman[285609]: 2025-11-23 09:45:23.107153764 +0000 UTC m=+0.158968905 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter)
Nov 23 09:45:23 np0005532585.localdomain podman[285609]: 2025-11-23 09:45:23.190449243 +0000 UTC m=+0.242264394 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible)
Nov 23 09:45:23 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:45:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59931 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDF8600000000001030307) 
Nov 23 09:45:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:45:25 np0005532585.localdomain podman[285653]: 2025-11-23 09:45:25.013917619 +0000 UTC m=+0.070694551 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 23 09:45:25 np0005532585.localdomain podman[285653]: 2025-11-23 09:45:25.043801014 +0000 UTC m=+0.100577986 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:45:25 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:45:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:28.110 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:28.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:28.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:28.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:28.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:28.152 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:45:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:45:29 np0005532585.localdomain systemd[1]: tmp-crun.ZW2OIM.mount: Deactivated successfully.
Nov 23 09:45:29 np0005532585.localdomain podman[285672]: 2025-11-23 09:45:29.035653138 +0000 UTC m=+0.089664118 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd)
Nov 23 09:45:29 np0005532585.localdomain podman[285672]: 2025-11-23 09:45:29.054476332 +0000 UTC m=+0.108487312 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 23 09:45:29 np0005532585.localdomain systemd[1]: tmp-crun.G8T98a.mount: Deactivated successfully.
Nov 23 09:45:29 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:45:29 np0005532585.localdomain podman[285673]: 2025-11-23 09:45:29.072098027 +0000 UTC m=+0.124132416 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:45:29 np0005532585.localdomain podman[285673]: 2025-11-23 09:45:29.079500046 +0000 UTC m=+0.131534445 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:45:29 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:45:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:45:31 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59932 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE18200000000001030307) 
Nov 23 09:45:32 np0005532585.localdomain sudo[285714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:45:32 np0005532585.localdomain sudo[285714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:45:32 np0005532585.localdomain sudo[285714]: pam_unix(sudo:session): session closed for user root
Nov 23 09:45:32 np0005532585.localdomain sudo[285732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:45:32 np0005532585.localdomain sudo[285732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:45:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:33.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:33.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:33.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:33.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:33.197 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:33.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:33 np0005532585.localdomain sudo[285732]: pam_unix(sudo:session): session closed for user root
Nov 23 09:45:34 np0005532585.localdomain sudo[285781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:45:34 np0005532585.localdomain sudo[285781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:45:34 np0005532585.localdomain sudo[285781]: pam_unix(sudo:session): session closed for user root
Nov 23 09:45:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:38.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:38.200 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:38.200 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:38.200 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:38.240 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:38.241 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:39 np0005532585.localdomain sshd[285799]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:45:41 np0005532585.localdomain sshd[285799]: Invalid user unknown from 71.12.241.225 port 45354
Nov 23 09:45:41 np0005532585.localdomain sshd[285799]: Connection closed by invalid user unknown 71.12.241.225 port 45354 [preauth]
Nov 23 09:45:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:45:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:45:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:45:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 23 09:45:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:45:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1"
Nov 23 09:45:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:45:43 np0005532585.localdomain podman[285801]: 2025-11-23 09:45:43.018026882 +0000 UTC m=+0.070996080 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 23 09:45:43 np0005532585.localdomain podman[285801]: 2025-11-23 09:45:43.052332294 +0000 UTC m=+0.105301542 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:45:43 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:45:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:43.242 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:43.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:43.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:43.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:43.290 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:43.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:46 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63390 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE51AB0000000001030307) 
Nov 23 09:45:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63391 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE55A10000000001030307) 
Nov 23 09:45:47 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59933 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE58200000000001030307) 
Nov 23 09:45:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:48.313 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:48.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:48.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5024 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:48.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:48.316 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:48.319 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:45:48 np0005532585.localdomain sshd[285820]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:45:49 np0005532585.localdomain systemd[1]: tmp-crun.tsWFYc.mount: Deactivated successfully.
Nov 23 09:45:49 np0005532585.localdomain podman[285821]: 2025-11-23 09:45:49.039847786 +0000 UTC m=+0.089084440 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:45:49 np0005532585.localdomain podman[285821]: 2025-11-23 09:45:49.050511175 +0000 UTC m=+0.099747779 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:45:49 np0005532585.localdomain sshd[285820]: Accepted publickey for zuul from 38.102.83.114 port 47340 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:45:49 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:45:49 np0005532585.localdomain systemd-logind[761]: New session 61 of user zuul.
Nov 23 09:45:49 np0005532585.localdomain systemd[1]: Started Session 61 of User zuul.
Nov 23 09:45:49 np0005532585.localdomain sshd[285820]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 09:45:49 np0005532585.localdomain sudo[285864]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plepfsnqhyycovixmadbtjlydbyazmxx ; /usr/bin/python3
Nov 23 09:45:49 np0005532585.localdomain sudo[285864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 09:45:49 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63392 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE5DA00000000001030307) 
Nov 23 09:45:49 np0005532585.localdomain python3[285866]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:45:49 np0005532585.localdomain subscription-manager[285867]: Unregistered machine with identity: 805f8986-28c6-49b9-9c1d-56700caa6ca2
Nov 23 09:45:50 np0005532585.localdomain sudo[285864]: pam_unix(sudo:session): session closed for user root
Nov 23 09:45:50 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44467 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE62200000000001030307) 
Nov 23 09:45:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:53.320 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:53.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:53.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:53.324 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:53.359 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:53.360 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:53 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63393 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE6D600000000001030307) 
Nov 23 09:45:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:45:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:45:54 np0005532585.localdomain podman[285870]: 2025-11-23 09:45:54.040405871 +0000 UTC m=+0.091243167 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Nov 23 09:45:54 np0005532585.localdomain podman[285870]: 2025-11-23 09:45:54.051942028 +0000 UTC m=+0.102779364 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:45:54 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:45:54 np0005532585.localdomain podman[285869]: 2025-11-23 09:45:54.017105719 +0000 UTC m=+0.072028681 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 23 09:45:54 np0005532585.localdomain podman[285869]: 2025-11-23 09:45:54.09785327 +0000 UTC m=+0.152776252 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:45:54 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:45:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:45:56 np0005532585.localdomain podman[285912]: 2025-11-23 09:45:56.03576616 +0000 UTC m=+0.087058728 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Nov 23 09:45:56 np0005532585.localdomain podman[285912]: 2025-11-23 09:45:56.041436575 +0000 UTC m=+0.092729203 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:45:56 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:45:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:58.361 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:58.362 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:45:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:58.363 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:45:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:58.363 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:58.394 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:45:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:45:58.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:45:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:45:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:45:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:45:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:46:00 np0005532585.localdomain podman[285930]: 2025-11-23 09:46:00.027884393 +0000 UTC m=+0.083616351 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:46:00 np0005532585.localdomain systemd[1]: tmp-crun.1t92JS.mount: Deactivated successfully.
Nov 23 09:46:00 np0005532585.localdomain podman[285931]: 2025-11-23 09:46:00.0487949 +0000 UTC m=+0.099843003 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:46:00 np0005532585.localdomain podman[285931]: 2025-11-23 09:46:00.083948529 +0000 UTC m=+0.134996712 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:46:00 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:46:00 np0005532585.localdomain podman[285930]: 2025-11-23 09:46:00.100563474 +0000 UTC m=+0.156295462 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Nov 23 09:46:00 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:46:01 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63394 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE8E210000000001030307) 
Nov 23 09:46:02 np0005532585.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 09:46:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:03.396 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:03.397 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:03.398 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:03.398 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:03.422 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:03.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.815 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.816 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.817 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.817 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.961 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.961 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.962 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:46:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:04.962 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.464 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.477 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.477 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.478 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.479 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.479 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.480 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.480 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.481 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.481 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.482 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.502 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.502 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.503 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.503 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.504 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:46:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:05.991 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.054 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.055 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.277 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.279 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12291MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.279 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.376 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.376 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.377 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.476 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.942 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.950 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.972 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.975 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:46:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:06.975 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:46:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:08.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:08.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:08.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:08.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:08.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:08.460 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:46:09.281 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:46:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:46:09.282 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:46:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:46:09.283 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:46:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:46:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:46:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:46:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 23 09:46:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:46:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17237 "" "Go-http-client/1.1"
Nov 23 09:46:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:13.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:46:14 np0005532585.localdomain podman[286019]: 2025-11-23 09:46:14.029690901 +0000 UTC m=+0.086335725 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:46:14 np0005532585.localdomain podman[286019]: 2025-11-23 09:46:14.041411254 +0000 UTC m=+0.098056108 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:46:14 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:46:16 np0005532585.localdomain sshd[286038]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:46:16 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18246 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DEC6DB0000000001030307) 
Nov 23 09:46:16 np0005532585.localdomain sshd[286038]: Invalid user ec2-user from 107.172.15.139 port 52046
Nov 23 09:46:16 np0005532585.localdomain sshd[286038]: Received disconnect from 107.172.15.139 port 52046:11: Bye Bye [preauth]
Nov 23 09:46:16 np0005532585.localdomain sshd[286038]: Disconnected from invalid user ec2-user 107.172.15.139 port 52046 [preauth]
Nov 23 09:46:17 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18247 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DECAE00000000001030307) 
Nov 23 09:46:18 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63395 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DECE200000000001030307) 
Nov 23 09:46:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:18.461 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:18.505 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:18.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:18.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:18.507 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:18.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:19 np0005532585.localdomain sudo[286040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:46:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:46:19 np0005532585.localdomain sudo[286040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:19 np0005532585.localdomain sudo[286040]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:19 np0005532585.localdomain systemd[1]: tmp-crun.YPLUEv.mount: Deactivated successfully.
Nov 23 09:46:19 np0005532585.localdomain sudo[286068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 09:46:19 np0005532585.localdomain sudo[286068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:19 np0005532585.localdomain podman[286057]: 2025-11-23 09:46:19.275947545 +0000 UTC m=+0.082788700 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:46:19 np0005532585.localdomain podman[286057]: 2025-11-23 09:46:19.310550407 +0000 UTC m=+0.117391572 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:46:19 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:46:19 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18248 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DED2E00000000001030307) 
Nov 23 09:46:19 np0005532585.localdomain sudo[286068]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:20 np0005532585.localdomain sudo[286121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:20 np0005532585.localdomain sudo[286121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:20 np0005532585.localdomain sudo[286121]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:20 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59934 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DED6200000000001030307) 
Nov 23 09:46:21 np0005532585.localdomain sudo[286139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:21 np0005532585.localdomain sudo[286139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:21 np0005532585.localdomain sudo[286139]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:22 np0005532585.localdomain sudo[286157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:22 np0005532585.localdomain sudo[286157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:22 np0005532585.localdomain sudo[286157]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:23 np0005532585.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18249 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DEE2A00000000001030307) 
Nov 23 09:46:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:23.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:23.513 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:23.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:23.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:23.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:23.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:46:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: tmp-crun.9JYCKb.mount: Deactivated successfully.
Nov 23 09:46:25 np0005532585.localdomain podman[286175]: 2025-11-23 09:46:25.05576343 +0000 UTC m=+0.106566594 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:46:25 np0005532585.localdomain podman[286176]: 2025-11-23 09:46:25.022671765 +0000 UTC m=+0.074328845 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Nov 23 09:46:25 np0005532585.localdomain podman[286176]: 2025-11-23 09:46:25.117423288 +0000 UTC m=+0.169080388 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:46:25 np0005532585.localdomain podman[286175]: 2025-11-23 09:46:25.157374057 +0000 UTC m=+0.208177211 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:46:25 np0005532585.localdomain sshd[286219]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:46:25 np0005532585.localdomain sshd[286219]: Accepted publickey for tripleo-admin from 192.168.122.11 port 59110 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 23 09:46:25 np0005532585.localdomain systemd-logind[761]: New session 62 of user tripleo-admin.
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 23 09:46:25 np0005532585.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Queued start job for default target Main User Target.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Created slice User Application Slice.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 09:46:26 np0005532585.localdomain systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 23 09:46:26 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 09:46:26 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Reached target Paths.
Nov 23 09:46:26 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Reached target Timers.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Starting D-Bus User Message Bus Socket...
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Starting Create User's Volatile Files and Directories...
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Listening on D-Bus User Message Bus Socket.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Finished Create User's Volatile Files and Directories.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Reached target Sockets.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Reached target Basic System.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Reached target Main User Target.
Nov 23 09:46:26 np0005532585.localdomain systemd[286223]: Startup finished in 164ms.
Nov 23 09:46:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:46:26 np0005532585.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 23 09:46:26 np0005532585.localdomain systemd[1]: Started Session 62 of User tripleo-admin.
Nov 23 09:46:26 np0005532585.localdomain sshd[286219]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 23 09:46:26 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 09:46:26 np0005532585.localdomain podman[286239]: 2025-11-23 09:46:26.28218024 +0000 UTC m=+0.076050569 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 23 09:46:26 np0005532585.localdomain podman[286239]: 2025-11-23 09:46:26.31450501 +0000 UTC m=+0.108375309 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:46:26 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:46:26 np0005532585.localdomain sudo[286382]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyseizpeutdcyqtgirmsldtqbvojnjcv ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763891186.2844248-60542-119609627376065/AnsiballZ_blockinfile.py
Nov 23 09:46:26 np0005532585.localdomain sudo[286382]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 09:46:26 np0005532585.localdomain python3[286384]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:46:26 np0005532585.localdomain sudo[286382]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:27 np0005532585.localdomain sudo[286526]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isfozvxyuebooujnwfrfcxrjcycstwsr ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763891187.0811882-60556-269768322306146/AnsiballZ_systemd.py
Nov 23 09:46:27 np0005532585.localdomain sudo[286526]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 09:46:27 np0005532585.localdomain python3[286528]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 09:46:27 np0005532585.localdomain systemd[1]: Stopping Netfilter Tables...
Nov 23 09:46:27 np0005532585.localdomain systemd[1]: nftables.service: Deactivated successfully.
Nov 23 09:46:27 np0005532585.localdomain systemd[1]: Stopped Netfilter Tables.
Nov 23 09:46:27 np0005532585.localdomain systemd[1]: Starting Netfilter Tables...
Nov 23 09:46:28 np0005532585.localdomain systemd[1]: Finished Netfilter Tables.
Nov 23 09:46:28 np0005532585.localdomain sudo[286526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:28.552 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:28.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:28.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:28.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:28.597 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:28.598 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:46:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:46:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:46:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:46:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:46:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:46:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:46:31 np0005532585.localdomain systemd[1]: tmp-crun.T3WxNd.mount: Deactivated successfully.
Nov 23 09:46:31 np0005532585.localdomain podman[286553]: 2025-11-23 09:46:31.024595925 +0000 UTC m=+0.071019202 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:46:31 np0005532585.localdomain systemd[1]: tmp-crun.dMw8XN.mount: Deactivated successfully.
Nov 23 09:46:31 np0005532585.localdomain podman[286553]: 2025-11-23 09:46:31.042292889 +0000 UTC m=+0.088716166 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 09:46:31 np0005532585.localdomain podman[286554]: 2025-11-23 09:46:31.045727356 +0000 UTC m=+0.087769296 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:46:31 np0005532585.localdomain podman[286554]: 2025-11-23 09:46:31.057327468 +0000 UTC m=+0.099369408 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:46:31 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:46:31 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:46:33 np0005532585.localdomain sudo[286597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:33 np0005532585.localdomain sudo[286597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:33 np0005532585.localdomain sudo[286597]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:33.628 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:33.631 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:33.631 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:33.632 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:33.632 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:33.636 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:34 np0005532585.localdomain sudo[286615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:46:34 np0005532585.localdomain sudo[286615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:34 np0005532585.localdomain sudo[286615]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:34 np0005532585.localdomain sudo[286633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:46:34 np0005532585.localdomain sudo[286633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:35 np0005532585.localdomain sudo[286633]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:35 np0005532585.localdomain sudo[286683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:35 np0005532585.localdomain sudo[286683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:35 np0005532585.localdomain sudo[286683]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:36 np0005532585.localdomain sudo[286701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:36 np0005532585.localdomain sudo[286701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:36 np0005532585.localdomain sudo[286701]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:37 np0005532585.localdomain sudo[286719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:37 np0005532585.localdomain sudo[286719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:37 np0005532585.localdomain sudo[286719]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:38 np0005532585.localdomain sudo[286737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:38 np0005532585.localdomain sudo[286737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:38 np0005532585.localdomain sudo[286737]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:38.668 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:38.670 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:38.670 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:38.671 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:38.671 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:38.674 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:39 np0005532585.localdomain sudo[286755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:39 np0005532585.localdomain sudo[286755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:39 np0005532585.localdomain sudo[286755]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:46:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:46:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:46:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1"
Nov 23 09:46:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:46:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17239 "" "Go-http-client/1.1"
Nov 23 09:46:43 np0005532585.localdomain sudo[286773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:46:43 np0005532585.localdomain sudo[286773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:43 np0005532585.localdomain sudo[286773]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:43 np0005532585.localdomain sudo[286791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:46:43 np0005532585.localdomain sudo[286791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:43.674 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:43 np0005532585.localdomain podman[286850]: 2025-11-23 09:46:43.968163434 +0000 UTC m=+0.075306006 container create ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, io.buildah.version=1.33.12, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=553)
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: Started libpod-conmon-ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8.scope.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:46:44 np0005532585.localdomain podman[286850]: 2025-11-23 09:46:43.9344661 +0000 UTC m=+0.041608712 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:46:44 np0005532585.localdomain podman[286850]: 2025-11-23 09:46:44.046035249 +0000 UTC m=+0.153177781 container init ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.33.12, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph)
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: tmp-crun.ByAWGl.mount: Deactivated successfully.
Nov 23 09:46:44 np0005532585.localdomain podman[286850]: 2025-11-23 09:46:44.059693756 +0000 UTC m=+0.166836328 container start ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:46:44 np0005532585.localdomain podman[286850]: 2025-11-23 09:46:44.060089488 +0000 UTC m=+0.167232050 container attach ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.buildah.version=1.33.12, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7)
Nov 23 09:46:44 np0005532585.localdomain nice_cartwright[286865]: 167 167
Nov 23 09:46:44 np0005532585.localdomain podman[286850]: 2025-11-23 09:46:44.064165395 +0000 UTC m=+0.171308007 container died ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: libpod-ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8.scope: Deactivated successfully.
Nov 23 09:46:44 np0005532585.localdomain podman[286870]: 2025-11-23 09:46:44.13916087 +0000 UTC m=+0.066183860 container remove ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: libpod-conmon-ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8.scope: Deactivated successfully.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:46:44 np0005532585.localdomain podman[286876]: 2025-11-23 09:46:44.228187545 +0000 UTC m=+0.138404870 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 09:46:44 np0005532585.localdomain podman[286876]: 2025-11-23 09:46:44.24020417 +0000 UTC m=+0.150421455 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 23 09:46:44 np0005532585.localdomain systemd-rc-local-generator[286927]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:46:44 np0005532585.localdomain systemd-sysv-generator[286933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cc1e8b49fe197506885064802507ef8d24cd2ade0c0b1ad450dc6cb2383b9ee8-merged.mount: Deactivated successfully.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:46:44 np0005532585.localdomain systemd-sysv-generator[286975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:46:44 np0005532585.localdomain systemd-rc-local-generator[286972]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:46:44 np0005532585.localdomain systemd[1]: Starting Ceph mds.mds.np0005532585.jcltnl for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 09:46:45 np0005532585.localdomain podman[287034]: 
Nov 23 09:46:45 np0005532585.localdomain podman[287034]: 2025-11-23 09:46:45.33773309 +0000 UTC m=+0.067965296 container create 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Nov 23 09:46:45 np0005532585.localdomain systemd[1]: tmp-crun.rmcvku.mount: Deactivated successfully.
Nov 23 09:46:45 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:46:45 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 09:46:45 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 09:46:45 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/var/lib/ceph/mds/ceph-mds.np0005532585.jcltnl supports timestamps until 2038 (0x7fffffff)
Nov 23 09:46:45 np0005532585.localdomain podman[287034]: 2025-11-23 09:46:45.401729351 +0000 UTC m=+0.131961567 container init 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:46:45 np0005532585.localdomain podman[287034]: 2025-11-23 09:46:45.306447282 +0000 UTC m=+0.036679518 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:46:45 np0005532585.localdomain podman[287034]: 2025-11-23 09:46:45.411188167 +0000 UTC m=+0.141420383 container start 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:46:45 np0005532585.localdomain bash[287034]: 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7
Nov 23 09:46:45 np0005532585.localdomain systemd[1]: Started Ceph mds.mds.np0005532585.jcltnl for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 09:46:45 np0005532585.localdomain sudo[286791]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:45 np0005532585.localdomain ceph-mds[287052]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 09:46:45 np0005532585.localdomain ceph-mds[287052]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Nov 23 09:46:45 np0005532585.localdomain ceph-mds[287052]: main not setting numa affinity
Nov 23 09:46:45 np0005532585.localdomain ceph-mds[287052]: pidfile_write: ignore empty --pid-file
Nov 23 09:46:45 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl[287048]: starting mds.mds.np0005532585.jcltnl at 
Nov 23 09:46:45 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Updating MDS map to version 7 from mon.0
Nov 23 09:46:46 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Updating MDS map to version 8 from mon.0
Nov 23 09:46:46 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Monitors have assigned me to become a standby.
Nov 23 09:46:48 np0005532585.localdomain sudo[287072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:48 np0005532585.localdomain sudo[287072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:48 np0005532585.localdomain sudo[287072]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:48 np0005532585.localdomain sudo[287090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:46:48 np0005532585.localdomain sudo[287090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:48 np0005532585.localdomain sudo[287090]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:48 np0005532585.localdomain sudo[287108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:46:48 np0005532585.localdomain sudo[287108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:48.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:48.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:48.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:48.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:48.729 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:48.730 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:49 np0005532585.localdomain systemd[1]: tmp-crun.NCQ8yZ.mount: Deactivated successfully.
Nov 23 09:46:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:46:49 np0005532585.localdomain podman[287199]: 2025-11-23 09:46:49.348007312 +0000 UTC m=+0.109781113 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, architecture=x86_64, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:46:49 np0005532585.localdomain systemd[1]: tmp-crun.wiB9nM.mount: Deactivated successfully.
Nov 23 09:46:49 np0005532585.localdomain podman[287217]: 2025-11-23 09:46:49.443152047 +0000 UTC m=+0.086772024 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:46:49 np0005532585.localdomain podman[287199]: 2025-11-23 09:46:49.47904271 +0000 UTC m=+0.240816491 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Nov 23 09:46:49 np0005532585.localdomain podman[287217]: 2025-11-23 09:46:49.530381026 +0000 UTC m=+0.174001103 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:46:49 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:46:49 np0005532585.localdomain sudo[287108]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:50 np0005532585.localdomain sshd[285847]: Received disconnect from 38.102.83.114 port 47340:11: disconnected by user
Nov 23 09:46:50 np0005532585.localdomain sshd[285847]: Disconnected from user zuul 38.102.83.114 port 47340
Nov 23 09:46:50 np0005532585.localdomain sshd[285820]: pam_unix(sshd:session): session closed for user zuul
Nov 23 09:46:50 np0005532585.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Nov 23 09:46:50 np0005532585.localdomain systemd-logind[761]: Session 61 logged out. Waiting for processes to exit.
Nov 23 09:46:50 np0005532585.localdomain systemd-logind[761]: Removed session 61.
Nov 23 09:46:50 np0005532585.localdomain sudo[287306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:50 np0005532585.localdomain sudo[287306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:50 np0005532585.localdomain sudo[287306]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:50 np0005532585.localdomain sudo[287324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:46:50 np0005532585.localdomain sudo[287324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:46:50 np0005532585.localdomain sudo[287324]: pam_unix(sudo:session): session closed for user root
Nov 23 09:46:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:53.731 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4988-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:53.733 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:53.734 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:53.734 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:53.758 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:53.758 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:46:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:46:56 np0005532585.localdomain podman[287343]: 2025-11-23 09:46:56.05345408 +0000 UTC m=+0.089867491 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 23 09:46:56 np0005532585.localdomain systemd[1]: tmp-crun.kish34.mount: Deactivated successfully.
Nov 23 09:46:56 np0005532585.localdomain podman[287343]: 2025-11-23 09:46:56.099360795 +0000 UTC m=+0.135774196 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350)
Nov 23 09:46:56 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:46:56 np0005532585.localdomain podman[287342]: 2025-11-23 09:46:56.109468862 +0000 UTC m=+0.146231684 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:46:56 np0005532585.localdomain podman[287342]: 2025-11-23 09:46:56.195523403 +0000 UTC m=+0.232286195 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:46:56 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:46:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:46:57 np0005532585.localdomain podman[287388]: 2025-11-23 09:46:57.025583849 +0000 UTC m=+0.082243772 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:46:57 np0005532585.localdomain podman[287388]: 2025-11-23 09:46:57.057224818 +0000 UTC m=+0.113884671 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 23 09:46:57 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:46:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:58.759 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:58.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:46:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:58.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:46:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:58.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:58.783 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:46:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:46:58.783 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:46:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:46:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:47:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:47:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:47:02 np0005532585.localdomain podman[287407]: 2025-11-23 09:47:02.025665292 +0000 UTC m=+0.077153874 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:47:02 np0005532585.localdomain podman[287407]: 2025-11-23 09:47:02.031482814 +0000 UTC m=+0.082971346 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:47:02 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:47:02 np0005532585.localdomain podman[287406]: 2025-11-23 09:47:02.072831877 +0000 UTC m=+0.126910520 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:47:02 np0005532585.localdomain podman[287406]: 2025-11-23 09:47:02.082734286 +0000 UTC m=+0.136812919 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:47:02 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:47:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:03.784 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:03.787 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:03.787 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:03.787 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:03.825 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:03.826 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.368 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.369 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.389 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.390 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.390 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.949 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.950 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.950 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:47:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:04.950 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.290 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.305 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.305 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.306 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.306 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.306 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.307 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.307 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.308 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.308 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.308 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.322 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.322 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.322 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.323 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.324 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.758 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.864 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:47:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:05.865 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.096 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.098 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12258MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.098 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.099 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.177 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.177 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.178 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.242 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.730 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.737 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.755 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
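The inventory record above determines the capacity Placement will schedule against: for each resource class, effective capacity is `(total - reserved) * allocation_ratio`. A minimal sketch of that arithmetic, using the exact figures from the log line (the dict below is trimmed to the three fields the formula uses; the helper name `capacity` is illustrative, not a Nova API):

```python
# Inventory as reported for provider dae70d62-10f4-474c-9782-8c926a3641d5,
# reduced to the fields that feed the capacity formula.
inventory = {
    'VCPU':      {'total': 8,     'reserved': 0,   'allocation_ratio': 16.0},
    'MEMORY_MB': {'total': 15738, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 41,    'reserved': 1,   'allocation_ratio': 1.0},
}

def capacity(inv):
    """Effective schedulable capacity: (total - reserved) * allocation_ratio."""
    return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inv.items()}

print(capacity(inventory))  # {'VCPU': 128, 'MEMORY_MB': 15226, 'DISK_GB': 40}
```

So although the host has 8 physical vCPUs (1 of them allocated, per the "Final resource view" line), the 16.0 overcommit ratio lets Placement hand out up to 128 VCPU units against this provider.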
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.758 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:47:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:06.759 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:47:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:08.827 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:08.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:08.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:08.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:08.856 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:08.857 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:47:09.283 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:47:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:47:09.283 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:47:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:47:09.284 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.806 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.807 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.811 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc4d7f9e-60dd-4572-9c5a-753e98c9e72d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.807997', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61dad960-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '153046d01261987070cb728de764f01b2d08d01e23aed81696329a8e0bbd0fd6'}]}, 'timestamp': '2025-11-23 09:47:10.812682', '_unique_id': 'a1f97662b7aa44e4a6294ead547a12e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging 
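The `kombu.exceptions.OperationalError` above is a re-raise wrapper: the root cause at the bottom of the inner traceback is a plain socket-level `ConnectionRefusedError` (`errno 111`, ECONNREFUSED), meaning nothing is accepting TCP connections at the RabbitMQ endpoint the agent is configured with. A minimal sketch reproducing that underlying failure against a deliberately closed local port (the port-discovery trick here is for illustration only and has nothing to do with the deployment's actual broker address):

```python
import errno
import socket

# Find a TCP port with no listener: bind an ephemeral port, record it, close it.
probe = socket.socket()
probe.bind(('127.0.0.1', 0))
unused_port = probe.getsockname()[1]
probe.close()

# Connecting to a port with no listener fails exactly like the AMQP transport
# in the traceback: ConnectionRefusedError, errno 111 (ECONNREFUSED) on Linux.
caught = None
try:
    socket.create_connection(('127.0.0.1', unused_port), timeout=1)
except ConnectionRefusedError as exc:
    caught = exc.errno

assert caught == errno.ECONNREFUSED
```

In other words, the repeated ceilometer notification failures in this log are not a telemetry bug per se; they indicate the message broker is down or unreachable from this host, and every pollster sample send fails the same way until it comes back.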
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.843 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd68c03f8-7bf9-40a3-9bc3-0c9d55c02eb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.815637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61df8e56-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '86c673bb84eafca19b8c0c032e01cea23a8507061039a59adff0f819903e9de5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.815637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61dfa3f0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '57c2f1bf6e8d5178b5dcda3e842a3c4707cc9956c7f378063cb264f1ca8d7d2c'}]}, 'timestamp': '2025-11-23 09:47:10.844054', '_unique_id': 'd85a36c13dad44d3844868b06fbf8190'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.846 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4260166b-c13e-4ff0-ad72-186a3222914a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.846480', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e0170e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '8b24e470b00749410d28c51bc2fd35e5a77ea0d1d78c02332ae834c152efeee2'}]}, 'timestamp': '2025-11-23 09:47:10.847019', '_unique_id': '297337742640457783612de1a01fd73c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.849 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.860 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.860 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99d96acc-d82c-46e6-ba19-6c5765aac249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.849156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e23232-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '98ab590cf7d23d3c211d4d71cdb79a59032c87a278cf77dc519d566a978119dd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.849156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e2448e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': 'ca49b57a14b6052aa752b472040bdcea24790ddaf920b2a5aa6cb47b7472d74f'}]}, 'timestamp': '2025-11-23 09:47:10.861225', '_unique_id': '496b329fc17c468c90541f160d3e613a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.863 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5d72883-6e78-40ba-a9e2-491abb0b11eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:47:10.863951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '61e53cac-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.057662842, 'message_signature': '679f239ea21f233a91e8f91079a954c3a8efcd9bc29c9fa97ce5033f1be8bd6e'}]}, 'timestamp': '2025-11-23 09:47:10.880720', '_unique_id': '95cf2a96ebf846fe9f2cc8ca4d2f4bbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f761a3e-7a1e-444c-b13c-d322fb5792da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.883108', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e5ac8c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '4342d2a4c0c16110ffcfee919457f1c7476c0023adf47467f65eb338ba771bae'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.883108', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e5c082-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '9710c1209b349c199dfd40a365e90fe1f862d191f9164fd333ffd83d292c078c'}]}, 'timestamp': '2025-11-23 09:47:10.884091', '_unique_id': 'e90304783c6b4b4e82e72d6934d3e141'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8491bf1b-8cd4-4caa-ae89-4f0c7689a8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.886226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e62626-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '5e13bf1227d8f4b6024cc4283bfde79f30f72b27b5e7b702cb71fd55d8a575e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.886226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e63878-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': 'd167241da8f7dc2b28dfac293c247484216085f74223e34a831fcb550cf29454'}]}, 'timestamp': '2025-11-23 09:47:10.887130', '_unique_id': 'e300482d2985481a8b3d4fb50b8b594c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10a0242c-cbd2-4d89-88af-8c0445e63993', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.889228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e69b4c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '254638100d0a020764115ed6e8702dabaaaed5b9a6d51dca9fd2df8725866a13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.889228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e6aace-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '8558273ff6765667f5365165c6d31073de4eceb1d93cb2939bdc2d9127f5fb83'}]}, 'timestamp': '2025-11-23 09:47:10.890084', '_unique_id': '22e0c66dcbe0423aac806ee5319027ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5a1afc9-ede7-4b79-b897-96be4e2501e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.892206', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e70ffa-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '4d7cb9dcb96cd45a6b733483b30b50d13b02d6b11ee06bf98aef5c5b187f9fa7'}]}, 'timestamp': '2025-11-23 09:47:10.892844', '_unique_id': '44796cc61ebd4525b2c41eca2d75adf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 12260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26c6ac29-3994-494e-91d6-2edad82bd220', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12260000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:47:10.895054', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '61e77eb8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.057662842, 'message_signature': '145086982d46d066ff050cf7adcf304a4cfc3fe0515f338e7e49f78f01f48c3d'}]}, 'timestamp': '2025-11-23 09:47:10.895489', '_unique_id': '67acf687a9ea466ca0e2a11c9ba75ece'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.898 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b691d083-03b5-4bd3-8450-0e1cb7cb8e6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.898020', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e7f4b0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '9fbc1246c075b5485fe803ea5ef9e0c1c1e9aacba2b9f33c7ef98d42d6de2b35'}]}, 'timestamp': '2025-11-23 09:47:10.898527', '_unique_id': '72b7cc3f93e248efa9e217656aa9e000'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.901 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12c70c5b-7ac4-48c3-93dd-fb65168546db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.900614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e85806-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': 'c3dab2fec701e0986424bbb1365c6a5e31b9ece3ff7789f7730a5407feabbe47'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.900614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e86918-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '1c8f01564471c88c7f5bd35e487e4e5169a1731334574fb40889c2209cea70e5'}]}, 'timestamp': '2025-11-23 09:47:10.901472', '_unique_id': '851e90c2809746008ee1f56124c8f721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '678b2ed3-f49f-421f-9eb9-11df3e6a4fe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.904059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e8de7a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '73c7d911c792986671ee21a713529438f4b7162aa3d46ac912c200d7b36fc36e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.904059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e8edde-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '9f2637535b122e0ce2aa4fe58eca8286888adab63681ad69102a78074cde263d'}]}, 'timestamp': '2025-11-23 09:47:10.905050', '_unique_id': 'e8064fe8655f40ff8e75d246239575a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd85aca62-69df-4952-8929-56b042b025c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.907158', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e95c6a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '64715d82f5f63dbd0982227e8a5d543c38c7eee384b92587ce4dbb3d8cbc1bb1'}]}, 'timestamp': '2025-11-23 09:47:10.907734', '_unique_id': '3d0ccc921c2d4380b8b02d8a36f3104d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7205eb91-c94f-4725-b6d3-0c5dc77ce618', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.909823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e9c13c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '983be157d49c15fb35bdfcd5f110dc3c342596a0952a4d0436617b204d9a6cd2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.909823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e9d0c8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '56876d15094b89e90e8edf99e822e88dfc381879797d7d0cef9843d04b4d77ee'}]}, 'timestamp': '2025-11-23 09:47:10.910704', '_unique_id': 'f850df4dfedc4c63b7c2c35e32cf69e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9977cf38-6fb9-460f-acfa-61db44026d48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.912764', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61ea33d8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '638a41587943ee3ade8da5092f58891d605326204b0a1e1207d8a9833b81320c'}]}, 'timestamp': '2025-11-23 09:47:10.913246', '_unique_id': '39fde533472a439d9b7a192d423454e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5598d5ab-f7d6-4320-b6d3-b8d98707ffa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.915424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61ea9a58-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': 'cd8bd65d1f5eec5668234e95c6b41f298e36000c4d5eae1893d45b04e7c262ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.915424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61eaab2e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '65ff701e6a61a356fd421ab4102193856c9f28671b0a488efa689feb22ee4bd3'}]}, 'timestamp': '2025-11-23 09:47:10.916274', '_unique_id': '6e2e8753b79b4501ab9e34a8ceebc52c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96b37361-8188-4d96-b083-2460af3555ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.918363', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61eb0d76-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'c33359896eeee6a9d2604b6d6c0e4220031aa91f95c48be60580453d2bcc488f'}]}, 'timestamp': '2025-11-23 09:47:10.918849', '_unique_id': '85c1018ac76f4dffb3e48ab22f74a239'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f81efef4-12df-4fb0-877c-bc665376d0ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.920654', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61eb635c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'c85025e299b41aad205957ec6936dc9c360a7a49678a211924ce7d1f6d830cf4'}]}, 'timestamp': '2025-11-23 09:47:10.920957', '_unique_id': 'd97d4a8b44d94335801114ad6a994f1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a36214-4971-4798-bd82-22ac83e3dd84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.922209', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61eba01a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'e0316ca16bf0b71f1673f53800d24348d37e906122223457656d4dcef6c20c52'}]}, 'timestamp': '2025-11-23 09:47:10.922486', '_unique_id': '666bf906010c44eb98a0d86bfa9350c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2eb93ee-b0b4-4243-85be-b6a7385daeed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.923794', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61ebdfda-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'bd0306c9dfe5ad580974be76c41725007d39b37c9d773b6c7db1207cbf0968db'}]}, 'timestamp': '2025-11-23 09:47:10.924122', '_unique_id': '24e6e3810e214f8b99bf0907d7fdf285'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:47:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:47:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:47:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:47:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:47:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1"
Nov 23 09:47:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:47:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17717 "" "Go-http-client/1.1"
Nov 23 09:47:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:13.858 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:47:15 np0005532585.localdomain podman[287491]: 2025-11-23 09:47:15.026047936 +0000 UTC m=+0.082237373 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 23 09:47:15 np0005532585.localdomain podman[287491]: 2025-11-23 09:47:15.035231253 +0000 UTC m=+0.091420720 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:47:15 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:47:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:18.861 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:18.863 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:18.863 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:18.863 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:18.908 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:18.908 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:47:20 np0005532585.localdomain podman[287510]: 2025-11-23 09:47:20.026000666 +0000 UTC m=+0.078375142 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:47:20 np0005532585.localdomain podman[287510]: 2025-11-23 09:47:20.033826551 +0000 UTC m=+0.086200997 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:47:20 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:47:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:23.908 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:23.910 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:23.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:23.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:23.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:23.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:26 np0005532585.localdomain sudo[287534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:47:26 np0005532585.localdomain sudo[287534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:47:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:47:26 np0005532585.localdomain sudo[287534]: pam_unix(sudo:session): session closed for user root
Nov 23 09:47:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:47:26 np0005532585.localdomain podman[287552]: 2025-11-23 09:47:26.711267824 +0000 UTC m=+0.074232903 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:47:26 np0005532585.localdomain podman[287553]: 2025-11-23 09:47:26.771213158 +0000 UTC m=+0.125372151 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Nov 23 09:47:26 np0005532585.localdomain podman[287553]: 2025-11-23 09:47:26.783196754 +0000 UTC m=+0.137355707 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, 
architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 09:47:26 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:47:26 np0005532585.localdomain podman[287552]: 2025-11-23 09:47:26.824805154 +0000 UTC m=+0.187770233 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:47:26 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:47:27 np0005532585.localdomain sshd[286245]: Received disconnect from 192.168.122.11 port 59110:11: disconnected by user
Nov 23 09:47:27 np0005532585.localdomain sshd[286245]: Disconnected from user tripleo-admin 192.168.122.11 port 59110
Nov 23 09:47:27 np0005532585.localdomain sshd[286219]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 23 09:47:27 np0005532585.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Nov 23 09:47:27 np0005532585.localdomain systemd[1]: session-62.scope: Consumed 1.256s CPU time.
Nov 23 09:47:27 np0005532585.localdomain systemd-logind[761]: Session 62 logged out. Waiting for processes to exit.
Nov 23 09:47:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:47:27 np0005532585.localdomain systemd-logind[761]: Removed session 62.
Nov 23 09:47:27 np0005532585.localdomain podman[287597]: 2025-11-23 09:47:27.389776841 +0000 UTC m=+0.085811175 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:47:27 np0005532585.localdomain podman[287597]: 2025-11-23 09:47:27.421121511 +0000 UTC m=+0.117155825 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:47:27 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:47:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:28.946 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:28.947 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:28.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:28.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:28.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:28.989 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:47:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:47:31 np0005532585.localdomain sshd[287614]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:47:31 np0005532585.localdomain sshd[287614]: Invalid user ahsan from 207.154.194.2 port 35488
Nov 23 09:47:31 np0005532585.localdomain sshd[287614]: Received disconnect from 207.154.194.2 port 35488:11: Bye Bye [preauth]
Nov 23 09:47:31 np0005532585.localdomain sshd[287614]: Disconnected from invalid user ahsan 207.154.194.2 port 35488 [preauth]
Nov 23 09:47:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:47:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:47:32 np0005532585.localdomain systemd[1]: tmp-crun.L1jFdl.mount: Deactivated successfully.
Nov 23 09:47:32 np0005532585.localdomain podman[287616]: 2025-11-23 09:47:32.306972271 +0000 UTC m=+0.089187550 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd)
Nov 23 09:47:32 np0005532585.localdomain podman[287616]: 2025-11-23 09:47:32.317762089 +0000 UTC m=+0.099977388 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 09:47:32 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:47:32 np0005532585.localdomain systemd[1]: tmp-crun.V7sbez.mount: Deactivated successfully.
Nov 23 09:47:32 np0005532585.localdomain podman[287617]: 2025-11-23 09:47:32.374846094 +0000 UTC m=+0.152920373 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:47:32 np0005532585.localdomain podman[287617]: 2025-11-23 09:47:32.382856504 +0000 UTC m=+0.160930793 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:47:32 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:47:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:33.989 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:33.991 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:33.992 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:33.992 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:34.019 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:34.020 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:36 np0005532585.localdomain sudo[287658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:47:36 np0005532585.localdomain sudo[287658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:47:36 np0005532585.localdomain sudo[287658]: pam_unix(sudo:session): session closed for user root
Nov 23 09:47:36 np0005532585.localdomain sudo[287676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:47:36 np0005532585.localdomain sudo[287676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:47:36 np0005532585.localdomain sudo[287676]: pam_unix(sudo:session): session closed for user root
Nov 23 09:47:36 np0005532585.localdomain sudo[287694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:47:36 np0005532585.localdomain sudo[287694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:47:37 np0005532585.localdomain sudo[287694]: pam_unix(sudo:session): session closed for user root
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Activating special unit Exit the Session...
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped target Main User Target.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped target Basic System.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped target Paths.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped target Sockets.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped target Timers.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Closed D-Bus User Message Bus Socket.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Stopped Create User's Volatile Files and Directories.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Removed slice User Application Slice.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Reached target Shutdown.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Finished Exit the Session.
Nov 23 09:47:37 np0005532585.localdomain systemd[286223]: Reached target Exit the Session.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 23 09:47:37 np0005532585.localdomain systemd[1]: user-1003.slice: Consumed 1.665s CPU time.
Nov 23 09:47:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:39.020 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:39.023 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:39.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:39.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:39.051 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:39.052 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:39 np0005532585.localdomain sudo[287745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:47:39 np0005532585.localdomain sudo[287745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:47:39 np0005532585.localdomain sudo[287745]: pam_unix(sudo:session): session closed for user root
Nov 23 09:47:39 np0005532585.localdomain sudo[287763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:47:39 np0005532585.localdomain sudo[287763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:47:39 np0005532585.localdomain sudo[287763]: pam_unix(sudo:session): session closed for user root
Nov 23 09:47:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:47:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:47:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:47:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1"
Nov 23 09:47:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:47:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17716 "" "Go-http-client/1.1"
Nov 23 09:47:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:44.053 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:47:46 np0005532585.localdomain podman[287781]: 2025-11-23 09:47:46.027634338 +0000 UTC m=+0.078395822 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible)
Nov 23 09:47:46 np0005532585.localdomain podman[287781]: 2025-11-23 09:47:46.041691808 +0000 UTC m=+0.092453322 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:47:46 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:47:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:49.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:49.057 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:49.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:49.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:49.087 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:49.087 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:47:51 np0005532585.localdomain podman[287799]: 2025-11-23 09:47:51.031600821 +0000 UTC m=+0.081381285 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:47:51 np0005532585.localdomain podman[287799]: 2025-11-23 09:47:51.069466846 +0000 UTC m=+0.119247280 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:47:51 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:47:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:54.088 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:54.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:54.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:54.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:54.118 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:54.119 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:47:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:47:57 np0005532585.localdomain systemd[1]: tmp-crun.3HYSfZ.mount: Deactivated successfully.
Nov 23 09:47:57 np0005532585.localdomain podman[287822]: 2025-11-23 09:47:57.034863805 +0000 UTC m=+0.087674503 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 23 09:47:57 np0005532585.localdomain podman[287823]: 2025-11-23 09:47:57.106290558 +0000 UTC m=+0.155170023 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:47:57 np0005532585.localdomain podman[287822]: 2025-11-23 09:47:57.110293223 +0000 UTC m=+0.163103991 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 23 09:47:57 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:47:57 np0005532585.localdomain podman[287823]: 2025-11-23 09:47:57.122581508 +0000 UTC m=+0.171460973 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 23 09:47:57 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:47:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:47:58 np0005532585.localdomain podman[287867]: 2025-11-23 09:47:58.025402729 +0000 UTC m=+0.079803337 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:47:58 np0005532585.localdomain podman[287867]: 2025-11-23 09:47:58.034461382 +0000 UTC m=+0.088861990 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:47:58 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.120 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.149 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.150 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.232 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.232 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 09:47:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:47:59.246 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:47:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:47:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:48:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:00.257 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.238 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.701 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.785 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:48:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:01.786 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.022 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.024 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12247MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.024 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.025 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.138 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.139 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.139 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.297 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.749 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.758 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.776 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.778 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:48:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:02.779 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:48:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:48:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:48:03 np0005532585.localdomain podman[287929]: 2025-11-23 09:48:03.041883486 +0000 UTC m=+0.094622560 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:48:03 np0005532585.localdomain podman[287929]: 2025-11-23 09:48:03.055632386 +0000 UTC m=+0.108371470 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 09:48:03 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:48:03 np0005532585.localdomain systemd[1]: tmp-crun.kqYJx6.mount: Deactivated successfully.
Nov 23 09:48:03 np0005532585.localdomain podman[287930]: 2025-11-23 09:48:03.148799749 +0000 UTC m=+0.195752562 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:48:03 np0005532585.localdomain podman[287930]: 2025-11-23 09:48:03.156408916 +0000 UTC m=+0.203361739 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:48:03 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.154 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.154 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.177 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.178 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.780 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.781 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.781 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.782 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.985 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.986 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.986 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:48:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:04.987 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.503 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.520 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.520 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.521 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.522 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.522 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.523 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:05.523 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:48:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:06.217 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:48:07 np0005532585.localdomain sudo[287970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:07 np0005532585.localdomain sudo[287970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:07 np0005532585.localdomain sudo[287970]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:08 np0005532585.localdomain sudo[287988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:08 np0005532585.localdomain sudo[287988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:08 np0005532585.localdomain sudo[287988]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:09.178 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:09.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:09.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:09.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:09.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:09.210 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:48:09.284 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:48:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:48:09.285 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:48:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:48:09.285 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:48:09 np0005532585.localdomain sudo[288006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:09 np0005532585.localdomain sudo[288006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:09 np0005532585.localdomain sudo[288006]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:48:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:48:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:48:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1"
Nov 23 09:48:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:48:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17721 "" "Go-http-client/1.1"
Nov 23 09:48:13 np0005532585.localdomain sudo[288024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:48:13 np0005532585.localdomain sudo[288024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:13 np0005532585.localdomain sudo[288024]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:13 np0005532585.localdomain sudo[288042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:13 np0005532585.localdomain sudo[288042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 2025-11-23 09:48:13.788044279 +0000 UTC m=+0.050857823 container create 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., release=553)
Nov 23 09:48:13 np0005532585.localdomain systemd[1]: Started libpod-conmon-14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3.scope.
Nov 23 09:48:13 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 2025-11-23 09:48:13.859156532 +0000 UTC m=+0.121970006 container init 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main)
Nov 23 09:48:13 np0005532585.localdomain systemd[1]: tmp-crun.PevKIl.mount: Deactivated successfully.
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 2025-11-23 09:48:13.769317633 +0000 UTC m=+0.032131087 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 2025-11-23 09:48:13.87092462 +0000 UTC m=+0.133738094 container start 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True)
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 2025-11-23 09:48:13.871188608 +0000 UTC m=+0.134002092 container attach 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, distribution-scope=public, ceph=True, version=7, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:48:13 np0005532585.localdomain relaxed_rhodes[288117]: 167 167
Nov 23 09:48:13 np0005532585.localdomain systemd[1]: libpod-14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3.scope: Deactivated successfully.
Nov 23 09:48:13 np0005532585.localdomain podman[288102]: 2025-11-23 09:48:13.874051277 +0000 UTC m=+0.136864781 container died 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, GIT_CLEAN=True, release=553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public)
Nov 23 09:48:13 np0005532585.localdomain podman[288122]: 2025-11-23 09:48:13.956058362 +0000 UTC m=+0.071512007 container remove 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., release=553, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:48:13 np0005532585.localdomain systemd[1]: libpod-conmon-14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3.scope: Deactivated successfully.
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:48:14 np0005532585.localdomain systemd-rc-local-generator[288160]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:48:14 np0005532585.localdomain systemd-sysv-generator[288166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:14.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:14.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:14.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:14.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:48:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:14.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:14.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-accaf8fa8c3faa14bcdf046d7f01e8737ee63d93c9f6e2663ab0e78253c9978c-merged.mount: Deactivated successfully.
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:48:14 np0005532585.localdomain systemd-rc-local-generator[288205]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:48:14 np0005532585.localdomain systemd-sysv-generator[288209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:14 np0005532585.localdomain systemd[1]: Starting Ceph mgr.np0005532585.gzafiw for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 09:48:15 np0005532585.localdomain podman[288268]: 
Nov 23 09:48:15 np0005532585.localdomain podman[288268]: 2025-11-23 09:48:15.226659103 +0000 UTC m=+0.081293953 container create ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw, RELEASE=main, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Nov 23 09:48:15 np0005532585.localdomain systemd[1]: tmp-crun.DoMo0M.mount: Deactivated successfully.
Nov 23 09:48:15 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:15 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:15 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:15 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/var/lib/ceph/mgr/ceph-np0005532585.gzafiw supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:15 np0005532585.localdomain podman[288268]: 2025-11-23 09:48:15.194483648 +0000 UTC m=+0.049118528 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:48:15 np0005532585.localdomain podman[288268]: 2025-11-23 09:48:15.300660618 +0000 UTC m=+0.155295428 container init ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git)
Nov 23 09:48:15 np0005532585.localdomain podman[288268]: 2025-11-23 09:48:15.311003921 +0000 UTC m=+0.165638731 container start ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64)
Nov 23 09:48:15 np0005532585.localdomain bash[288268]: ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a
Nov 23 09:48:15 np0005532585.localdomain systemd[1]: Started Ceph mgr.np0005532585.gzafiw for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: pidfile_write: ignore empty --pid-file
Nov 23 09:48:15 np0005532585.localdomain sudo[288042]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'alerts'
Nov 23 09:48:15 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:15.506+0000 7f5f26c11140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'balancer'
Nov 23 09:48:15 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:15.577+0000 7f5f26c11140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 09:48:15 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'cephadm'
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'crash'
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.220+0000 7f5f26c11140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'dashboard'
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'devicehealth'
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.769+0000 7f5f26c11140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]:   from numpy import show_config as show_numpy_config
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.911+0000 7f5f26c11140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'influx'
Nov 23 09:48:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:48:16 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.974+0000 7f5f26c11140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 09:48:16 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'insights'
Nov 23 09:48:17 np0005532585.localdomain podman[288317]: 2025-11-23 09:48:17.034678061 +0000 UTC m=+0.090078719 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'iostat'
Nov 23 09:48:17 np0005532585.localdomain podman[288317]: 2025-11-23 09:48:17.071390609 +0000 UTC m=+0.126791297 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:48:17 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:48:17 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:17.094+0000 7f5f26c11140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'k8sevents'
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'localpool'
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'mirroring'
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'nfs'
Nov 23 09:48:17 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:17.864+0000 7f5f26c11140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 09:48:17 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'orchestrator'
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.008+0000 7f5f26c11140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.077+0000 7f5f26c11140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'osd_support'
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.144+0000 7f5f26c11140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.220+0000 7f5f26c11140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'progress'
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.279+0000 7f5f26c11140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'prometheus'
Nov 23 09:48:18 np0005532585.localdomain sudo[288336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:18 np0005532585.localdomain sudo[288336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:18 np0005532585.localdomain sudo[288336]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:18 np0005532585.localdomain sudo[288354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:48:18 np0005532585.localdomain sudo[288354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:18 np0005532585.localdomain sudo[288354]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.581+0000 7f5f26c11140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'rbd_support'
Nov 23 09:48:18 np0005532585.localdomain sudo[288372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:48:18 np0005532585.localdomain sudo[288372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.661+0000 7f5f26c11140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'restful'
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'rgw'
Nov 23 09:48:18 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.987+0000 7f5f26c11140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 09:48:18 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'rook'
Nov 23 09:48:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:19.228 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:19.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:19.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:19.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:19.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:19.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:19 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.475+0000 7f5f26c11140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'selftest'
Nov 23 09:48:19 np0005532585.localdomain systemd[1]: tmp-crun.iBZHk6.mount: Deactivated successfully.
Nov 23 09:48:19 np0005532585.localdomain podman[288458]: 2025-11-23 09:48:19.524738314 +0000 UTC m=+0.108331578 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:48:19 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.544+0000 7f5f26c11140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'snap_schedule'
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'stats'
Nov 23 09:48:19 np0005532585.localdomain podman[288458]: 2025-11-23 09:48:19.637927934 +0000 UTC m=+0.221521138 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, release=553, GIT_BRANCH=main, name=rhceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12)
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'status'
Nov 23 09:48:19 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.755+0000 7f5f26c11140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'telegraf'
Nov 23 09:48:19 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.818+0000 7f5f26c11140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'telemetry'
Nov 23 09:48:19 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.955+0000 7f5f26c11140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 09:48:19 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 09:48:20 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:20.113+0000 7f5f26c11140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'volumes'
Nov 23 09:48:20 np0005532585.localdomain sudo[288372]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:20 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:20.327+0000 7f5f26c11140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'zabbix'
Nov 23 09:48:20 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:20.387+0000 7f5f26c11140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 23 09:48:20 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1698075890
Nov 23 09:48:20 np0005532585.localdomain sudo[288560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:20 np0005532585.localdomain sudo[288560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:20 np0005532585.localdomain sudo[288560]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:21 np0005532585.localdomain sudo[288578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:21 np0005532585.localdomain sudo[288578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:48:21 np0005532585.localdomain sudo[288578]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:21 np0005532585.localdomain podman[288596]: 2025-11-23 09:48:21.265316982 +0000 UTC m=+0.089705676 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:48:21 np0005532585.localdomain podman[288596]: 2025-11-23 09:48:21.304414325 +0000 UTC m=+0.128802979 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:48:21 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:48:21 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1698075890
Nov 23 09:48:22 np0005532585.localdomain sudo[288618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:22 np0005532585.localdomain sudo[288618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:22 np0005532585.localdomain sudo[288618]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:24.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:24.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:24.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:24.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:24.296 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:24.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:26 np0005532585.localdomain sudo[288636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:26 np0005532585.localdomain sudo[288636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288636]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:48:26 np0005532585.localdomain sudo[288654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288654]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:48:26 np0005532585.localdomain sudo[288672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288672]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:26 np0005532585.localdomain sudo[288690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288690]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:26 np0005532585.localdomain sudo[288708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288708]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:26 np0005532585.localdomain sudo[288726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:26 np0005532585.localdomain sudo[288760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288760]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:26 np0005532585.localdomain sudo[288778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:26 np0005532585.localdomain sudo[288778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:26 np0005532585.localdomain sudo[288778]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[288796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:48:27 np0005532585.localdomain sudo[288796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288796]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[288814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:48:27 np0005532585.localdomain sudo[288814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:48:27 np0005532585.localdomain sudo[288814]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:48:27 np0005532585.localdomain sudo[288839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:48:27 np0005532585.localdomain sudo[288839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288839]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain podman[288832]: 2025-11-23 09:48:27.256114415 +0000 UTC m=+0.090494190 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 09:48:27 np0005532585.localdomain podman[288833]: 2025-11-23 09:48:27.35059309 +0000 UTC m=+0.178591896 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 23 09:48:27 np0005532585.localdomain sudo[288877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:27 np0005532585.localdomain sudo[288877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288877]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain podman[288833]: 2025-11-23 09:48:27.360591303 +0000 UTC m=+0.188590149 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 09:48:27 np0005532585.localdomain podman[288832]: 2025-11-23 09:48:27.368797848 +0000 UTC m=+0.203177663 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:48:27 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:48:27 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:48:27 np0005532585.localdomain sudo[288912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:27 np0005532585.localdomain sudo[288912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288912]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[288930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:27 np0005532585.localdomain sudo[288930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288930]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[288964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:27 np0005532585.localdomain sudo[288964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288964]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[288982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:27 np0005532585.localdomain sudo[288982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[288982]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[289000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:27 np0005532585.localdomain sudo[289000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[289000]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[289018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:48:27 np0005532585.localdomain sudo[289018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[289018]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:27 np0005532585.localdomain sudo[289036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:48:27 np0005532585.localdomain sudo[289036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:27 np0005532585.localdomain sudo[289036]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289054]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:28 np0005532585.localdomain sudo[289072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:48:28 np0005532585.localdomain sudo[289072]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289096]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain podman[289090]: 2025-11-23 09:48:28.201195048 +0000 UTC m=+0.079932991 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:48:28 np0005532585.localdomain podman[289090]: 2025-11-23 09:48:28.207465354 +0000 UTC m=+0.086203307 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 09:48:28 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:48:28 np0005532585.localdomain sudo[289142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289142]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289160]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:48:28 np0005532585.localdomain sudo[289178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289178]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:48:28 np0005532585.localdomain sudo[289196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289196]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:48:28 np0005532585.localdomain sudo[289214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289214]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289232]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:28 np0005532585.localdomain sudo[289250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289250]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289268]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:28 np0005532585.localdomain sudo[289302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:48:28 np0005532585.localdomain sudo[289302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:28 np0005532585.localdomain sudo[289302]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:29 np0005532585.localdomain sudo[289320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:48:29 np0005532585.localdomain sudo[289320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:29 np0005532585.localdomain sudo[289320]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:29 np0005532585.localdomain sudo[289338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:48:29 np0005532585.localdomain sudo[289338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:29 np0005532585.localdomain sudo[289338]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:29.298 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:29.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:29.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:29.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:29.336 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:29.337 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:29 np0005532585.localdomain sudo[289356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:29 np0005532585.localdomain sudo[289356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:29 np0005532585.localdomain sudo[289356]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:48:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:48:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:48:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:48:34 np0005532585.localdomain podman[289375]: 2025-11-23 09:48:34.047563846 +0000 UTC m=+0.097089758 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:48:34 np0005532585.localdomain podman[289375]: 2025-11-23 09:48:34.061727328 +0000 UTC m=+0.111253220 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:48:34 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:48:34 np0005532585.localdomain podman[289376]: 2025-11-23 09:48:34.111662749 +0000 UTC m=+0.154093069 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:48:34 np0005532585.localdomain podman[289376]: 2025-11-23 09:48:34.150671789 +0000 UTC m=+0.193102109 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:48:34 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:48:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:34.337 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:34.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:34.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:34.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:34.376 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:34.377 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:35 np0005532585.localdomain sudo[289415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:48:35 np0005532585.localdomain sudo[289415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:35 np0005532585.localdomain sudo[289415]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:35 np0005532585.localdomain sudo[289433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:35 np0005532585.localdomain sudo[289433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:35 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 23 09:48:35 np0005532585.localdomain sshd[289468]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 2025-11-23 09:48:36.095662429 +0000 UTC m=+0.086935019 container create db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b.scope.
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 2025-11-23 09:48:36.059618462 +0000 UTC m=+0.050891042 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 2025-11-23 09:48:36.176317761 +0000 UTC m=+0.167590301 container init db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public)
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 2025-11-23 09:48:36.187756239 +0000 UTC m=+0.179028779 container start db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git)
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 2025-11-23 09:48:36.188667428 +0000 UTC m=+0.179940008 container attach db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Nov 23 09:48:36 np0005532585.localdomain eager_goldberg[289510]: 167 167
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: libpod-db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b.scope: Deactivated successfully.
Nov 23 09:48:36 np0005532585.localdomain podman[289495]: 2025-11-23 09:48:36.193296592 +0000 UTC m=+0.184569152 container died db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64)
Nov 23 09:48:36 np0005532585.localdomain podman[289515]: 2025-11-23 09:48:36.291492283 +0000 UTC m=+0.086145055 container remove db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: libpod-conmon-db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b.scope: Deactivated successfully.
Nov 23 09:48:36 np0005532585.localdomain sshd[289468]: Invalid user lawrence from 207.154.194.2 port 33260
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 2025-11-23 09:48:36.427256428 +0000 UTC m=+0.088092475 container create c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc.scope.
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:48:36 np0005532585.localdomain sshd[289468]: Received disconnect from 207.154.194.2 port 33260:11: Bye Bye [preauth]
Nov 23 09:48:36 np0005532585.localdomain sshd[289468]: Disconnected from invalid user lawrence 207.154.194.2 port 33260 [preauth]
Nov 23 09:48:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 2025-11-23 09:48:36.389070724 +0000 UTC m=+0.049906811 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 2025-11-23 09:48:36.494401078 +0000 UTC m=+0.155237125 container init c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7)
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 2025-11-23 09:48:36.507085324 +0000 UTC m=+0.167921371 container start c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12)
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 2025-11-23 09:48:36.507422015 +0000 UTC m=+0.168258082 container attach c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, description=Red Hat Ceph Storage 7, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: libpod-c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc.scope: Deactivated successfully.
Nov 23 09:48:36 np0005532585.localdomain podman[289533]: 2025-11-23 09:48:36.602454477 +0000 UTC m=+0.263290504 container died c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, name=rhceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:48:36 np0005532585.localdomain podman[289574]: 2025-11-23 09:48:36.692489832 +0000 UTC m=+0.077994770 container remove c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: libpod-conmon-c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc.scope: Deactivated successfully.
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:48:36 np0005532585.localdomain systemd-rc-local-generator[289611]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:48:36 np0005532585.localdomain systemd-sysv-generator[289614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:36 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-389a2e189f6cc00e3a32be8e29ed54e28047e97a87c7dc64b9273f56e5f484f1-merged.mount: Deactivated successfully.
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:48:37 np0005532585.localdomain systemd-rc-local-generator[289653]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:48:37 np0005532585.localdomain systemd-sysv-generator[289659]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:48:37 np0005532585.localdomain systemd[1]: Starting Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 09:48:37 np0005532585.localdomain podman[289717]: 
Nov 23 09:48:38 np0005532585.localdomain podman[289717]: 2025-11-23 09:48:38.000836374 +0000 UTC m=+0.087882239 container create 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Nov 23 09:48:38 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:38 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:38 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:38 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff)
Nov 23 09:48:38 np0005532585.localdomain podman[289717]: 2025-11-23 09:48:37.966005034 +0000 UTC m=+0.053050879 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:48:38 np0005532585.localdomain podman[289717]: 2025-11-23 09:48:38.068004025 +0000 UTC m=+0.155049870 container init 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:48:38 np0005532585.localdomain podman[289717]: 2025-11-23 09:48:38.084236391 +0000 UTC m=+0.171282236 container start 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:48:38 np0005532585.localdomain bash[289717]: 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803
Nov 23 09:48:38 np0005532585.localdomain systemd[1]: Started Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pidfile_write: ignore empty --pid-file
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: load: jerasure load: lrc 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: RocksDB version: 7.9.2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Git sha 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: DB SUMMARY
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: DB Session ID:  8ON8PRI8V1RJ4RVNWHFL
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: CURRENT file:  CURRENT
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005532585/store.db dir, Total Num: 0, files: 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005532585/store.db: 000004.log size: 761 ; 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                         Options.error_if_exists: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.create_if_missing: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                                     Options.env: 0x559413d149e0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                                Options.info_log: 0x5594163ccd20
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                              Options.statistics: (nil)
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                               Options.use_fsync: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                              Options.db_log_dir: 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                                 Options.wal_dir: 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                    Options.write_buffer_manager: 0x5594163dd540
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.unordered_write: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                               Options.row_cache: None
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                              Options.wal_filter: None
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.two_write_queues: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.wal_compression: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.atomic_flush: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.max_background_jobs: 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.max_background_compactions: -1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.max_subcompactions: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.max_total_wal_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                          Options.max_open_files: -1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:       Options.compaction_readahead_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Compression algorithms supported:
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kZSTD supported: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kXpressCompression supported: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kBZip2Compression supported: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kLZ4Compression supported: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kZlibCompression supported: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kLZ4HCCompression supported: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         kSnappyCompression supported: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:           Options.merge_operator: 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:        Options.compaction_filter: None
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5594163cc980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x5594163c9350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:        Options.write_buffer_size: 33554432
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:  Options.max_write_buffer_number: 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.compression: NoCompression
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.num_levels: 7
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.table_properties_collectors: 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                           Options.bloom_locality: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                               Options.ttl: 2592000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                       Options.enable_blob_files: false
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                           Options.min_blob_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f90877de-8e0c-4aa9-bd89-60d6d2f6e09f
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891318139479, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891318142069, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891318142203, "job": 1, "event": "recovery_finished"}
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5594163f0e00
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: DB pointer 0x5594164e6000
Nov 23 09:48:38 np0005532585.localdomain sudo[289433]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 does not exist in monmap, will attempt to join an existing cluster
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5594163c9350#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.8e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.95 KB,0.000181794%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 23 09:48:38 np0005532585.localdomain systemd[1]: tmp-crun.npBrnb.mount: Deactivated successfully.
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: using public_addr v2:172.18.0.107:0/0 -> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0]
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: starting mon.np0005532585 rank -1 at public addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] at bind addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005532585 fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(???) e0 preinit fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing) e4 sync_obtain_latest_monmap
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing) e4 sync_obtain_latest_monmap obtained monmap e4
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).mds e16 new map
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        14
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-11-23T08:00:26.486221+0000
                                                           modified        2025-11-23T09:47:19.846415+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        79
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26392}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26392 members: 26392
                                                           [mds.mds.np0005532586.mfohsb{0:26392} state up:active seq 12 addr [v2:172.18.0.108:6808/2718449296,v1:172.18.0.108:6809/2718449296] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005532585.jcltnl{-1:17133} state up:standby seq 1 addr [v2:172.18.0.107:6808/563301557,v1:172.18.0.107:6809/563301557] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005532584.aoxjmw{-1:17139} state up:standby seq 1 addr [v2:172.18.0.106:6808/2261302276,v1:172.18.0.106:6809/2261302276] compat {c=[1],r=[1],i=[17ff]}]
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 3314933000852226048, adjusting msgr requires
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3914: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005532583.nwcrcp"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005532583.nwcrcp"}]': finished
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Removing key for mds.mds.np0005532583.nwcrcp
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3915: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3916: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3917: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3918: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3919: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3920: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3921: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3922: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3923: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3924: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3925: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3926: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/226999052' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/226999052' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3927: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/2785489473' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/4238037226' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3928: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/902812654' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/3634360147' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/2632593886' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/629446965' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3929: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3930: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17271 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532584.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mgr to host np0005532584.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3931: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17277 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532585.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mgr to host np0005532585.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17283 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532586.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mgr to host np0005532586.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3932: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17289 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Saving service mgr spec with placement label:mgr
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3933: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17295 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17307 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532581.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mon to host np0005532581.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3934: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17313 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532581.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label _admin to host np0005532581.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3935: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17325 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532582.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mon to host np0005532582.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Standby manager daemon np0005532584.naxwxy started
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17331 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532582.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label _admin to host np0005532582.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3936: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mgrmap e12: np0005532581.sxlgsx(active, since 2h), standbys: np0005532583.orhywt, np0005532582.gilwrz, np0005532584.naxwxy
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17340 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532583.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mon to host np0005532583.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3937: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Standby manager daemon np0005532585.gzafiw started
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17346 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532583.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label _admin to host np0005532583.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mgrmap e13: np0005532581.sxlgsx(active, since 2h), standbys: np0005532583.orhywt, np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17352 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532584.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mon to host np0005532584.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3938: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Standby manager daemon np0005532586.thmvqb started
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17358 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532584.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label _admin to host np0005532584.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mgrmap e14: np0005532581.sxlgsx(active, since 2h), standbys: np0005532583.orhywt, np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3939: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17364 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532585.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mon to host np0005532585.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17370 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532585.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label _admin to host np0005532585.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3940: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17376 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532586.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label mon to host np0005532586.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3941: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17382 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005532586.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Added label _admin to host np0005532586.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17388 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Saving service mon spec with placement label:mon
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3942: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='client.17394 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532584", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3943: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: pgmap v3944: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Nov 23 09:48:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:39.377 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:39.379 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:39.380 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:39.380 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:39.415 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:39.416 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:48:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:48:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:48:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:48:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:48:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18670 "" "Go-http-client/1.1"
Nov 23 09:48:42 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 23 09:48:43 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(probing) e4  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 23 09:48:43 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(probing) e4  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 23 09:48:44 np0005532585.localdomain sshd[289774]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:48:44 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@-1(probing) e5  my rank is now 4 (was -1)
Nov 23 09:48:44 np0005532585.localdomain ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:48:44 np0005532585.localdomain ceph-mon[289735]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 23 09:48:44 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:44.417 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:44.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:44.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:44.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:44.458 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:44.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:44 np0005532585.localdomain sshd[289774]: Invalid user fastuser from 107.172.15.139 port 44362
Nov 23 09:48:44 np0005532585.localdomain sshd[289774]: Received disconnect from 107.172.15.139 port 44362:11: Bye Bye [preauth]
Nov 23 09:48:44 np0005532585.localdomain sshd[289774]: Disconnected from invalid user fastuser 107.172.15.139 port 44362 [preauth]
Nov 23 09:48:45 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532581"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532581 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: pgmap v3945: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: pgmap v3946: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532581 is new leader, mons np0005532581,np0005532583,np0005532582,np0005532586 in quorum (ranks 0,1,2,3)
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: monmap epoch 4
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:48:35.509809+0000
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532581
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532582
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: osdmap e81: 6 total, 6 up, 6 in
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mgrmap e14: np0005532581.sxlgsx(active, since 2h), standbys: np0005532583.orhywt, np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: pgmap v3947: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mgrc update_daemon_metadata mon.np0005532585 metadata {addrs=[v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005532585.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005532585.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: pgmap v3948: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532581"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532581 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: pgmap v3949: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: pgmap v3950: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532581 is new leader, mons np0005532581,np0005532583,np0005532582,np0005532586,np0005532585 in quorum (ranks 0,1,2,3,4)
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: monmap epoch 5
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:48:42.355433+0000
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532581
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532582
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005532585
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: osdmap e81: 6 total, 6 up, 6 in
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mgrmap e14: np0005532581.sxlgsx(active, since 2h), standbys: np0005532583.orhywt, np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 23 09:48:47 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: paxos.4).electionLogic(22) init, last seen epoch 22
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:47 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:48:48 np0005532585.localdomain systemd[1]: tmp-crun.oXlu8e.mount: Deactivated successfully.
Nov 23 09:48:48 np0005532585.localdomain podman[289776]: 2025-11-23 09:48:48.045229921 +0000 UTC m=+0.104359544 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:48:48 np0005532585.localdomain podman[289776]: 2025-11-23 09:48:48.084372755 +0000 UTC m=+0.143502328 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:48:48 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:48:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:49.460 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:49.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:49.463 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:49.463 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:49.495 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:49.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:48:52 np0005532585.localdomain podman[289795]: 2025-11-23 09:48:52.032826933 +0000 UTC m=+0.088008694 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:48:52 np0005532585.localdomain podman[289795]: 2025-11-23 09:48:52.039447139 +0000 UTC m=+0.094649701 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:48:52 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532581"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532581 calling monitor election
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: pgmap v3951: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532584 calling monitor election
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: pgmap v3952: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mon.np0005532581 is new leader, mons np0005532581,np0005532583,np0005532582,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3,4,5)
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: monmap epoch 6
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:48:47.512664+0000
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532581
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532582
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005532585
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005532584
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: osdmap e81: 6 total, 6 up, 6 in
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: mgrmap e14: np0005532581.sxlgsx(active, since 2h), standbys: np0005532583.orhywt, np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:48:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:53 np0005532585.localdomain sudo[289819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:53 np0005532585.localdomain sudo[289819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:53 np0005532585.localdomain sudo[289819]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:53 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e6 handle_auth_request failed to assign global_id
Nov 23 09:48:53 np0005532585.localdomain sudo[289837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:48:53 np0005532585.localdomain sudo[289837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:53 np0005532585.localdomain sudo[289837]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:53 np0005532585.localdomain sudo[289855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:48:53 np0005532585.localdomain sudo[289855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:53 np0005532585.localdomain ceph-mon[289735]: pgmap v3953: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:53 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:48:54 np0005532585.localdomain podman[289943]: 2025-11-23 09:48:54.163023743 +0000 UTC m=+0.101743912 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Nov 23 09:48:54 np0005532585.localdomain podman[289943]: 2025-11-23 09:48:54.2876276 +0000 UTC m=+0.226347759 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:48:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:54.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:54.499 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:54.500 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:54.500 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:54.539 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:54.539 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:54 np0005532585.localdomain ceph-mon[289735]: from='client.26497 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532584", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:48:54 np0005532585.localdomain sudo[289855]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:54 np0005532585.localdomain sudo[290065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:48:55 np0005532585.localdomain sudo[290065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:55 np0005532585.localdomain sudo[290065]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:55 np0005532585.localdomain sudo[290083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:48:55 np0005532585.localdomain sudo[290083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:55 np0005532585.localdomain sudo[290083]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: pgmap v3954: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:48:55 np0005532585.localdomain sudo[290132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:48:55 np0005532585.localdomain sudo[290132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:55 np0005532585.localdomain sudo[290132]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:55 np0005532585.localdomain sudo[290150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:48:55 np0005532585.localdomain sudo[290150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:55 np0005532585.localdomain sudo[290150]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:56 np0005532585.localdomain sudo[290168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290168]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:56 np0005532585.localdomain sudo[290186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290186]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:56 np0005532585.localdomain sudo[290204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290204]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:56 np0005532585.localdomain sudo[290238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290238]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:48:56 np0005532585.localdomain sudo[290256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290256]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain sudo[290274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290274]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:48:56 np0005532585.localdomain sudo[290292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290292]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:48:56 np0005532585.localdomain sudo[290310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290310]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:56 np0005532585.localdomain sudo[290328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290328]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain sudo[290346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:48:56 np0005532585.localdomain sudo[290346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290346]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532581.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:48:56 np0005532585.localdomain sudo[290364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:56 np0005532585.localdomain sudo[290364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:56 np0005532585.localdomain sudo[290364]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:57 np0005532585.localdomain sudo[290398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:57 np0005532585.localdomain sudo[290398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:57 np0005532585.localdomain sudo[290398]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:57 np0005532585.localdomain sudo[290416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:48:57 np0005532585.localdomain sudo[290416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:57 np0005532585.localdomain sudo[290416]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:57 np0005532585.localdomain sudo[290434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain sudo[290434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:57 np0005532585.localdomain sudo[290434]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:57 np0005532585.localdomain sudo[290452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:48:57 np0005532585.localdomain sudo[290452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:48:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:48:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:48:57 np0005532585.localdomain sudo[290452]: pam_unix(sudo:session): session closed for user root
Nov 23 09:48:57 np0005532585.localdomain podman[290470]: 2025-11-23 09:48:57.878764957 +0000 UTC m=+0.098351146 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: pgmap v3955: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:48:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:57 np0005532585.localdomain podman[290471]: 2025-11-23 09:48:57.965071586 +0000 UTC m=+0.183195959 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:48:57 np0005532585.localdomain podman[290470]: 2025-11-23 09:48:57.972501559 +0000 UTC m=+0.192087708 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 09:48:57 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:48:58 np0005532585.localdomain podman[290471]: 2025-11-23 09:48:58.009564067 +0000 UTC m=+0.227688440 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 23 09:48:58 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532581 (monmap changed)...
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532581 on np0005532581.localdomain
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532581.sxlgsx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:48:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:48:59 np0005532585.localdomain podman[290518]: 2025-11-23 09:48:59.038245924 +0000 UTC m=+0.091579984 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:48:59 np0005532585.localdomain podman[290518]: 2025-11-23 09:48:59.043412046 +0000 UTC m=+0.096745836 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 23 09:48:59 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:48:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:59.540 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:48:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:59.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:59.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:48:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:59.543 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:59.543 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:48:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:48:59.546 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: pgmap v3956: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532581.sxlgsx (monmap changed)...
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532581.sxlgsx on np0005532581.localdomain
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532581.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:48:59 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:48:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:48:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532581 (monmap changed)...
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532581 on np0005532581.localdomain
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='client.17418 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532584", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/3125926817' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/3125926817' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.235 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.754 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.827 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:49:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:01.828 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: pgmap v3957: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)...
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:01 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/2195341595' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.022 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.023 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11720MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.023 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.023 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.125 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.125 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.126 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.145 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.172 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.173 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.193 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.233 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.289 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.765 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.771 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.787 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.790 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:49:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:02.791 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='client.17436 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532585", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532582 (monmap changed)...
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532582 on np0005532582.localdomain
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/4262355343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/3244034298' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:49:02 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/921950662' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:49:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:03.792 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:03.792 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:03.793 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: pgmap v3958: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)...
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='client.17466 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532586", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/593143704' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/1331658662' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.582 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:04.583 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e6 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 23 09:49:04 np0005532585.localdomain ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.103:0/1798289153' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:49:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:49:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:49:05 np0005532585.localdomain systemd[1]: tmp-crun.15uqAf.mount: Deactivated successfully.
Nov 23 09:49:05 np0005532585.localdomain podman[290581]: 2025-11-23 09:49:05.029755901 +0000 UTC m=+0.087265969 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: pgmap v3959: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)...
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.103:0/1798289153' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:49:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:49:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:49:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:49:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:49:05 np0005532585.localdomain podman[290581]: 2025-11-23 09:49:05.045433092 +0000 UTC m=+0.102943170 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:49:05 np0005532585.localdomain podman[290582]: 2025-11-23 09:49:05.080617852 +0000 UTC m=+0.133261718 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:49:05 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:49:05 np0005532585.localdomain podman[290582]: 2025-11-23 09:49:05.116181554 +0000 UTC m=+0.168825510 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:49:05 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:06 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:49:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:06.136 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:49:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:06.154 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:49:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:06.155 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:49:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:06.156 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:06.156 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:06.156 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.103:0/256735359' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: pgmap v3960: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' 
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:49:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:07.153 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:07.154 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:07.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: log_channel(audit) log [INF] : from='client.? 172.18.0.103:0/443540260' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 23 09:49:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 e82: 6 total, 6 up, 6 in
Nov 23 09:49:07 np0005532585.localdomain sshd[26382]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26327]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26251]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26346]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26308]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-26.scope: Consumed 3min 24.930s CPU time.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 26 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 19 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 23 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 24 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 22 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain sshd[26363]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26289]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26174]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26213]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26191]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26270]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain sshd[26232]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 19.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 17 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 25 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 14 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 20 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 18 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 21 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Session 16 logged out. Waiting for processes to exit.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 26.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 23.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 22.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 24.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 21.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 16.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 14.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 25.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 17.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 18.
Nov 23 09:49:07 np0005532585.localdomain systemd-logind[761]: Removed session 20.
Nov 23 09:49:08 np0005532585.localdomain sshd[290624]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: Activating manager daemon np0005532583.orhywt
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.103:0/443540260' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532581"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mds metadata", "who": "mds.np0005532585.jcltnl"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mds metadata", "who": "mds.np0005532584.aoxjmw"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mds metadata", "who": "mds.np0005532586.mfohsb"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: osdmap e82: 6 total, 6 up, 6 in
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: mgrmap e15: np0005532583.orhywt(active, starting, since 0.0701971s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr metadata", "who": "np0005532583.orhywt", "id": "np0005532583.orhywt"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr metadata", "who": "np0005532582.gilwrz", "id": "np0005532582.gilwrz"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mds metadata"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: Manager daemon np0005532583.orhywt is now available
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/mirror_snapshot_schedule"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/mirror_snapshot_schedule"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/trash_purge_schedule"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/trash_purge_schedule"} : dispatch
Nov 23 09:49:08 np0005532585.localdomain sshd[290624]: Accepted publickey for ceph-admin from 192.168.122.105 port 39354 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 09:49:08 np0005532585.localdomain systemd-logind[761]: New session 64 of user ceph-admin.
Nov 23 09:49:08 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1019743782 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:08 np0005532585.localdomain systemd[1]: Started Session 64 of User ceph-admin.
Nov 23 09:49:08 np0005532585.localdomain sshd[290624]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 09:49:08 np0005532585.localdomain sudo[290628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:08 np0005532585.localdomain sudo[290628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:08 np0005532585.localdomain sudo[290628]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:08 np0005532585.localdomain sudo[290646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:49:08 np0005532585.localdomain sudo[290646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:09 np0005532585.localdomain podman[290737]: 2025-11-23 09:49:09.282559027 +0000 UTC m=+0.105217091 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, architecture=x86_64, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Nov 23 09:49:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:49:09.285 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:49:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:49:09.286 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:49:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:49:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:49:09 np0005532585.localdomain podman[290737]: 2025-11-23 09:49:09.364415846 +0000 UTC m=+0.187073890 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Nov 23 09:49:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:09.583 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:09.585 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:09.586 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:09.586 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:09.641 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:09.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:09 np0005532585.localdomain ceph-mon[289735]: mgrmap e16: np0005532583.orhywt(active, since 1.07186s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:49:09 np0005532585.localdomain ceph-mon[289735]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:09 np0005532585.localdomain sudo[290646]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:10 np0005532585.localdomain sudo[290855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:10 np0005532585.localdomain sudo[290855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:10 np0005532585.localdomain sudo[290855]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:10 np0005532585.localdomain sudo[290873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:49:10 np0005532585.localdomain sudo[290873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.806 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain sudo[290873]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Bus STARTING
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Serving on http://172.18.0.105:8765
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Serving on https://172.18.0.105:7150
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Bus STARTED
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Client ('172.18.0.105', 46326) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: mgrmap e17: np0005532583.orhywt(active, since 2s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.841 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7077b846-9a10-4bd6-a7b8-739ece95bb37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.807807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a965e2a2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '943d2b85e492b579799ca682ba37ba97005542272c2424b5b5d9de4fb7687340'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.807807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a965f85a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '73c1aca9b1835254b14a46633df8961b966de7a3246e66e91e472b2f203ca3ad'}]}, 'timestamp': '2025-11-23 09:49:10.842598', '_unique_id': '94733f6fd87d41508da374bc7b000004'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec27a55c-a2c9-44fb-a2a8-f4d991e454ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.845736', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a9672a54-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'c885fd54b708dd5d63fd14e05c5d08893d07e16d76d308d43aa90fab318e094e'}]}, 'timestamp': '2025-11-23 09:49:10.850479', '_unique_id': 'cd99aed6501c44bcbbeed944dff12581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.853 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa61345-223a-4bd8-970b-0d13de27e4af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:49:10.853350', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a9697cdc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.04258916, 'message_signature': '86622d310ad3a44dffdc2700c063d254a4f8109b8398d2c663c33d0e488bb05a'}]}, 'timestamp': '2025-11-23 09:49:10.865709', '_unique_id': 'b77d4a60de7b440c96c7f336edbbf159'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fa408f9-1084-4620-ade1-aabeb2c2e6fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.868469', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a969fcfc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'ce936cc1ad7e7a9a4efe34e6e96b94862440746c2fa0a392fcc5a1992ad6884f'}]}, 'timestamp': '2025-11-23 09:49:10.868993', '_unique_id': 'be3265b57fe54851a7826da31adb1608'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84a8a26a-9b78-429d-9d78-9c341b08844c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.871446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96be6b6-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '6744464af0e5959125e9544a4c60682bbc09c0ea9000a962c9fec49cf02631fe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.871446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96c0196-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '762142512675724d4de1fc2ecfbc3ff30dcb3b41beeb06f8d3e1ae611658a1d5'}]}, 'timestamp': '2025-11-23 09:49:10.882170', '_unique_id': '052614046e0b4cf09959915894e8abaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1e146ba-f155-4f44-90e6-d61b3dad4384', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.885436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96c96ec-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '640a2f9a70b8ba11ee5158b39d6f149e2e3f4a7ccd7d7b0c0ac9514ab1982d3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.885436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96caf2e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '39dc2fcd56befd33f7ddfe85ed868f08fa60dd8b5c2422fd4a23eb32a483d334'}]}, 'timestamp': '2025-11-23 09:49:10.886675', '_unique_id': '5652d85864d8444a9d1bf8a06f488804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.890 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5c6f3b2-280b-418c-90c4-ea227bcdcf11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.890342', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a96d56a4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'e88353e3a4d68516d01b389b301b673ab49ed0dae8ff9ce8f61cd921e2c3549c'}]}, 'timestamp': '2025-11-23 09:49:10.891011', '_unique_id': 'e01bd84fbd3143d08df5c0d15b0d7643'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '036ae7b8-66c3-431f-879c-d8606310c590', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.893749', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96dd980-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'fcbfdbba2e27ad20fa86f0cfe05793b262fc5fb356c396c8799baea1d43314a9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.893749', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96de9de-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '29a5b7b1110b625adabcac4a41cc654feac692a262d8624d303f4024b94dcd8b'}]}, 'timestamp': '2025-11-23 09:49:10.894644', '_unique_id': 'cd033693def74e1bae493273778eedeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4edd0b66-e900-43ed-9ae0-4df0a6009f44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.897025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96e5766-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '90a94507b1c79ebca7a15c8fe3c406876f13a8bd6f1e570aed3f929e26ba1c81'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.897025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96e6832-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': 'a0286463f6818a1a1525c46477dbed14cf4947529a17bcb86773cba77bebcdfa'}]}, 'timestamp': '2025-11-23 09:49:10.897870', '_unique_id': '45f0f5e0492344c88e1a799fba68d8d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 12890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '597c1d3b-9182-41de-ad34-e7b92c705141', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12890000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:49:10.900048', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a96ecd90-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.04258916, 'message_signature': '582aa938219eed9827e8ad2b5cf155d4c9618d195fab01afefaf3086919539e3'}]}, 'timestamp': '2025-11-23 09:49:10.900481', '_unique_id': '7e7c7bdade164b3fbe9269d743988532'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c951ad9d-be67-4163-8e59-4261cc7ec705', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.902733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96f3852-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '3bde47314ab4f47a251f575c31b4a1d9cd6c117df55f46371b82b061bf247d4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.902733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96f4aa4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'b4e23aaee5a7c1200c6613a498b6d9c67c0863f5182439568ae1cec54a0b1b6d'}]}, 'timestamp': '2025-11-23 09:49:10.903687', '_unique_id': '2668aba8c7c44938bd2932224f79fdeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e926dc0-90ea-41ec-934f-4f32f18cc9c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.906177', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a96fbffc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': '93a99b8676d1895ddadcd24894e78e3ce2b66b6df82a7f839b734ae31b32c256'}]}, 'timestamp': '2025-11-23 09:49:10.906993', '_unique_id': '848390cd3e1c411584437c3a80ef16e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07dc6da0-3017-4574-a67a-7cdd4cee7962', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.909985', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a970532c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'bf7d1bc5410df5f75d7aefc0f84f82a88d614b3dadc35535d9be9e3a50833828'}]}, 'timestamp': '2025-11-23 09:49:10.910490', '_unique_id': '369fa97b08e2484dbc35a597f6b43568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d7362bf-583e-4cf2-92c2-b01035515761', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.912803', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a970c19a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'd3ce5bf718bed2d7a843978aaea4a5577443b6d4be27c22d88d81d37c8c332ee'}]}, 'timestamp': '2025-11-23 09:49:10.913305', '_unique_id': 'b0afe3211d3448f2a42bba2568b0b9ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd7da782-0998-4655-a42f-c7e817459fd3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.915126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9711848-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '44bfecdf48dbd17be2dda1695081272b6ed2393de4f4469d4c0a63fb43a8066e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.915126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a97122e8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'f3d1b0f1d21cd12ae4a02a5bfb9831d7ffaa1327dfa68d9a1e4fc652430f5cd3'}]}, 'timestamp': '2025-11-23 09:49:10.915762', '_unique_id': '289b96b915c742afaebc09f9798c5f6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '548669e5-74d8-44f3-a488-6d4d4578e903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.917225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9716a00-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'cdb8d239cc02b221848d172fb7521a5576f1da005d877fe3c2783015b62d9c6f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.917225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a97175b8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '17ce090e2c63ce5d97ee1dbb7ea021b7fff06a652a363444747234e91460b4a2'}]}, 'timestamp': '2025-11-23 09:49:10.917811', '_unique_id': '389bf9cb8f9543c29191e316237841c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f164214f-3829-49dd-a523-cea231417284', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.919284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a971ba3c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '700ddbc9c24ef25b36841d39e42e5c622f0babfaac9533296196b547c812edf3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.919284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a971c4c8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': 'a4b4c92c98c55d18daf739fe85e226ffbc71f994b8aa07a6bee39a81e4fdcdff'}]}, 'timestamp': '2025-11-23 09:49:10.919833', '_unique_id': '341f28fc21b74b9abf55b6410aa7a9e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01c96ddc-9cee-4e83-9679-ba9a3c1614ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.921476', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a9721072-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'c6cab1972da46204453aa8bf869ae6ad457635f6b3194f92d80ae693c58067dc'}]}, 'timestamp': '2025-11-23 09:49:10.921796', '_unique_id': 'be060c2f5ecb4d78958882d23b2941bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'beae3ec8-7bad-4249-97c5-69d223645acd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.923460', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a9725dc0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'a3ac68f29405c464564f4a45f946d0ed0e266c2cae644622e34ccc464c9eec03'}]}, 'timestamp': '2025-11-23 09:49:10.923782', '_unique_id': '2b54399165754d17bc2aab614b4cf812'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0a0514-54fd-41a5-ba36-aab44cff159b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.925251', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a972a3c0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'd47d662b7f7f29343ba043cbe5470bb6f3b14ed2e1f2f1717dc6ac71e93b96ad'}]}, 'timestamp': '2025-11-23 09:49:10.925616', '_unique_id': '4e101cb44cd54cdeb646aaaf29f1934c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c6ee195-eb72-4d6f-bc4f-0a671149939d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.927047', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a972e9ca-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': '6cdd137a14507bd17c4d2085331dbf7455f9d6ba6db2ce71e4f7cc448e822a9e'}]}, 'timestamp': '2025-11-23 09:49:10.927359', '_unique_id': '927f64cde45c455a986b72f3c1eadea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:49:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:49:10 np0005532585.localdomain sudo[290923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:10 np0005532585.localdomain sudo[290923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:10 np0005532585.localdomain sudo[290923]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:11 np0005532585.localdomain sudo[290941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:49:11 np0005532585.localdomain sudo[290941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:11 np0005532585.localdomain sudo[290941]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:11 np0005532585.localdomain sudo[290977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:49:11 np0005532585.localdomain sudo[290977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:11 np0005532585.localdomain sudo[290977]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:11 np0005532585.localdomain sudo[290995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:49:11 np0005532585.localdomain sudo[290995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:11 np0005532585.localdomain sudo[290995]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:11 np0005532585.localdomain sudo[291013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:11 np0005532585.localdomain sudo[291013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:11 np0005532585.localdomain sudo[291013]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:49:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:49:11 np0005532585.localdomain sudo[291031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:11 np0005532585.localdomain sudo[291031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:11 np0005532585.localdomain sudo[291031]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:49:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:49:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:49:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18687 "" "Go-http-client/1.1"
Nov 23 09:49:11 np0005532585.localdomain sudo[291049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:11 np0005532585.localdomain sudo[291049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:11 np0005532585.localdomain sudo[291049]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532581", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532581", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532581.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain ceph-mon[289735]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:12 np0005532585.localdomain sudo[291083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:12 np0005532585.localdomain sudo[291083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291083]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:12 np0005532585.localdomain sudo[291101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291101]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain sudo[291119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291119]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:49:12 np0005532585.localdomain sudo[291137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291137]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:49:12 np0005532585.localdomain sudo[291155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291155]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:12 np0005532585.localdomain sudo[291173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291173]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:12 np0005532585.localdomain sudo[291191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291191]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:12 np0005532585.localdomain sudo[291209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291209]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:12 np0005532585.localdomain sudo[291243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291243]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:12 np0005532585.localdomain sudo[291261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291261]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:12 np0005532585.localdomain sudo[291279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291279]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:12 np0005532585.localdomain sudo[291297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:49:12 np0005532585.localdomain sudo[291297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:12 np0005532585.localdomain sudo[291297]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:49:13 np0005532585.localdomain sudo[291315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291315]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:49:13 np0005532585.localdomain sudo[291333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291333]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1020048401 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:13 np0005532585.localdomain sudo[291351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:13 np0005532585.localdomain sudo[291351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291351]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:49:13 np0005532585.localdomain sudo[291369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291369]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:49:13 np0005532585.localdomain sudo[291403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291403]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: mgrmap e18: np0005532583.orhywt(active, since 4s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532581.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:13 np0005532585.localdomain ceph-mon[289735]: Standby manager daemon np0005532581.sxlgsx started
Nov 23 09:49:13 np0005532585.localdomain sudo[291421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:49:13 np0005532585.localdomain sudo[291421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291421]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:13 np0005532585.localdomain sudo[291439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291439]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:49:13 np0005532585.localdomain sudo[291457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291457]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:49:13 np0005532585.localdomain sudo[291475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291475]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:49:13 np0005532585.localdomain sudo[291493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291493]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:13 np0005532585.localdomain sudo[291511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291511]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:13 np0005532585.localdomain sudo[291529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:49:13 np0005532585.localdomain sudo[291529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:13 np0005532585.localdomain sudo[291529]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:14 np0005532585.localdomain sudo[291563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:49:14 np0005532585.localdomain sudo[291563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:14 np0005532585.localdomain sudo[291563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:14 np0005532585.localdomain sudo[291581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:49:14 np0005532585.localdomain sudo[291581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:14 np0005532585.localdomain sudo[291581]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:14 np0005532585.localdomain sudo[291599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain sudo[291599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:14 np0005532585.localdomain sudo[291599]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: mgrmap e19: np0005532583.orhywt(active, since 5s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr metadata", "who": "np0005532581.sxlgsx", "id": "np0005532581.sxlgsx"} : dispatch
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:49:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:14.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:14 np0005532585.localdomain sudo[291617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:49:14 np0005532585.localdomain sudo[291617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:14 np0005532585.localdomain sudo[291617]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:15 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:49:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:49:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:15 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:49:16 np0005532585.localdomain ceph-mon[289735]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 23 09:49:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:16 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:49:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:49:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:17 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054597 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:19 np0005532585.localdomain podman[291635]: 2025-11-23 09:49:19.03354329 +0000 UTC m=+0.084723790 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:49:19 np0005532585.localdomain podman[291635]: 2025-11-23 09:49:19.072443896 +0000 UTC m=+0.123624386 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 23 09:49:19 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:49:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:19.644 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:19 np0005532585.localdomain sudo[291655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:19 np0005532585.localdomain sudo[291655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:19 np0005532585.localdomain sudo[291655]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:20 np0005532585.localdomain sudo[291673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:20 np0005532585.localdomain sudo[291673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 2025-11-23 09:49:20.461160711 +0000 UTC m=+0.079803646 container create 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:49:20 np0005532585.localdomain systemd[1]: Started libpod-conmon-2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9.scope.
Nov 23 09:49:20 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 2025-11-23 09:49:20.429039017 +0000 UTC m=+0.047682012 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 2025-11-23 09:49:20.538873781 +0000 UTC m=+0.157516716 container init 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 2025-11-23 09:49:20.550002949 +0000 UTC m=+0.168645884 container start 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 2025-11-23 09:49:20.550479244 +0000 UTC m=+0.169122229 container attach 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12)
Nov 23 09:49:20 np0005532585.localdomain loving_feynman[291721]: 167 167
Nov 23 09:49:20 np0005532585.localdomain systemd[1]: libpod-2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9.scope: Deactivated successfully.
Nov 23 09:49:20 np0005532585.localdomain podman[291707]: 2025-11-23 09:49:20.554299614 +0000 UTC m=+0.172942569 container died 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Nov 23 09:49:20 np0005532585.localdomain podman[291726]: 2025-11-23 09:49:20.655166998 +0000 UTC m=+0.088737446 container remove 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True, name=rhceph, GIT_CLEAN=True, release=553, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container)
Nov 23 09:49:20 np0005532585.localdomain systemd[1]: libpod-conmon-2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9.scope: Deactivated successfully.
Nov 23 09:49:20 np0005532585.localdomain sudo[291673]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='client.17544 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:49:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:20 np0005532585.localdomain sudo[291743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:20 np0005532585.localdomain sudo[291743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:20 np0005532585.localdomain sudo[291743]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:20 np0005532585.localdomain sudo[291761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:20 np0005532585.localdomain sudo[291761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 2025-11-23 09:49:21.387878249 +0000 UTC m=+0.081394726 container create cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:49:21 np0005532585.localdomain systemd[1]: Started libpod-conmon-cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea.scope.
Nov 23 09:49:21 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 2025-11-23 09:49:21.451234331 +0000 UTC m=+0.144750798 container init cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, version=7, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 2025-11-23 09:49:21.352168263 +0000 UTC m=+0.045684790 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 2025-11-23 09:49:21.459139238 +0000 UTC m=+0.152655685 container start cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55)
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 2025-11-23 09:49:21.459471738 +0000 UTC m=+0.152988205 container attach cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=553, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:49:21 np0005532585.localdomain angry_taussig[291812]: 167 167
Nov 23 09:49:21 np0005532585.localdomain systemd[1]: libpod-cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea.scope: Deactivated successfully.
Nov 23 09:49:21 np0005532585.localdomain podman[291797]: 2025-11-23 09:49:21.464351861 +0000 UTC m=+0.157868358 container died cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, RELEASE=main, io.openshift.expose-services=, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:49:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a0998737b2ad229fca58596bf13e0c669a0bf094ca288df092c5077416bf7dc9-merged.mount: Deactivated successfully.
Nov 23 09:49:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1fc7c22e46194956e486f3caf689085d24fc60b8b4ff37693a607a99b020a1bb-merged.mount: Deactivated successfully.
Nov 23 09:49:21 np0005532585.localdomain podman[291817]: 2025-11-23 09:49:21.568978972 +0000 UTC m=+0.091320936 container remove cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, vcs-type=git, ceph=True, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55)
Nov 23 09:49:21 np0005532585.localdomain systemd[1]: libpod-conmon-cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea.scope: Deactivated successfully.
Nov 23 09:49:21 np0005532585.localdomain sudo[291761]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:21 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:49:21 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:49:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:49:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:21 np0005532585.localdomain sudo[291841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:21 np0005532585.localdomain sudo[291841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:21 np0005532585.localdomain sudo[291841]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:21 np0005532585.localdomain sudo[291859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:21 np0005532585.localdomain sudo[291859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:49:22 np0005532585.localdomain podman[291892]: 2025-11-23 09:49:22.381456029 +0000 UTC m=+0.091220454 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:49:22 np0005532585.localdomain podman[291892]: 2025-11-23 09:49:22.397292814 +0000 UTC m=+0.107057259 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:49:22 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 2025-11-23 09:49:22.445802481 +0000 UTC m=+0.135246090 container create 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:49:22 np0005532585.localdomain systemd[1]: Started libpod-conmon-921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350.scope.
Nov 23 09:49:22 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 2025-11-23 09:49:22.414035898 +0000 UTC m=+0.103479547 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 2025-11-23 09:49:22.5222244 +0000 UTC m=+0.211668009 container init 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main)
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 2025-11-23 09:49:22.534007138 +0000 UTC m=+0.223450747 container start 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:49:22 np0005532585.localdomain tender_curran[291933]: 167 167
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 2025-11-23 09:49:22.537583921 +0000 UTC m=+0.227027570 container attach 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:49:22 np0005532585.localdomain systemd[1]: libpod-921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350.scope: Deactivated successfully.
Nov 23 09:49:22 np0005532585.localdomain podman[291900]: 2025-11-23 09:49:22.540824672 +0000 UTC m=+0.230268331 container died 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:49:22 np0005532585.localdomain podman[291938]: 2025-11-23 09:49:22.647390784 +0000 UTC m=+0.095667962 container remove 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:49:22 np0005532585.localdomain systemd[1]: libpod-conmon-921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350.scope: Deactivated successfully.
Nov 23 09:49:22 np0005532585.localdomain sudo[291859]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:22 np0005532585.localdomain ceph-mon[289735]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:49:22 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:49:22 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:49:22 np0005532585.localdomain sudo[291962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:22 np0005532585.localdomain sudo[291962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:22 np0005532585.localdomain sudo[291962]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:23 np0005532585.localdomain sudo[291980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:23 np0005532585.localdomain sudo[291980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efe11e68d075679b9233858ce9f9507c795f6ea1656950c6827243b2718d7f22-merged.mount: Deactivated successfully.
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 2025-11-23 09:49:23.518289107 +0000 UTC m=+0.080140737 container create 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, architecture=x86_64, name=rhceph, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, release=553, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:49:23 np0005532585.localdomain systemd[1]: Started libpod-conmon-2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0.scope.
Nov 23 09:49:23 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 2025-11-23 09:49:23.582664091 +0000 UTC m=+0.144515721 container init 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, ceph=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 2025-11-23 09:49:23.486296096 +0000 UTC m=+0.048147776 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 2025-11-23 09:49:23.592619231 +0000 UTC m=+0.154470841 container start 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 2025-11-23 09:49:23.593039375 +0000 UTC m=+0.154891005 container attach 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Nov 23 09:49:23 np0005532585.localdomain vigilant_greider[292029]: 167 167
Nov 23 09:49:23 np0005532585.localdomain systemd[1]: libpod-2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0.scope: Deactivated successfully.
Nov 23 09:49:23 np0005532585.localdomain podman[292014]: 2025-11-23 09:49:23.598694922 +0000 UTC m=+0.160546532 container died 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, ceph=True, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:49:23 np0005532585.localdomain podman[292034]: 2025-11-23 09:49:23.695384545 +0000 UTC m=+0.085948179 container remove 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, name=rhceph, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553)
Nov 23 09:49:23 np0005532585.localdomain systemd[1]: libpod-conmon-2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0.scope: Deactivated successfully.
Nov 23 09:49:23 np0005532585.localdomain sudo[291980]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:23 np0005532585.localdomain sudo[292052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='client.34168 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532581", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:23 np0005532585.localdomain sudo[292052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:23 np0005532585.localdomain sudo[292052]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:23 np0005532585.localdomain sudo[292070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:23 np0005532585.localdomain sudo[292070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 2025-11-23 09:49:24.32873304 +0000 UTC m=+0.047218698 container create b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, RELEASE=main)
Nov 23 09:49:24 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@4(peon) e7  my rank is now 3 (was 4)
Nov 23 09:49:24 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 09:49:24 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 09:49:24 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb159600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: paxos.3).electionLogic(26) init, last seen epoch 26
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: Started libpod-conmon-b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf.scope.
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 2025-11-23 09:49:24.408304089 +0000 UTC m=+0.126789737 container init b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, io.buildah.version=1.33.12, release=553, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7)
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 2025-11-23 09:49:24.3086004 +0000 UTC m=+0.027086019 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 2025-11-23 09:49:24.414328717 +0000 UTC m=+0.132814345 container start b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, ceph=True, release=553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, version=7, RELEASE=main)
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 2025-11-23 09:49:24.414531463 +0000 UTC m=+0.133017121 container attach b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, release=553)
Nov 23 09:49:24 np0005532585.localdomain sharp_driscoll[292120]: 167 167
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: libpod-b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf.scope: Deactivated successfully.
Nov 23 09:49:24 np0005532585.localdomain podman[292105]: 2025-11-23 09:49:24.416507614 +0000 UTC m=+0.134993242 container died b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, release=553, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., RELEASE=main)
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: tmp-crun.MNQdaR.mount: Deactivated successfully.
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4d9a124529c96da9428c378bcfe99ca14134ff53d5ff5bfc51382c1ed6944592-merged.mount: Deactivated successfully.
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-19f0ad4f30e6e6a8d15ebb142551279c5c74c2cad6c5118f7e7ac59710c75095-merged.mount: Deactivated successfully.
Nov 23 09:49:24 np0005532585.localdomain podman[292125]: 2025-11-23 09:49:24.496384352 +0000 UTC m=+0.068318347 container remove b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7)
Nov 23 09:49:24 np0005532585.localdomain systemd[1]: libpod-conmon-b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf.scope: Deactivated successfully.
Nov 23 09:49:24 np0005532585.localdomain sudo[292070]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:49:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:24.649 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:24.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:24.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:24.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='client.26960 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532581"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: Remove daemons mon.np0005532581
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "quorum_status"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: Safe to remove mon.np0005532581: new quorum should be ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585', 'np0005532584'] (from ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585', 'np0005532584'])
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: Removing monitor np0005532581 from monmap...
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon rm", "name": "np0005532581"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: Removing daemon mon.np0005532581 from np0005532581.localdomain -- ports []
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:24.675 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 09:49:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:24.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532584 calling monitor election
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3,4)
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: monmap epoch 7
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:49:24.323483+0000
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532582
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005532585
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005532584
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: osdmap e82: 6 total, 6 up, 6 in
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: mgrmap e19: np0005532583.orhywt(active, since 16s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:24 np0005532585.localdomain sudo[292141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:24 np0005532585.localdomain sudo[292141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:24 np0005532585.localdomain sudo[292141]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:24 np0005532585.localdomain sudo[292159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:24 np0005532585.localdomain sudo[292159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 2025-11-23 09:49:25.246746837 +0000 UTC m=+0.082045147 container create 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git)
Nov 23 09:49:25 np0005532585.localdomain systemd[1]: Started libpod-conmon-99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f.scope.
Nov 23 09:49:25 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 2025-11-23 09:49:25.308672733 +0000 UTC m=+0.143971073 container init 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public)
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 2025-11-23 09:49:25.215116808 +0000 UTC m=+0.050415208 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 2025-11-23 09:49:25.316033613 +0000 UTC m=+0.151331933 container start 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, RELEASE=main, version=7, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 2025-11-23 09:49:25.31623456 +0000 UTC m=+0.151532900 container attach 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:49:25 np0005532585.localdomain magical_tesla[292210]: 167 167
Nov 23 09:49:25 np0005532585.localdomain systemd[1]: libpod-99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f.scope: Deactivated successfully.
Nov 23 09:49:25 np0005532585.localdomain podman[292195]: 2025-11-23 09:49:25.319610605 +0000 UTC m=+0.154909025 container died 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:49:25 np0005532585.localdomain podman[292215]: 2025-11-23 09:49:25.411347064 +0000 UTC m=+0.080945492 container remove 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, version=7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True)
Nov 23 09:49:25 np0005532585.localdomain systemd[1]: libpod-conmon-99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f.scope: Deactivated successfully.
Nov 23 09:49:25 np0005532585.localdomain sudo[292159]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f3a51811fb891e82ae0c5efa0fd91aa30419890bdf8bca7e1fbf4d62bbdedda6-merged.mount: Deactivated successfully.
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:49:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: from='client.26971 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532581.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: Removed label mon from host np0005532581.localdomain
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:49:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:49:29 np0005532585.localdomain podman[292232]: 2025-11-23 09:49:29.036294088 +0000 UTC m=+0.088993834 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 23 09:49:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:49:29 np0005532585.localdomain podman[292233]: 2025-11-23 09:49:29.12560856 +0000 UTC m=+0.178407259 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 23 09:49:29 np0005532585.localdomain podman[292232]: 2025-11-23 09:49:29.13837403 +0000 UTC m=+0.191073776 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:49:29 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:49:29 np0005532585.localdomain podman[292233]: 2025-11-23 09:49:29.169513574 +0000 UTC m=+0.222312223 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 23 09:49:29 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:49:29 np0005532585.localdomain podman[292275]: 2025-11-23 09:49:29.223225863 +0000 UTC m=+0.086912809 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:49:29 np0005532585.localdomain podman[292275]: 2025-11-23 09:49:29.230090188 +0000 UTC m=+0.093777154 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 09:49:29 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:49:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:29.675 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='client.26976 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532581.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: Removed label mgr from host np0005532581.localdomain
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:49:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:31 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532586 (monmap changed)...
Nov 23 09:49:31 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:49:31 np0005532585.localdomain ceph-mon[289735]: from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532581.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:31 np0005532585.localdomain ceph-mon[289735]: Removed label _admin from host np0005532581.localdomain
Nov 23 09:49:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:32 np0005532585.localdomain sudo[292297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:49:32 np0005532585.localdomain sudo[292297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:32 np0005532585.localdomain sudo[292297]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:32 np0005532585.localdomain sudo[292315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:49:32 np0005532585.localdomain sudo[292315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:32 np0005532585.localdomain sudo[292315]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:32 np0005532585.localdomain sudo[292333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:32 np0005532585.localdomain sudo[292333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:32 np0005532585.localdomain sudo[292333]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:32 np0005532585.localdomain sudo[292351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:32 np0005532585.localdomain sudo[292351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:32 np0005532585.localdomain sudo[292351]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:32 np0005532585.localdomain sudo[292369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:32 np0005532585.localdomain sudo[292369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:32 np0005532585.localdomain sudo[292369]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:32 np0005532585.localdomain sudo[292403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:32 np0005532585.localdomain sudo[292403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:32 np0005532585.localdomain sudo[292403]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:49:33 np0005532585.localdomain sudo[292421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292421]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain sudo[292439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292439]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:49:33 np0005532585.localdomain sudo[292457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:33 np0005532585.localdomain sudo[292457]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:49:33 np0005532585.localdomain sudo[292475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292475]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:33 np0005532585.localdomain sudo[292493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292493]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:33 np0005532585.localdomain sudo[292511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292511]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:33 np0005532585.localdomain sudo[292529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292529]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:33 np0005532585.localdomain sudo[292563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:49:33 np0005532585.localdomain sudo[292581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292581]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain sudo[292599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain sudo[292599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:33 np0005532585.localdomain sudo[292599]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:33.937044) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891373937113, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11943, "num_deletes": 513, "total_data_size": 17596087, "memory_usage": 18490960, "flush_reason": "Manual Compaction"}
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Removing np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Removing np0005532581.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: Removing np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891373995263, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12135327, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11948, "table_properties": {"data_size": 12078944, "index_size": 30101, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25221, "raw_key_size": 263943, "raw_average_key_size": 26, "raw_value_size": 11907585, "raw_average_value_size": 1181, "num_data_blocks": 1146, "num_entries": 10082, "num_filter_entries": 10082, "num_deletions": 512, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 1763891318, "file_creation_time": 1763891373, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:49:33 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 58275 microseconds, and 27867 cpu microseconds.
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:33.995320) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12135327 bytes OK
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:33.995341) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.001070) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.001089) EVENT_LOG_v1 {"time_micros": 1763891374001084, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.001104) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 17517697, prev total WAL file size 17542408, number of live WAL files 2.
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.003097) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1887B)]
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374003194, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12137214, "oldest_snapshot_seqno": -1}
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9573 keys, 12127502 bytes, temperature: kUnknown
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374068068, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12127502, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12072447, "index_size": 30058, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 255527, "raw_average_key_size": 26, "raw_value_size": 11907643, "raw_average_value_size": 1243, "num_data_blocks": 1144, "num_entries": 9573, "num_filter_entries": 9573, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891374, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.068326) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12127502 bytes
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.069801) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.8 rd, 186.7 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.6, 0.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10087, records dropped: 514 output_compression: NoCompression
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.069822) EVENT_LOG_v1 {"time_micros": 1763891374069812, "job": 4, "event": "compaction_finished", "compaction_time_micros": 64966, "compaction_time_cpu_micros": 38566, "output_level": 6, "num_output_files": 1, "total_output_size": 12127502, "num_input_records": 10087, "num_output_records": 9573, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374071045, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374071091, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.002991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:34 np0005532585.localdomain sshd[292617]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:49:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:34.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:34.706 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:34.707 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:34 np0005532585.localdomain ceph-mon[289735]: Removing daemon mgr.np0005532581.sxlgsx from np0005532581.localdomain -- ports [9283, 8765]
Nov 23 09:49:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:49:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:49:36 np0005532585.localdomain systemd[1]: tmp-crun.A4uNnE.mount: Deactivated successfully.
Nov 23 09:49:36 np0005532585.localdomain podman[292620]: 2025-11-23 09:49:36.040261574 +0000 UTC m=+0.092641408 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:49:36 np0005532585.localdomain ceph-mon[289735]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:36 np0005532585.localdomain podman[292620]: 2025-11-23 09:49:36.077262541 +0000 UTC m=+0.129642325 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:49:36 np0005532585.localdomain podman[292619]: 2025-11-23 09:49:36.085065815 +0000 UTC m=+0.138384729 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:49:36 np0005532585.localdomain podman[292619]: 2025-11-23 09:49:36.118348266 +0000 UTC m=+0.171667160 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 09:49:36 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:49:36 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:49:36 np0005532585.localdomain sshd[292617]: Invalid user steam from 80.94.95.116 port 57850
Nov 23 09:49:36 np0005532585.localdomain sudo[292661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:49:36 np0005532585.localdomain sudo[292661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:36 np0005532585.localdomain sudo[292661]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:36 np0005532585.localdomain sshd[292617]: Connection closed by invalid user steam 80.94.95.116 port 57850 [preauth]
Nov 23 09:49:37 np0005532585.localdomain ceph-mon[289735]: Removing key for mgr.np0005532581.sxlgsx
Nov 23 09:49:37 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth rm", "entity": "mgr.np0005532581.sxlgsx"} : dispatch
Nov 23 09:49:37 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005532581.sxlgsx"}]': finished
Nov 23 09:49:37 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:37 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:37 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:49:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:38 np0005532585.localdomain sudo[292679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:49:38 np0005532585.localdomain sudo[292679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:38 np0005532585.localdomain sudo[292679]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532581 (monmap changed)...
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532581.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532581 on np0005532581.localdomain
Nov 23 09:49:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:39.707 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:39.709 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:39.710 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:39.710 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:39.751 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:39.752 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.793841) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379793879, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 462, "num_deletes": 256, "total_data_size": 303488, "memory_usage": 313512, "flush_reason": "Manual Compaction"}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379799087, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 194587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11953, "largest_seqno": 12410, "table_properties": {"data_size": 191999, "index_size": 571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7023, "raw_average_key_size": 19, "raw_value_size": 186389, "raw_average_value_size": 514, "num_data_blocks": 26, "num_entries": 362, "num_filter_entries": 362, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891373, "oldest_key_time": 1763891373, "file_creation_time": 1763891379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 5266 microseconds, and 1025 cpu microseconds.
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.799110) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 194587 bytes OK
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.799128) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.800577) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.800591) EVENT_LOG_v1 {"time_micros": 1763891379800586, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.800607) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 300514, prev total WAL file size 300514, number of live WAL files 2.
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.801252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353231' seq:72057594037927935, type:22 .. '6C6F676D0033373734' seq:0, type:0; will stop at (end)
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(190KB)], [15(11MB)]
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379801324, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12322089, "oldest_snapshot_seqno": -1}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9401 keys, 12213836 bytes, temperature: kUnknown
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379886734, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12213836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12159096, "index_size": 30127, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 253102, "raw_average_key_size": 26, "raw_value_size": 11996589, "raw_average_value_size": 1276, "num_data_blocks": 1145, "num_entries": 9401, "num_filter_entries": 9401, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.887103) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12213836 bytes
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.888855) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 144.1 rd, 142.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 11.6 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(126.1) write-amplify(62.8) OK, records in: 9935, records dropped: 534 output_compression: NoCompression
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.888884) EVENT_LOG_v1 {"time_micros": 1763891379888871, "job": 6, "event": "compaction_finished", "compaction_time_micros": 85522, "compaction_time_cpu_micros": 36448, "output_level": 6, "num_output_files": 1, "total_output_size": 12213836, "num_input_records": 9935, "num_output_records": 9401, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379889068, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379890613, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.801113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890711) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)...
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532582 (monmap changed)...
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532582 on np0005532582.localdomain
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:49:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:49:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:49:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:49:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:49:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18689 "" "Go-http-client/1.1"
Nov 23 09:49:42 np0005532585.localdomain sshd[292697]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)...
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='client.34189 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005532581.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: Added label _no_schedule to host np0005532581.localdomain
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532581.localdomain
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:43 np0005532585.localdomain sshd[292697]: Invalid user hamza from 207.154.194.2 port 38380
Nov 23 09:49:43 np0005532585.localdomain sshd[292697]: Received disconnect from 207.154.194.2 port 38380:11: Bye Bye [preauth]
Nov 23 09:49:43 np0005532585.localdomain sshd[292697]: Disconnected from invalid user hamza 207.154.194.2 port 38380 [preauth]
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='client.26986 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005532581.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)...
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain"} : dispatch
Nov 23 09:49:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain"}]': finished
Nov 23 09:49:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:44.752 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='client.34177 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005532581.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: Removed host np0005532581.localdomain
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:49:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:49:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:47 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:49:47 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:49:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:49.754 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:49.756 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:49.756 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:49.757 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:49.794 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:49.795 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:49:49 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:49:50 np0005532585.localdomain podman[292699]: 2025-11-23 09:49:50.028062884 +0000 UTC m=+0.086102584 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:49:50 np0005532585.localdomain podman[292699]: 2025-11-23 09:49:50.067355943 +0000 UTC m=+0.125395593 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 23 09:49:50 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:49:50 np0005532585.localdomain sudo[292718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:50 np0005532585.localdomain sudo[292718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:50 np0005532585.localdomain sudo[292718]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:50 np0005532585.localdomain sudo[292736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:50 np0005532585.localdomain sudo[292736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:49:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 2025-11-23 09:49:51.053586052 +0000 UTC m=+0.080574471 container create 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, version=7, vcs-type=git, name=rhceph, release=553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:49:51 np0005532585.localdomain systemd[1]: Started libpod-conmon-7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee.scope.
Nov 23 09:49:51 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 2025-11-23 09:49:51.02187244 +0000 UTC m=+0.048860849 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 2025-11-23 09:49:51.132628203 +0000 UTC m=+0.159616622 container init 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 2025-11-23 09:49:51.143216814 +0000 UTC m=+0.170205223 container start 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, vcs-type=git, name=rhceph, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=)
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 2025-11-23 09:49:51.143483783 +0000 UTC m=+0.170472192 container attach 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55)
Nov 23 09:49:51 np0005532585.localdomain frosty_babbage[292787]: 167 167
Nov 23 09:49:51 np0005532585.localdomain systemd[1]: libpod-7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee.scope: Deactivated successfully.
Nov 23 09:49:51 np0005532585.localdomain podman[292772]: 2025-11-23 09:49:51.148981505 +0000 UTC m=+0.175969984 container died 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Nov 23 09:49:51 np0005532585.localdomain podman[292792]: 2025-11-23 09:49:51.249465887 +0000 UTC m=+0.088362285 container remove 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, GIT_BRANCH=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:49:51 np0005532585.localdomain systemd[1]: libpod-conmon-7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee.scope: Deactivated successfully.
Nov 23 09:49:51 np0005532585.localdomain sudo[292736]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:51 np0005532585.localdomain sudo[292810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:51 np0005532585.localdomain sudo[292810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:51 np0005532585.localdomain sudo[292810]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:51 np0005532585.localdomain sudo[292828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:51 np0005532585.localdomain sudo[292828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:51 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:49:51 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:49:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:49:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:51 np0005532585.localdomain podman[292862]: 
Nov 23 09:49:51 np0005532585.localdomain podman[292862]: 2025-11-23 09:49:51.985268076 +0000 UTC m=+0.084092811 container create 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=)
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: Started libpod-conmon-4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8.scope.
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:52 np0005532585.localdomain podman[292862]: 2025-11-23 09:49:52.048252915 +0000 UTC m=+0.147077660 container init 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 23 09:49:52 np0005532585.localdomain podman[292862]: 2025-11-23 09:49:51.954426751 +0000 UTC m=+0.053251546 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:52 np0005532585.localdomain podman[292862]: 2025-11-23 09:49:52.059354062 +0000 UTC m=+0.158178787 container start 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, version=7, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, architecture=x86_64)
Nov 23 09:49:52 np0005532585.localdomain podman[292862]: 2025-11-23 09:49:52.059986082 +0000 UTC m=+0.158810847 container attach 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, build-date=2025-09-24T08:57:55)
Nov 23 09:49:52 np0005532585.localdomain trusting_chatterjee[292878]: 167 167
Nov 23 09:49:52 np0005532585.localdomain podman[292862]: 2025-11-23 09:49:52.063291585 +0000 UTC m=+0.162116320 container died 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-11f724b00811be9a98ca4dfeb1b38b223dc8f6265d4b293a3cb4945f5db9990e-merged.mount: Deactivated successfully.
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: libpod-4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8.scope: Deactivated successfully.
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1f64b933e49d02a720f8facc951ee760d2a42e677556b3718415408ab9b3ce83-merged.mount: Deactivated successfully.
Nov 23 09:49:52 np0005532585.localdomain podman[292883]: 2025-11-23 09:49:52.153671262 +0000 UTC m=+0.082380418 container remove 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, distribution-scope=public, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12)
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: libpod-conmon-4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8.scope: Deactivated successfully.
Nov 23 09:49:52 np0005532585.localdomain sudo[292828]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:52 np0005532585.localdomain sudo[292907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:52 np0005532585.localdomain sudo[292907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:49:52 np0005532585.localdomain sudo[292907]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:52 np0005532585.localdomain sudo[292926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:52 np0005532585.localdomain sudo[292926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:52 np0005532585.localdomain podman[292925]: 2025-11-23 09:49:52.594120034 +0000 UTC m=+0.088176869 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:49:52 np0005532585.localdomain podman[292925]: 2025-11-23 09:49:52.609296608 +0000 UTC m=+0.103353443 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:49:52 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:52 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:52 np0005532585.localdomain podman[292982]: 
Nov 23 09:49:52 np0005532585.localdomain podman[292982]: 2025-11-23 09:49:52.974772187 +0000 UTC m=+0.081836920 container create d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, name=rhceph, architecture=x86_64, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: Started libpod-conmon-d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e.scope.
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:53 np0005532585.localdomain podman[292982]: 2025-11-23 09:49:53.036776936 +0000 UTC m=+0.143841669 container init d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, distribution-scope=public, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:49:53 np0005532585.localdomain podman[292982]: 2025-11-23 09:49:52.940049431 +0000 UTC m=+0.047114224 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:53 np0005532585.localdomain stupefied_vaughan[292997]: 167 167
Nov 23 09:49:53 np0005532585.localdomain podman[292982]: 2025-11-23 09:49:53.04744629 +0000 UTC m=+0.154511033 container start d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=)
Nov 23 09:49:53 np0005532585.localdomain podman[292982]: 2025-11-23 09:49:53.047873873 +0000 UTC m=+0.154938606 container attach d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: libpod-d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e.scope: Deactivated successfully.
Nov 23 09:49:53 np0005532585.localdomain podman[292982]: 2025-11-23 09:49:53.05320574 +0000 UTC m=+0.160270503 container died d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, name=rhceph)
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d02748f416b725da51fb02cec4b710af258b4841a547b29089a2777032038f6c-merged.mount: Deactivated successfully.
Nov 23 09:49:53 np0005532585.localdomain podman[293002]: 2025-11-23 09:49:53.137289749 +0000 UTC m=+0.075888484 container remove d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main)
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: libpod-conmon-d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e.scope: Deactivated successfully.
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:49:53 np0005532585.localdomain sudo[292926]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:53 np0005532585.localdomain sudo[293026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:53 np0005532585.localdomain sudo[293026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:53 np0005532585.localdomain sudo[293026]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:53 np0005532585.localdomain sudo[293044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:53 np0005532585.localdomain sudo[293044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:53 np0005532585.localdomain podman[293080]: 
Nov 23 09:49:53 np0005532585.localdomain podman[293080]: 2025-11-23 09:49:53.931815073 +0000 UTC m=+0.077951337 container create eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, GIT_CLEAN=True, release=553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph)
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: from='client.26996 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: Saving service mon spec with placement label:mon
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:49:53 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: Started libpod-conmon-eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c.scope.
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:53 np0005532585.localdomain podman[293080]: 2025-11-23 09:49:53.989609711 +0000 UTC m=+0.135745985 container init eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:49:53 np0005532585.localdomain podman[293080]: 2025-11-23 09:49:53.995515666 +0000 UTC m=+0.141651960 container start eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:49:53 np0005532585.localdomain podman[293080]: 2025-11-23 09:49:53.995704311 +0000 UTC m=+0.141840615 container attach eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, version=7, vcs-type=git)
Nov 23 09:49:53 np0005532585.localdomain jovial_jennings[293095]: 167 167
Nov 23 09:49:53 np0005532585.localdomain systemd[1]: libpod-eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c.scope: Deactivated successfully.
Nov 23 09:49:53 np0005532585.localdomain podman[293080]: 2025-11-23 09:49:53.998685735 +0000 UTC m=+0.144822029 container died eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:49:54 np0005532585.localdomain podman[293080]: 2025-11-23 09:49:53.903764956 +0000 UTC m=+0.049901270 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f72e3ee1f1fa62996a99db9cdf51b9dd3291820925581936cff3c2d1b050f676-merged.mount: Deactivated successfully.
Nov 23 09:49:54 np0005532585.localdomain podman[293100]: 2025-11-23 09:49:54.072047248 +0000 UTC m=+0.067203453 container remove eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:49:54 np0005532585.localdomain systemd[1]: libpod-conmon-eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c.scope: Deactivated successfully.
Nov 23 09:49:54 np0005532585.localdomain sudo[293044]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:54 np0005532585.localdomain sudo[293117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:49:54 np0005532585.localdomain sudo[293117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:54 np0005532585.localdomain sudo[293117]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:54 np0005532585.localdomain sudo[293135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:49:54 np0005532585.localdomain sudo[293135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 2025-11-23 09:49:54.695414981 +0000 UTC m=+0.056063934 container create 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, distribution-scope=public)
Nov 23 09:49:54 np0005532585.localdomain systemd[1]: Started libpod-conmon-08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796.scope.
Nov 23 09:49:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 2025-11-23 09:49:54.766565496 +0000 UTC m=+0.127214499 container init 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 2025-11-23 09:49:54.669604355 +0000 UTC m=+0.030253318 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 2025-11-23 09:49:54.773088141 +0000 UTC m=+0.133737094 container start 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 2025-11-23 09:49:54.77340455 +0000 UTC m=+0.134053533 container attach 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:49:54 np0005532585.localdomain exciting_lamarr[293185]: 167 167
Nov 23 09:49:54 np0005532585.localdomain systemd[1]: libpod-08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796.scope: Deactivated successfully.
Nov 23 09:49:54 np0005532585.localdomain podman[293169]: 2025-11-23 09:49:54.776780816 +0000 UTC m=+0.137429769 container died 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:49:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:54.795 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:54.797 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:54.798 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:54.798 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:54.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:54.831 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:54 np0005532585.localdomain podman[293190]: 2025-11-23 09:49:54.866513662 +0000 UTC m=+0.078484566 container remove 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, com.redhat.component=rhceph-container, version=7, architecture=x86_64, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, CEPH_POINT_RELEASE=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:49:54 np0005532585.localdomain systemd[1]: libpod-conmon-08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796.scope: Deactivated successfully.
Nov 23 09:49:54 np0005532585.localdomain sudo[293135]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='client.27001 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532584", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:49:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:49:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7ef813d07dc3ea023aa859f645b88f97377b354847b9b9f0c4ccee6c4745cc67-merged.mount: Deactivated successfully.
Nov 23 09:49:55 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 23 09:49:55 np0005532585.localdomain ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:49:55 np0005532585.localdomain ceph-mon[289735]: paxos.3).electionLogic(28) init, last seen epoch 28
Nov 23 09:49:55 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:49:55 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:49:55 np0005532585.localdomain sudo[293206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:49:55 np0005532585.localdomain sudo[293206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:49:55 np0005532585.localdomain sudo[293206]: pam_unix(sudo:session): session closed for user root
Nov 23 09:49:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:59.831 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:49:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:59.833 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:59.834 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:49:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:59.834 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:59.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:49:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:49:59.838 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:49:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:49:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:49:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:49:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:49:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:50:00 np0005532585.localdomain podman[293224]: 2025-11-23 09:50:00.049906167 +0000 UTC m=+0.098614674 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 09:50:00 np0005532585.localdomain podman[293224]: 2025-11-23 09:50:00.084894152 +0000 UTC m=+0.133602659 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 23 09:50:00 np0005532585.localdomain podman[293226]: 2025-11-23 09:50:00.097097943 +0000 UTC m=+0.144129987 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Nov 23 09:50:00 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:50:00 np0005532585.localdomain podman[293226]: 2025-11-23 09:50:00.107305623 +0000 UTC m=+0.154337647 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm)
Nov 23 09:50:00 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:00 np0005532585.localdomain podman[293225]: 2025-11-23 09:50:00.155410497 +0000 UTC m=+0.204360561 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:00 np0005532585.localdomain podman[293225]: 2025-11-23 09:50:00.186685765 +0000 UTC m=+0.235635899 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:50:00 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='client.27006 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532584"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: Remove daemons mon.np0005532584
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: Safe to remove mon.np0005532584: new quorum should be ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585'] (from ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585'])
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: Removing monitor np0005532584 from monmap...
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: Removing daemon mon.np0005532584 from np0005532584.localdomain -- ports []
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)...
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532586,np0005532585 in quorum (ranks 0,2,3)
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586,np0005532585 in quorum (ranks 0,1,2,3)
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: monmap epoch 8
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:49:55.093726+0000
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532582
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005532585
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: osdmap e82: 6 total, 6 up, 6 in
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: mgrmap e19: np0005532583.orhywt(active, since 52s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:50:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/1282554930' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/1282554930' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)...
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.238 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)...
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/3149325585' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/1371552442' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:50:03 np0005532585.localdomain ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4224794661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.681 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.758 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.759 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.953 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.953 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11740MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.954 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:50:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:03.954 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.009 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.010 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.010 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.047 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/4224794661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:50:04 np0005532585.localdomain ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3085114768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.460 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.466 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.492 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.494 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.495 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.870 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.872 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.873 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.873 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.874 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:04.878 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/3085114768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/1725845173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:05 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:50:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:05.495 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:05.495 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:50:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:05.496 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.039 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.039 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/2255157616' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:50:06 np0005532585.localdomain ceph-mon[289735]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.543 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.559 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.559 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.560 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.560 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.560 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.561 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:06.561 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:50:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:50:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:50:07 np0005532585.localdomain podman[293332]: 2025-11-23 09:50:07.010370605 +0000 UTC m=+0.069055641 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:50:07 np0005532585.localdomain podman[293332]: 2025-11-23 09:50:07.04634608 +0000 UTC m=+0.105031116 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 23 09:50:07 np0005532585.localdomain podman[293333]: 2025-11-23 09:50:07.083243474 +0000 UTC m=+0.135015003 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:50:07 np0005532585.localdomain podman[293333]: 2025-11-23 09:50:07.088609562 +0000 UTC m=+0.140381051 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:50:07 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:50:07 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:50:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:07.275 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:50:08 np0005532585.localdomain sudo[293374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:08 np0005532585.localdomain sudo[293374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:08 np0005532585.localdomain sudo[293374]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:08 np0005532585.localdomain sudo[293392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:08 np0005532585.localdomain sudo[293392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:08 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 2025-11-23 09:50:09.267349211 +0000 UTC m=+0.077862496 container create a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, release=553, version=7, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git)
Nov 23 09:50:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:50:09.286 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:50:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:50:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:50:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:50:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:50:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b.scope.
Nov 23 09:50:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 2025-11-23 09:50:09.233246264 +0000 UTC m=+0.043759539 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 2025-11-23 09:50:09.343532323 +0000 UTC m=+0.154045558 container init a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 2025-11-23 09:50:09.355138136 +0000 UTC m=+0.165651371 container start a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 2025-11-23 09:50:09.355538619 +0000 UTC m=+0.166051894 container attach a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Nov 23 09:50:09 np0005532585.localdomain modest_kare[293442]: 167 167
Nov 23 09:50:09 np0005532585.localdomain systemd[1]: libpod-a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b.scope: Deactivated successfully.
Nov 23 09:50:09 np0005532585.localdomain podman[293427]: 2025-11-23 09:50:09.359586016 +0000 UTC m=+0.170099301 container died a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, name=rhceph, version=7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Nov 23 09:50:09 np0005532585.localdomain podman[293447]: 2025-11-23 09:50:09.451277462 +0000 UTC m=+0.081944493 container remove a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, architecture=x86_64, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=)
Nov 23 09:50:09 np0005532585.localdomain systemd[1]: libpod-conmon-a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b.scope: Deactivated successfully.
Nov 23 09:50:09 np0005532585.localdomain sudo[293392]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:09 np0005532585.localdomain sudo[293463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:09 np0005532585.localdomain sudo[293463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:09 np0005532585.localdomain sudo[293463]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:09 np0005532585.localdomain sudo[293481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:09 np0005532585.localdomain sudo[293481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: from='client.34235 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005532584.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: Deploying daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:50:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:09.878 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:09.880 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:09.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:09.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:09.914 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:09.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 2025-11-23 09:50:10.15922551 +0000 UTC m=+0.075909735 container create 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public, release=553, io.buildah.version=1.33.12, version=7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:50:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01.scope.
Nov 23 09:50:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 2025-11-23 09:50:10.221124946 +0000 UTC m=+0.137809141 container init 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, release=553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 2025-11-23 09:50:10.126977341 +0000 UTC m=+0.043661586 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 2025-11-23 09:50:10.229055294 +0000 UTC m=+0.145739489 container start 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 2025-11-23 09:50:10.229358483 +0000 UTC m=+0.146042678 container attach 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph)
Nov 23 09:50:10 np0005532585.localdomain mystifying_volhard[293530]: 167 167
Nov 23 09:50:10 np0005532585.localdomain systemd[1]: libpod-673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01.scope: Deactivated successfully.
Nov 23 09:50:10 np0005532585.localdomain podman[293515]: 2025-11-23 09:50:10.231556932 +0000 UTC m=+0.148241167 container died 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, release=553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:50:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-06e50dcc3c8545d2341a09a6b04afb1395f1433750eac236ba0367b00b232f9a-merged.mount: Deactivated successfully.
Nov 23 09:50:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b0e94a1798b46e1cee30b4fdbae8b573e0cc76171b7c9530d7aeacc5cd17ece5-merged.mount: Deactivated successfully.
Nov 23 09:50:10 np0005532585.localdomain podman[293535]: 2025-11-23 09:50:10.319338737 +0000 UTC m=+0.079013852 container remove 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container)
Nov 23 09:50:10 np0005532585.localdomain systemd[1]: libpod-conmon-673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01.scope: Deactivated successfully.
Nov 23 09:50:10 np0005532585.localdomain sudo[293481]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:10 np0005532585.localdomain sudo[293557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:10 np0005532585.localdomain sudo[293557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:10 np0005532585.localdomain sudo[293557]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:10 np0005532585.localdomain sudo[293575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:10 np0005532585.localdomain sudo[293575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:50:10 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 2025-11-23 09:50:11.149732454 +0000 UTC m=+0.066877173 container create 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, io.buildah.version=1.33.12, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, architecture=x86_64, GIT_CLEAN=True)
Nov 23 09:50:11 np0005532585.localdomain systemd[1]: Started libpod-conmon-5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b.scope.
Nov 23 09:50:11 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 2025-11-23 09:50:11.208489781 +0000 UTC m=+0.125634520 container init 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553)
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 2025-11-23 09:50:11.117854146 +0000 UTC m=+0.034998905 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 2025-11-23 09:50:11.218299507 +0000 UTC m=+0.135444246 container start 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, build-date=2025-09-24T08:57:55, name=rhceph, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=)
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 2025-11-23 09:50:11.218577836 +0000 UTC m=+0.135722595 container attach 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:50:11 np0005532585.localdomain busy_jennings[293624]: 167 167
Nov 23 09:50:11 np0005532585.localdomain systemd[1]: libpod-5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b.scope: Deactivated successfully.
Nov 23 09:50:11 np0005532585.localdomain podman[293609]: 2025-11-23 09:50:11.221119485 +0000 UTC m=+0.138264254 container died 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, distribution-scope=public)
Nov 23 09:50:11 np0005532585.localdomain systemd[1]: tmp-crun.SzEMlR.mount: Deactivated successfully.
Nov 23 09:50:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ef2e47773231d596c7cbcce8ce45a7a6cac7a559ab80c6d1847d2aa2e66e9fb-merged.mount: Deactivated successfully.
Nov 23 09:50:11 np0005532585.localdomain podman[293629]: 2025-11-23 09:50:11.318592484 +0000 UTC m=+0.086007721 container remove 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:50:11 np0005532585.localdomain systemd[1]: libpod-conmon-5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b.scope: Deactivated successfully.
Nov 23 09:50:11 np0005532585.localdomain sudo[293575]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:11 np0005532585.localdomain sudo[293651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:11 np0005532585.localdomain sudo[293651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:11 np0005532585.localdomain sudo[293651]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:11 np0005532585.localdomain sudo[293669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:11 np0005532585.localdomain sudo[293669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:11 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:50:11 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:50:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:50:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:50:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:50:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:50:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:50:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18707 "" "Go-http-client/1.1"
Nov 23 09:50:12 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Nov 23 09:50:12 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 2025-11-23 09:50:12.122530193 +0000 UTC m=+0.064303032 container create 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7)
Nov 23 09:50:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1.scope.
Nov 23 09:50:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 2025-11-23 09:50:12.182521048 +0000 UTC m=+0.124293887 container init 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55)
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 2025-11-23 09:50:12.191434927 +0000 UTC m=+0.133207726 container start 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 2025-11-23 09:50:12.191648405 +0000 UTC m=+0.133421284 container attach 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, release=553, CEPH_POINT_RELEASE=)
Nov 23 09:50:12 np0005532585.localdomain dreamy_murdock[293718]: 167 167
Nov 23 09:50:12 np0005532585.localdomain systemd[1]: libpod-5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1.scope: Deactivated successfully.
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 2025-11-23 09:50:12.194942567 +0000 UTC m=+0.136715416 container died 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7)
Nov 23 09:50:12 np0005532585.localdomain podman[293703]: 2025-11-23 09:50:12.105142809 +0000 UTC m=+0.046915638 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:12 np0005532585.localdomain podman[293723]: 2025-11-23 09:50:12.264863994 +0000 UTC m=+0.064472017 container remove 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, version=7, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:50:12 np0005532585.localdomain systemd[1]: libpod-conmon-5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1.scope: Deactivated successfully.
Nov 23 09:50:12 np0005532585.localdomain sudo[293669]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:12 np0005532585.localdomain sudo[293739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:12 np0005532585.localdomain sudo[293739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:12 np0005532585.localdomain sudo[293739]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:12 np0005532585.localdomain sudo[293757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:12 np0005532585.localdomain sudo[293757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:12 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Nov 23 09:50:12 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 23 09:50:12 np0005532585.localdomain ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:50:12 np0005532585.localdomain ceph-mon[289735]: paxos.3).electionLogic(34) init, last seen epoch 34
Nov 23 09:50:12 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:12 np0005532585.localdomain podman[293792]: 
Nov 23 09:50:12 np0005532585.localdomain podman[293792]: 2025-11-23 09:50:12.932595844 +0000 UTC m=+0.081854321 container create d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, RELEASE=main, vcs-type=git)
Nov 23 09:50:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65.scope.
Nov 23 09:50:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:12 np0005532585.localdomain podman[293792]: 2025-11-23 09:50:12.900507301 +0000 UTC m=+0.049765788 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:13 np0005532585.localdomain podman[293792]: 2025-11-23 09:50:13.007884658 +0000 UTC m=+0.157143135 container init d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553)
Nov 23 09:50:13 np0005532585.localdomain podman[293792]: 2025-11-23 09:50:13.018016284 +0000 UTC m=+0.167274761 container start d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:50:13 np0005532585.localdomain podman[293792]: 2025-11-23 09:50:13.018283084 +0000 UTC m=+0.167541611 container attach d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, release=553, vendor=Red Hat, Inc., distribution-scope=public, version=7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main)
Nov 23 09:50:13 np0005532585.localdomain loving_noether[293807]: 167 167
Nov 23 09:50:13 np0005532585.localdomain systemd[1]: libpod-d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65.scope: Deactivated successfully.
Nov 23 09:50:13 np0005532585.localdomain podman[293792]: 2025-11-23 09:50:13.020783271 +0000 UTC m=+0.170041808 container died d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, architecture=x86_64, vcs-type=git, release=553, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main)
Nov 23 09:50:13 np0005532585.localdomain podman[293812]: 2025-11-23 09:50:13.114609535 +0000 UTC m=+0.081441877 container remove d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Nov 23 09:50:13 np0005532585.localdomain systemd[1]: libpod-conmon-d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65.scope: Deactivated successfully.
Nov 23 09:50:13 np0005532585.localdomain sudo[293757]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3115e936c8354cf2c48908567ba22294542a7e35221659601213687c3198cf86-merged.mount: Deactivated successfully.
Nov 23 09:50:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:14.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:15 np0005532585.localdomain sshd[293828]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:50:16 np0005532585.localdomain sshd[293828]: Invalid user packer from 107.172.15.139 port 35344
Nov 23 09:50:16 np0005532585.localdomain sshd[293828]: Received disconnect from 107.172.15.139 port 35344:11: Bye Bye [preauth]
Nov 23 09:50:16 np0005532585.localdomain sshd[293828]: Disconnected from invalid user packer 107.172.15.139 port 35344 [preauth]
Nov 23 09:50:17 np0005532585.localdomain ceph-mds[287052]: mds.beacon.mds.np0005532585.jcltnl missed beacon ack from the monitors
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: paxos.3).electionLogic(35) init, last seen epoch 35, mid-election, bumping
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532584 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586 in quorum (ranks 0,1,2)
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3,4)
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: monmap epoch 9
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:50:12.504455+0000
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532582
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005532585
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: osdmap e82: 6 total, 6 up, 6 in
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: mgrmap e19: np0005532583.orhywt(active, since 69s), standbys: np0005532582.gilwrz, np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:18 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:18 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:50:18 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:50:18 np0005532585.localdomain ceph-mon[289735]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:19.920 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:19.922 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:19.922 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:19.922 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:19.951 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:19.952 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:50:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:50:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:50:20 np0005532585.localdomain ceph-mon[289735]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:20 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:50:21 np0005532585.localdomain podman[293830]: 2025-11-23 09:50:21.027361977 +0000 UTC m=+0.081318073 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:50:21 np0005532585.localdomain podman[293830]: 2025-11-23 09:50:21.037955818 +0000 UTC m=+0.091911924 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:50:21 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:22 np0005532585.localdomain sudo[293848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:22 np0005532585.localdomain sudo[293848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:50:22 np0005532585.localdomain sudo[293848]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:22 np0005532585.localdomain sudo[293872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:50:22 np0005532585.localdomain sudo[293872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:22 np0005532585.localdomain podman[293866]: 2025-11-23 09:50:22.781355455 +0000 UTC m=+0.080316623 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:50:22 np0005532585.localdomain podman[293866]: 2025-11-23 09:50:22.791551373 +0000 UTC m=+0.090512511 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:50:22 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:50:22 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:50:22 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:50:22 np0005532585.localdomain ceph-mon[289735]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:22 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:22 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:23 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:23 np0005532585.localdomain sudo[293872]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:23 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.200:0/681254074' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:50:24 np0005532585.localdomain ceph-mon[289735]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:24.952 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:24.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:24.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:24.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:24.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:24.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:25 np0005532585.localdomain sudo[293938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:50:25 np0005532585.localdomain sudo[293938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[293938]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain sudo[293956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:50:25 np0005532585.localdomain sudo[293956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[293956]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain sudo[293974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:25 np0005532585.localdomain sudo[293974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[293974]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain sudo[293992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:25 np0005532585.localdomain sudo[293992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[293992]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain sudo[294010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:25 np0005532585.localdomain sudo[294010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[294010]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain sudo[294044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:25 np0005532585.localdomain sudo[294044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[294044]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain sudo[294062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:25 np0005532585.localdomain sudo[294062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[294062]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='client.34248 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: Reconfig service osd.default_drive_group
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:50:25 np0005532585.localdomain sudo[294080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:50:25 np0005532585.localdomain sudo[294080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:25 np0005532585.localdomain sudo[294080]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:26 np0005532585.localdomain sudo[294098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294098]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:26 np0005532585.localdomain sudo[294116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294116]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:26 np0005532585.localdomain sudo[294134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294134]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:26 np0005532585.localdomain sudo[294152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294152]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:26 np0005532585.localdomain sudo[294170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294170]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:26 np0005532585.localdomain sudo[294204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294204]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:26 np0005532585.localdomain sudo[294222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294222]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain sudo[294240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain sudo[294240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:26 np0005532585.localdomain sudo[294240]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e9 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3357125401' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 e83: 6 total, 6 up, 6 in
Nov 23 09:50:26 np0005532585.localdomain sshd[290624]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:50:26 np0005532585.localdomain systemd[1]: session-64.scope: Deactivated successfully.
Nov 23 09:50:26 np0005532585.localdomain systemd[1]: session-64.scope: Consumed 21.368s CPU time.
Nov 23 09:50:26 np0005532585.localdomain systemd-logind[761]: Session 64 logged out. Waiting for processes to exit.
Nov 23 09:50:26 np0005532585.localdomain systemd-logind[761]: Removed session 64.
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.200:0/3357125401' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Activating manager daemon np0005532582.gilwrz
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: osdmap e83: 6 total, 6 up, 6 in
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: mgrmap e20: np0005532582.gilwrz(active, starting, since 0.0482114s), standbys: np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' 
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mds metadata", "who": "mds.np0005532585.jcltnl"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mds metadata", "who": "mds.np0005532584.aoxjmw"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mds metadata", "who": "mds.np0005532586.mfohsb"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr metadata", "who": "np0005532582.gilwrz", "id": "np0005532582.gilwrz"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr metadata", "who": "np0005532581.sxlgsx", "id": "np0005532581.sxlgsx"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mds metadata"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mon metadata"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: Manager daemon np0005532582.gilwrz is now available
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"}]': finished
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch
Nov 23 09:50:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"}]': finished
Nov 23 09:50:27 np0005532585.localdomain sshd[294258]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:50:27 np0005532585.localdomain sshd[294258]: Accepted publickey for ceph-admin from 192.168.122.104 port 56046 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 09:50:27 np0005532585.localdomain systemd-logind[761]: New session 65 of user ceph-admin.
Nov 23 09:50:27 np0005532585.localdomain systemd[1]: Started Session 65 of User ceph-admin.
Nov 23 09:50:27 np0005532585.localdomain sshd[294258]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 09:50:27 np0005532585.localdomain sudo[294262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:27 np0005532585.localdomain sudo[294262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:27 np0005532585.localdomain sudo[294262]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:27 np0005532585.localdomain sudo[294280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:50:27 np0005532585.localdomain sudo[294280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:27 np0005532585.localdomain ceph-mon[289735]: removing stray HostCache host record np0005532581.localdomain.devices.0
Nov 23 09:50:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/mirror_snapshot_schedule"} : dispatch
Nov 23 09:50:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/mirror_snapshot_schedule"} : dispatch
Nov 23 09:50:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/trash_purge_schedule"} : dispatch
Nov 23 09:50:27 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/trash_purge_schedule"} : dispatch
Nov 23 09:50:27 np0005532585.localdomain ceph-mon[289735]: mgrmap e21: np0005532582.gilwrz(active, since 1.11447s), standbys: np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:50:28 np0005532585.localdomain podman[294373]: 2025-11-23 09:50:28.106686148 +0000 UTC m=+0.065670245 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64)
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:28 np0005532585.localdomain podman[294373]: 2025-11-23 09:50:28.206595342 +0000 UTC m=+0.165579449 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True)
Nov 23 09:50:28 np0005532585.localdomain sudo[294280]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:28 np0005532585.localdomain sudo[294496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:28 np0005532585.localdomain sudo[294496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:28 np0005532585.localdomain sudo[294496]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:28 np0005532585.localdomain sudo[294514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:50:28 np0005532585.localdomain sudo[294514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:29 np0005532585.localdomain sudo[294514]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:29 np0005532585.localdomain sudo[294563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:29 np0005532585.localdomain sudo[294563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:29 np0005532585.localdomain sudo[294563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:29 np0005532585.localdomain sudo[294581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:50:29 np0005532585.localdomain sudo[294581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:50:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Bus STARTING
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Serving on https://172.18.0.104:7150
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Client ('172.18.0.104', 33588) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Serving on http://172.18.0.104:8765
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Bus STARTED
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:29 np0005532585.localdomain ceph-mon[289735]: mgrmap e22: np0005532582.gilwrz(active, since 3s), standbys: np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx
Nov 23 09:50:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:29.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:30 np0005532585.localdomain sudo[294581]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:30 np0005532585.localdomain sudo[294619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:50:30 np0005532585.localdomain sudo[294619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:30 np0005532585.localdomain sudo[294619]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:50:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:50:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:50:30 np0005532585.localdomain sudo[294640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:50:30 np0005532585.localdomain sudo[294640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:30 np0005532585.localdomain sudo[294640]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:30 np0005532585.localdomain podman[294637]: 2025-11-23 09:50:30.902612077 +0000 UTC m=+0.100621638 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:50:30 np0005532585.localdomain sudo[294688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:30 np0005532585.localdomain sudo[294688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:30 np0005532585.localdomain podman[294638]: 2025-11-23 09:50:30.958980599 +0000 UTC m=+0.156539726 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:50:30 np0005532585.localdomain sudo[294688]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:30 np0005532585.localdomain podman[294637]: 2025-11-23 09:50:30.992337263 +0000 UTC m=+0.190346834 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 23 09:50:31 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:50:31 np0005532585.localdomain systemd[1]: tmp-crun.iGezYM.mount: Deactivated successfully.
Nov 23 09:50:31 np0005532585.localdomain podman[294639]: 2025-11-23 09:50:31.01912489 +0000 UTC m=+0.213100045 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:50:31 np0005532585.localdomain sudo[294722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:31 np0005532585.localdomain sudo[294722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294722]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain podman[294638]: 2025-11-23 09:50:31.043649357 +0000 UTC m=+0.241208514 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 23 09:50:31 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:50:31 np0005532585.localdomain podman[294639]: 2025-11-23 09:50:31.065400207 +0000 UTC m=+0.259375372 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 23 09:50:31 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:50:31 np0005532585.localdomain sudo[294751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294751]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:31 np0005532585.localdomain ceph-mon[289735]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:31 np0005532585.localdomain sudo[294786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294786]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294804]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:50:31 np0005532585.localdomain sudo[294822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294822]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:31 np0005532585.localdomain sudo[294840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:31 np0005532585.localdomain sudo[294858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294858]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294876]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:31 np0005532585.localdomain sudo[294894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294894]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294912]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294946]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:31 np0005532585.localdomain sudo[294964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:31 np0005532585.localdomain sudo[294964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:31 np0005532585.localdomain sudo[294964]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[294982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:32 np0005532585.localdomain sudo[294982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[294982]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:50:32 np0005532585.localdomain sudo[295000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295000]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:50:32 np0005532585.localdomain sudo[295018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295018]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Standby manager daemon np0005532583.orhywt started
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: mgrmap e23: np0005532582.gilwrz(active, since 5s), standbys: np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532581.sxlgsx, np0005532583.orhywt
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr metadata", "who": "np0005532583.orhywt", "id": "np0005532583.orhywt"} : dispatch
Nov 23 09:50:32 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:32 np0005532585.localdomain sudo[295036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295036]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:32 np0005532585.localdomain sudo[295054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295054]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295072]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295106]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295124]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:32 np0005532585.localdomain sudo[295142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295142]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:32 np0005532585.localdomain sudo[295160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295160]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:32 np0005532585.localdomain sudo[295178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295178]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295196]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:32 np0005532585.localdomain sudo[295214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295214]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295232]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:32 np0005532585.localdomain sudo[295266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:32 np0005532585.localdomain sudo[295266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:32 np0005532585.localdomain sudo[295266]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:33 np0005532585.localdomain sudo[295284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:33 np0005532585.localdomain sudo[295284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:33 np0005532585.localdomain sudo[295284]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:33 np0005532585.localdomain sudo[295302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain sudo[295302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:33 np0005532585.localdomain sudo[295302]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:33 np0005532585.localdomain sudo[295320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:50:33 np0005532585.localdomain sudo[295320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:33 np0005532585.localdomain sudo[295320]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)...
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.871676) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434871720, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2648, "num_deletes": 255, "total_data_size": 8169082, "memory_usage": 8738464, "flush_reason": "Manual Compaction"}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434898863, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 4911183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12415, "largest_seqno": 15058, "table_properties": {"data_size": 4900044, "index_size": 6876, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 28832, "raw_average_key_size": 22, "raw_value_size": 4875651, "raw_average_value_size": 3854, "num_data_blocks": 297, "num_entries": 1265, "num_filter_entries": 1265, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891379, "oldest_key_time": 1763891379, "file_creation_time": 1763891434, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 27267 microseconds, and 6918 cpu microseconds.
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.898940) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 4911183 bytes OK
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.898965) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.900446) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.900462) EVENT_LOG_v1 {"time_micros": 1763891434900457, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.900481) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8156195, prev total WAL file size 8156195, number of live WAL files 2.
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.901612) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4796KB)], [18(11MB)]
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434901641, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17125019, "oldest_snapshot_seqno": -1}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10114 keys, 15126695 bytes, temperature: kUnknown
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434975365, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 15126695, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15066385, "index_size": 33905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 270406, "raw_average_key_size": 26, "raw_value_size": 14890600, "raw_average_value_size": 1472, "num_data_blocks": 1306, "num_entries": 10114, "num_filter_entries": 10114, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891434, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.975682) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 15126695 bytes
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.981044) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 231.9 rd, 204.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.7, 11.6 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(6.6) write-amplify(3.1) OK, records in: 10666, records dropped: 552 output_compression: NoCompression
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.981079) EVENT_LOG_v1 {"time_micros": 1763891434981064, "job": 8, "event": "compaction_finished", "compaction_time_micros": 73838, "compaction_time_cpu_micros": 21403, "output_level": 6, "num_output_files": 1, "total_output_size": 15126695, "num_input_records": 10666, "num_output_records": 10114, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434981928, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434983725, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.901541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.983788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:34 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:34.996 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:34.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:34.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:34.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:35.025 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:35.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)...
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:35 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)...
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:50:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:38 np0005532585.localdomain podman[295339]: 2025-11-23 09:50:38.037090927 +0000 UTC m=+0.085130567 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:50:38 np0005532585.localdomain podman[295339]: 2025-11-23 09:50:38.050247583 +0000 UTC m=+0.098287233 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:50:38 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:50:38 np0005532585.localdomain podman[295338]: 2025-11-23 09:50:38.016311956 +0000 UTC m=+0.070113124 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:50:38 np0005532585.localdomain podman[295338]: 2025-11-23 09:50:38.100413111 +0000 UTC m=+0.154214239 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:50:38 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:50:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.856744) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439856811, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 459, "num_deletes": 254, "total_data_size": 475331, "memory_usage": 486184, "flush_reason": "Manual Compaction"}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439860555, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 290260, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15063, "largest_seqno": 15517, "table_properties": {"data_size": 287465, "index_size": 842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6400, "raw_average_key_size": 18, "raw_value_size": 281734, "raw_average_value_size": 795, "num_data_blocks": 33, "num_entries": 354, "num_filter_entries": 354, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891435, "oldest_key_time": 1763891435, "file_creation_time": 1763891439, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 3859 microseconds, and 1773 cpu microseconds.
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.860605) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 290260 bytes OK
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.860631) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862091) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862114) EVENT_LOG_v1 {"time_micros": 1763891439862107, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862134) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 472392, prev total WAL file size 472392, number of live WAL files 2.
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862932) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303130' seq:72057594037927935, type:22 .. '6B760031323635' seq:0, type:0; will stop at (end)
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(283KB)], [21(14MB)]
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439863010, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15416955, "oldest_snapshot_seqno": -1}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 9939 keys, 14425287 bytes, temperature: kUnknown
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439940422, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14425287, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14367161, "index_size": 32169, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24901, "raw_key_size": 268375, "raw_average_key_size": 27, "raw_value_size": 14195328, "raw_average_value_size": 1428, "num_data_blocks": 1215, "num_entries": 9939, "num_filter_entries": 9939, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891439, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.940988) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14425287 bytes
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.942636) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.4 rd, 185.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.4 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(102.8) write-amplify(49.7) OK, records in: 10468, records dropped: 529 output_compression: NoCompression
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.942668) EVENT_LOG_v1 {"time_micros": 1763891439942654, "job": 10, "event": "compaction_finished", "compaction_time_micros": 77712, "compaction_time_cpu_micros": 41715, "output_level": 6, "num_output_files": 1, "total_output_size": 14425287, "num_input_records": 10468, "num_output_records": 9939, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439942856, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439945203, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945258) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:39 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:50:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:40.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:40.028 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:40.028 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:40.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:40.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:40.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='client.34288 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:40 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:41 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:50:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:50:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:50:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:50:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:50:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18701 "" "Go-http-client/1.1"
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: from='client.27101 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: Saving service mon spec with placement label:mon
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:50:42 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:43 np0005532585.localdomain sudo[295380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:43 np0005532585.localdomain sudo[295380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:43 np0005532585.localdomain sudo[295380]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:43 np0005532585.localdomain sudo[295398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:43 np0005532585.localdomain sudo[295398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: from='client.34293 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532584", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:50:43 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:43 np0005532585.localdomain podman[295432]: 
Nov 23 09:50:43 np0005532585.localdomain podman[295432]: 2025-11-23 09:50:43.957810066 +0000 UTC m=+0.063935554 container create 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64)
Nov 23 09:50:43 np0005532585.localdomain systemd[1]: Started libpod-conmon-9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f.scope.
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:44 np0005532585.localdomain podman[295432]: 2025-11-23 09:50:43.925231871 +0000 UTC m=+0.031357329 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:44 np0005532585.localdomain podman[295432]: 2025-11-23 09:50:44.028865688 +0000 UTC m=+0.134991116 container init 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, vcs-type=git, ceph=True)
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: tmp-crun.w25BR2.mount: Deactivated successfully.
Nov 23 09:50:44 np0005532585.localdomain podman[295432]: 2025-11-23 09:50:44.043965064 +0000 UTC m=+0.150090482 container start 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:50:44 np0005532585.localdomain podman[295432]: 2025-11-23 09:50:44.044949114 +0000 UTC m=+0.151074562 container attach 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, release=553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:50:44 np0005532585.localdomain laughing_lovelace[295448]: 167 167
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: libpod-9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f.scope: Deactivated successfully.
Nov 23 09:50:44 np0005532585.localdomain podman[295432]: 2025-11-23 09:50:44.051118734 +0000 UTC m=+0.157244232 container died 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, vendor=Red Hat, Inc., version=7, ceph=True)
Nov 23 09:50:44 np0005532585.localdomain podman[295453]: 2025-11-23 09:50:44.152617955 +0000 UTC m=+0.089152090 container remove 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=553)
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: libpod-conmon-9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f.scope: Deactivated successfully.
Nov 23 09:50:44 np0005532585.localdomain sudo[295398]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:44 np0005532585.localdomain sudo[295469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:44 np0005532585.localdomain sudo[295469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:44 np0005532585.localdomain sudo[295469]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:44 np0005532585.localdomain sudo[295487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:44 np0005532585.localdomain sudo[295487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.200:0/94382026' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:50:44 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 2025-11-23 09:50:44.814190586 +0000 UTC m=+0.065110320 container create 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553)
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: Started libpod-conmon-6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef.scope.
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 2025-11-23 09:50:44.882272166 +0000 UTC m=+0.133191910 container init 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, io.openshift.expose-services=)
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 2025-11-23 09:50:44.78319702 +0000 UTC m=+0.034116774 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 2025-11-23 09:50:44.894879405 +0000 UTC m=+0.145799139 container start 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55)
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 2025-11-23 09:50:44.895148134 +0000 UTC m=+0.146067888 container attach 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:50:44 np0005532585.localdomain nifty_liskov[295539]: 167 167
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: libpod-6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef.scope: Deactivated successfully.
Nov 23 09:50:44 np0005532585.localdomain podman[295523]: 2025-11-23 09:50:44.897756114 +0000 UTC m=+0.148675878 container died 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-917432b2277fe1b1fa97d837ab7ed86a34cd6fd4d8ff5631b4888c907f975a36-merged.mount: Deactivated successfully.
Nov 23 09:50:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-dd0aba95c9dc7660b2050c506df3a775fc72ef170499f6c3180ebd13d2879ebe-merged.mount: Deactivated successfully.
Nov 23 09:50:44 np0005532585.localdomain podman[295544]: 2025-11-23 09:50:44.998914365 +0000 UTC m=+0.088689747 container remove 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-09-24T08:57:55, ceph=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=)
Nov 23 09:50:45 np0005532585.localdomain systemd[1]: libpod-conmon-6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef.scope: Deactivated successfully.
Nov 23 09:50:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:45.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:45 np0005532585.localdomain sudo[295487]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:45 np0005532585.localdomain sudo[295568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:45 np0005532585.localdomain sudo[295568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:45 np0005532585.localdomain sudo[295568]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:45 np0005532585.localdomain sudo[295586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:45 np0005532585.localdomain sudo[295586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 2025-11-23 09:50:45.824250637 +0000 UTC m=+0.067498823 container create 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:50:45 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:45 np0005532585.localdomain systemd[1]: Started libpod-conmon-41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d.scope.
Nov 23 09:50:45 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 2025-11-23 09:50:45.788186524 +0000 UTC m=+0.031434840 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 2025-11-23 09:50:45.896372412 +0000 UTC m=+0.139620598 container init 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, version=7, architecture=x86_64, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main)
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 2025-11-23 09:50:45.903643437 +0000 UTC m=+0.146891613 container start 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, distribution-scope=public)
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 2025-11-23 09:50:45.903925016 +0000 UTC m=+0.147173252 container attach 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vcs-type=git, RELEASE=main, distribution-scope=public, GIT_BRANCH=main)
Nov 23 09:50:45 np0005532585.localdomain friendly_sanderson[295637]: 167 167
Nov 23 09:50:45 np0005532585.localdomain systemd[1]: libpod-41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d.scope: Deactivated successfully.
Nov 23 09:50:45 np0005532585.localdomain podman[295622]: 2025-11-23 09:50:45.906595198 +0000 UTC m=+0.149843394 container died 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, release=553, io.buildah.version=1.33.12, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Nov 23 09:50:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-478051fc5f52bdfc203655a6c47efb3e04afb9e5da28239d050065c7dbd05235-merged.mount: Deactivated successfully.
Nov 23 09:50:46 np0005532585.localdomain podman[295642]: 2025-11-23 09:50:46.002853907 +0000 UTC m=+0.087575562 container remove 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Nov 23 09:50:46 np0005532585.localdomain systemd[1]: libpod-conmon-41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d.scope: Deactivated successfully.
Nov 23 09:50:46 np0005532585.localdomain sudo[295586]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:46 np0005532585.localdomain sudo[295666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:46 np0005532585.localdomain sudo[295666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:46 np0005532585.localdomain sudo[295666]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:46 np0005532585.localdomain sudo[295684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:46 np0005532585.localdomain sudo[295684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 2025-11-23 09:50:46.81297057 +0000 UTC m=+0.062662614 container create 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, RELEASE=main, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:50:46 np0005532585.localdomain systemd[1]: Started libpod-conmon-9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a.scope.
Nov 23 09:50:46 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:50:46 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 2025-11-23 09:50:46.879936536 +0000 UTC m=+0.129628580 container init 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 2025-11-23 09:50:46.890017537 +0000 UTC m=+0.139709551 container start 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 2025-11-23 09:50:46.890242384 +0000 UTC m=+0.139934428 container attach 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=553)
Nov 23 09:50:46 np0005532585.localdomain sharp_benz[295735]: 167 167
Nov 23 09:50:46 np0005532585.localdomain systemd[1]: libpod-9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a.scope: Deactivated successfully.
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 2025-11-23 09:50:46.793488079 +0000 UTC m=+0.043180113 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:46 np0005532585.localdomain podman[295720]: 2025-11-23 09:50:46.892985549 +0000 UTC m=+0.142677603 container died 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True)
Nov 23 09:50:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5a97c8649ddad270819df4c60c1a350127d60bddbf3b62ad28ce97e19a5a9013-merged.mount: Deactivated successfully.
Nov 23 09:50:46 np0005532585.localdomain podman[295740]: 2025-11-23 09:50:46.988349581 +0000 UTC m=+0.084846479 container remove 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, RELEASE=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:50:46 np0005532585.localdomain systemd[1]: libpod-conmon-9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a.scope: Deactivated successfully.
Nov 23 09:50:47 np0005532585.localdomain sudo[295684]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:47 np0005532585.localdomain sudo[295756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:47 np0005532585.localdomain sudo[295756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:47 np0005532585.localdomain sudo[295756]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:47 np0005532585.localdomain sudo[295774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:47 np0005532585.localdomain sudo[295774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 2025-11-23 09:50:47.685506919 +0000 UTC m=+0.074979475 container create e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Nov 23 09:50:47 np0005532585.localdomain systemd[1]: Started libpod-conmon-e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec.scope.
Nov 23 09:50:47 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 2025-11-23 09:50:47.741071972 +0000 UTC m=+0.130544538 container init e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, io.openshift.expose-services=, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main)
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 2025-11-23 09:50:47.749813613 +0000 UTC m=+0.139286179 container start e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc.)
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 2025-11-23 09:50:47.750137562 +0000 UTC m=+0.139610118 container attach e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True)
Nov 23 09:50:47 np0005532585.localdomain practical_mirzakhani[295824]: 167 167
Nov 23 09:50:47 np0005532585.localdomain systemd[1]: libpod-e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec.scope: Deactivated successfully.
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 2025-11-23 09:50:47.752432042 +0000 UTC m=+0.141904618 container died e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, vendor=Red Hat, Inc.)
Nov 23 09:50:47 np0005532585.localdomain podman[295809]: 2025-11-23 09:50:47.654727479 +0000 UTC m=+0.044200065 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:47 np0005532585.localdomain podman[295829]: 2025-11-23 09:50:47.848159967 +0000 UTC m=+0.082482597 container remove e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.33.12, version=7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:50:47 np0005532585.localdomain systemd[1]: libpod-conmon-e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec.scope: Deactivated successfully.
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:50:47 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:47 np0005532585.localdomain sudo[295774]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:47 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d3572bb347915371c7593debeb562cd783f888af3a100444ab4b71a2119ae1a9-merged.mount: Deactivated successfully.
Nov 23 09:50:48 np0005532585.localdomain sudo[295846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:48 np0005532585.localdomain sudo[295846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:48 np0005532585.localdomain sudo[295846]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:48 np0005532585.localdomain sudo[295864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:48 np0005532585.localdomain sudo[295864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:48 np0005532585.localdomain sshd[295882]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 2025-11-23 09:50:48.572612156 +0000 UTC m=+0.078216144 container create cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, RELEASE=main, name=rhceph, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 23 09:50:48 np0005532585.localdomain systemd[1]: Started libpod-conmon-cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a.scope.
Nov 23 09:50:48 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 2025-11-23 09:50:48.638926172 +0000 UTC m=+0.144530120 container init cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, version=7, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 2025-11-23 09:50:48.541669641 +0000 UTC m=+0.047273629 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 2025-11-23 09:50:48.647746814 +0000 UTC m=+0.153350772 container start cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553, distribution-scope=public)
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 2025-11-23 09:50:48.648083464 +0000 UTC m=+0.153687472 container attach cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:50:48 np0005532585.localdomain romantic_almeida[295916]: 167 167
Nov 23 09:50:48 np0005532585.localdomain systemd[1]: libpod-cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a.scope: Deactivated successfully.
Nov 23 09:50:48 np0005532585.localdomain podman[295901]: 2025-11-23 09:50:48.652512411 +0000 UTC m=+0.158116709 container died cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.expose-services=, version=7, release=553)
Nov 23 09:50:48 np0005532585.localdomain podman[295921]: 2025-11-23 09:50:48.766504467 +0000 UTC m=+0.098151249 container remove cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=553, vendor=Red Hat, Inc., version=7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git)
Nov 23 09:50:48 np0005532585.localdomain systemd[1]: libpod-conmon-cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a.scope: Deactivated successfully.
Nov 23 09:50:48 np0005532585.localdomain sshd[295882]: Invalid user mysql from 207.154.194.2 port 51810
Nov 23 09:50:48 np0005532585.localdomain sudo[295864]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:48 np0005532585.localdomain sshd[295882]: Received disconnect from 207.154.194.2 port 51810:11: Bye Bye [preauth]
Nov 23 09:50:48 np0005532585.localdomain sshd[295882]: Disconnected from invalid user mysql 207.154.194.2 port 51810 [preauth]
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:50:48 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e12745678485b3c6ddaf9e0b8d28ee823299482ab090dcc5737606e5e23bdca7-merged.mount: Deactivated successfully.
Nov 23 09:50:49 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:50:49 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.200:0/345064794' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 23 09:50:49 np0005532585.localdomain ceph-mon[289735]: mgrmap e24: np0005532582.gilwrz(active, since 23s), standbys: np0005532584.naxwxy, np0005532585.gzafiw, np0005532586.thmvqb, np0005532583.orhywt
Nov 23 09:50:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:50.067 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:50.070 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:50.070 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:50.071 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:50.113 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:50.114 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' 
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:50:50 np0005532585.localdomain ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 e84: 6 total, 6 up, 6 in
Nov 23 09:50:51 np0005532585.localdomain sshd[294258]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:50:51 np0005532585.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Nov 23 09:50:51 np0005532585.localdomain systemd[1]: session-65.scope: Consumed 9.604s CPU time.
Nov 23 09:50:51 np0005532585.localdomain systemd-logind[761]: Session 65 logged out. Waiting for processes to exit.
Nov 23 09:50:51 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:50:51 np0005532585.localdomain systemd-logind[761]: Removed session 65.
Nov 23 09:50:51 np0005532585.localdomain podman[295938]: 2025-11-23 09:50:51.216772059 +0000 UTC m=+0.064218442 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Nov 23 09:50:51 np0005532585.localdomain podman[295938]: 2025-11-23 09:50:51.227593013 +0000 UTC m=+0.075039386 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:50:51 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:50:51 np0005532585.localdomain sshd[295958]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:50:51 np0005532585.localdomain sshd[295958]: Accepted publickey for ceph-admin from 192.168.122.106 port 54498 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 09:50:51 np0005532585.localdomain systemd-logind[761]: New session 66 of user ceph-admin.
Nov 23 09:50:51 np0005532585.localdomain systemd[1]: Started Session 66 of User ceph-admin.
Nov 23 09:50:51 np0005532585.localdomain sshd[295958]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 09:50:51 np0005532585.localdomain sudo[295962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:51 np0005532585.localdomain sudo[295962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:51 np0005532585.localdomain sudo[295962]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:51 np0005532585.localdomain sudo[295980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:50:51 np0005532585.localdomain sudo[295980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.200:0/3363667457' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: Activating manager daemon np0005532584.naxwxy
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: osdmap e84: 6 total, 6 up, 6 in
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: mgrmap e25: np0005532584.naxwxy(active, starting, since 0.034012s), standbys: np0005532585.gzafiw, np0005532586.thmvqb, np0005532583.orhywt
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532582"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata", "who": "mds.np0005532585.jcltnl"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata", "who": "mds.np0005532584.aoxjmw"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata", "who": "mds.np0005532586.mfohsb"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532583.orhywt", "id": "np0005532583.orhywt"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: Manager daemon np0005532584.naxwxy is now available
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/mirror_snapshot_schedule"} : dispatch
Nov 23 09:50:51 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/trash_purge_schedule"} : dispatch
Nov 23 09:50:52 np0005532585.localdomain systemd[1]: tmp-crun.0L7NXx.mount: Deactivated successfully.
Nov 23 09:50:52 np0005532585.localdomain podman[296071]: 2025-11-23 09:50:52.520401497 +0000 UTC m=+0.098439888 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.component=rhceph-container, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:50:52 np0005532585.localdomain podman[296071]: 2025-11-23 09:50:52.648027855 +0000 UTC m=+0.226066246 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, ceph=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph)
Nov 23 09:50:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:50:52 np0005532585.localdomain podman[296140]: 2025-11-23 09:50:52.977455468 +0000 UTC m=+0.072769096 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:50:53 np0005532585.localdomain podman[296140]: 2025-11-23 09:50:53.064526285 +0000 UTC m=+0.159840003 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:50:53 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:50:53 np0005532585.localdomain ceph-mon[289735]: mgrmap e26: np0005532584.naxwxy(active, since 1.05285s), standbys: np0005532585.gzafiw, np0005532586.thmvqb, np0005532583.orhywt
Nov 23 09:50:53 np0005532585.localdomain ceph-mon[289735]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:53 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:53 np0005532585.localdomain sudo[295980]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:53 np0005532585.localdomain sudo[296210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:53 np0005532585.localdomain sudo[296210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:53 np0005532585.localdomain sudo[296210]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:53 np0005532585.localdomain sudo[296228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:50:53 np0005532585.localdomain sudo[296228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:54 np0005532585.localdomain sudo[296228]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Bus STARTING
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Serving on https://172.18.0.106:7150
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Client ('172.18.0.106', 58208) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Serving on http://172.18.0.106:8765
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Bus STARTED
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: Cluster is now healthy
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:54 np0005532585.localdomain sudo[296278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:50:54 np0005532585.localdomain sudo[296278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:54 np0005532585.localdomain sudo[296278]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:54 np0005532585.localdomain sudo[296296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:50:54 np0005532585.localdomain sudo[296296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:54 np0005532585.localdomain sudo[296296]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:54 np0005532585.localdomain sudo[296334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:50:54 np0005532585.localdomain sudo[296334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:54 np0005532585.localdomain sudo[296334]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:50:55 np0005532585.localdomain sudo[296352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296352]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:55.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:55.117 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:50:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:55.117 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:50:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:55.117 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:55 np0005532585.localdomain sudo[296370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:55 np0005532585.localdomain sudo[296370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:55.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:50:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:50:55.159 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:50:55 np0005532585.localdomain sudo[296370]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:55 np0005532585.localdomain sudo[296388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296388]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:55 np0005532585.localdomain sudo[296406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296406]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: mgrmap e27: np0005532584.naxwxy(active, since 3s), standbys: np0005532585.gzafiw, np0005532586.thmvqb, np0005532583.orhywt
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:50:55 np0005532585.localdomain ceph-mon[289735]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:55 np0005532585.localdomain sudo[296440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:55 np0005532585.localdomain sudo[296440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296440]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:50:55 np0005532585.localdomain sudo[296458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296458]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:50:55 np0005532585.localdomain sudo[296476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296476]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:55 np0005532585.localdomain sudo[296494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296494]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:55 np0005532585.localdomain sudo[296512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296512]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:55 np0005532585.localdomain sudo[296530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296530]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:55 np0005532585.localdomain sudo[296548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296548]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:55 np0005532585.localdomain sudo[296566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:55 np0005532585.localdomain sudo[296566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:55 np0005532585.localdomain sudo[296566]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:56 np0005532585.localdomain sudo[296600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296600]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:50:56 np0005532585.localdomain sudo[296618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296618]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:56 np0005532585.localdomain sudo[296636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296636]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:50:56 np0005532585.localdomain sudo[296654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296654]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:50:56 np0005532585.localdomain sudo[296672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296672]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:56 np0005532585.localdomain sudo[296690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296690]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:56 np0005532585.localdomain sudo[296708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296708]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:56 np0005532585.localdomain sudo[296726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:56 np0005532585.localdomain sudo[296760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296760]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:50:56 np0005532585.localdomain sudo[296778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296778]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain sudo[296796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:56 np0005532585.localdomain sudo[296796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296796]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:56 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:50:56 np0005532585.localdomain ceph-mon[289735]: Standby manager daemon np0005532582.gilwrz started
Nov 23 09:50:56 np0005532585.localdomain sudo[296814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:56 np0005532585.localdomain sudo[296814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:56 np0005532585.localdomain sudo[296814]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:50:57 np0005532585.localdomain sudo[296832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296832]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:57 np0005532585.localdomain sudo[296850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296850]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:50:57 np0005532585.localdomain sudo[296868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296868]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:57 np0005532585.localdomain sudo[296886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296886]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:57 np0005532585.localdomain sudo[296920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296920]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:50:57 np0005532585.localdomain sudo[296938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296938]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain sudo[296956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain sudo[296956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:57 np0005532585.localdomain sudo[296956]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: mgrmap e28: np0005532584.naxwxy(active, since 5s), standbys: np0005532585.gzafiw, np0005532586.thmvqb, np0005532583.orhywt, np0005532582.gilwrz
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532582.gilwrz", "id": "np0005532582.gilwrz"} : dispatch
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:50:57 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:50:58 np0005532585.localdomain sudo[296974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:50:58 np0005532585.localdomain sudo[296974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:50:58 np0005532585.localdomain sudo[296974]: pam_unix(sudo:session): session closed for user root
Nov 23 09:50:58 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:50:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:50:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:50:58 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:50:58 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532582 (monmap changed)...
Nov 23 09:50:58 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532582 on np0005532582.localdomain
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:50:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:50:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:51:00 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:00.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:00.161 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/970189041' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.32:0/970189041' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:51:01 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:51:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:51:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:51:02 np0005532585.localdomain systemd[1]: tmp-crun.kVJ3Zd.mount: Deactivated successfully.
Nov 23 09:51:02 np0005532585.localdomain podman[296994]: 2025-11-23 09:51:02.027551929 +0000 UTC m=+0.076574553 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., version=9.6)
Nov 23 09:51:02 np0005532585.localdomain podman[296994]: 2025-11-23 09:51:02.06515066 +0000 UTC m=+0.114173334 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 23 09:51:02 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:51:02 np0005532585.localdomain podman[296992]: 2025-11-23 09:51:02.067305886 +0000 UTC m=+0.118692273 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532586 (monmap changed)...
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:51:02 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:02 np0005532585.localdomain podman[296992]: 2025-11-23 09:51:02.152310239 +0000 UTC m=+0.203696636 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 09:51:02 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:51:02 np0005532585.localdomain podman[296993]: 2025-11-23 09:51:02.122720676 +0000 UTC m=+0.173464573 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:51:02 np0005532585.localdomain podman[296993]: 2025-11-23 09:51:02.202088524 +0000 UTC m=+0.252832371 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 23 09:51:02 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:51:02 np0005532585.localdomain sudo[297055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:51:02 np0005532585.localdomain sudo[297055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:02 np0005532585.localdomain sudo[297055]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:03 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:51:03 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:03.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:04 np0005532585.localdomain ceph-mon[289735]: from='client.34343 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:51:04 np0005532585.localdomain ceph-mon[289735]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.206 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.207 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.207 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:51:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:05.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:51:05 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/1484728382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:05 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/131716085' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:05 np0005532585.localdomain ceph-mon[289735]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:51:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:06.057 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:51:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:06.057 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:51:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:06.058 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:51:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:06.058 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:51:06 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.108:0/159818063' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:06 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.106:0/1313297216' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:06 np0005532585.localdomain ceph-mon[289735]: from='client.27196 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532582", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.064 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.092 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.094 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.115 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:51:07 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@3(peon) e10  my rank is now 2 (was 3)
Nov 23 09:51:07 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Nov 23 09:51:07 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Nov 23 09:51:07 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: paxos.2).electionLogic(40) init, last seen epoch 40
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:51:07 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.556 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.621 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.622 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.776 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.776 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11693MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.777 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.777 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.852 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.852 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.853 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:51:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:07.895 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:51:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:51:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:51:09 np0005532585.localdomain podman[297106]: 2025-11-23 09:51:09.015070857 +0000 UTC m=+0.069099822 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:51:09 np0005532585.localdomain podman[297107]: 2025-11-23 09:51:09.029381099 +0000 UTC m=+0.076424809 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:51:09 np0005532585.localdomain podman[297107]: 2025-11-23 09:51:09.040161571 +0000 UTC m=+0.087205271 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:51:09 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:51:09 np0005532585.localdomain podman[297106]: 2025-11-23 09:51:09.054338368 +0000 UTC m=+0.108367363 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:51:09 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:51:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:51:09.287 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:51:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:51:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:51:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:51:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: from='client.27206 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532582"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: Remove daemons mon.np0005532582
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: Safe to remove mon.np0005532582: new quorum should be ['np0005532583', 'np0005532586', 'np0005532585', 'np0005532584'] (from ['np0005532583', 'np0005532586', 'np0005532585', 'np0005532584'])
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: Removing monitor np0005532582 from monmap...
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: Removing daemon mon.np0005532582 from np0005532582.localdomain -- ports []
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532584 calling monitor election
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: from='client.34373 172.18.0.107:0/920168224' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3)
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: monmap epoch 10
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: last_changed 2025-11-23T09:51:07.282003+0000
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: min_mon_release 18 (reef)
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: election_strategy: 1
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005532585
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: osdmap e84: 6 total, 6 up, 6 in
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mgrmap e28: np0005532584.naxwxy(active, since 18s), standbys: np0005532585.gzafiw, np0005532586.thmvqb, np0005532583.orhywt, np0005532582.gilwrz
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: overall HEALTH_OK
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:51:09 np0005532585.localdomain ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1036816168' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.775 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.880s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.783 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.808 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.811 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.811 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.932 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.932 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.956 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.956 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:09.956 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:51:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:10.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:10.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:10.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:51:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:10.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:10.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:10.251 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:10 np0005532585.localdomain ceph-mon[289735]: from='client.? 172.18.0.107:0/1036816168' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:51:10 np0005532585.localdomain sudo[297158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:51:10 np0005532585.localdomain sudo[297158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:10 np0005532585.localdomain sudo[297158]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:10 np0005532585.localdomain sudo[297176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:51:10 np0005532585.localdomain sudo[297176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:10 np0005532585.localdomain sudo[297176]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:10 np0005532585.localdomain sudo[297194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:10 np0005532585.localdomain sudo[297194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:10 np0005532585.localdomain sudo[297194]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.805 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.806 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.807 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.821 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.822 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69a573f0-6b66-4e6b-aaa1-8ff41655e050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.807355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0e95f5a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': 'e6a50e1018370596dedffe2c113df05fe7208268793dfef62a4129a91249c8e5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.807355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0e9777e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '1dcee196c66de7a06e246a5b2ee1574f4c1de7b49ea619097a3937abc7bf4ce7'}]}, 'timestamp': '2025-11-23 09:51:10.822696', '_unique_id': '7986bf046f74410389bafba07e51a27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.826 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.826 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.830 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '601f3146-abd9-4c30-a6ff-e604269168b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.826658', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0eabe54-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '2d140033d85caac5eeda3009d8da6a7ebf3cc9a1a4e6be8c80add60d3c4620d8'}]}, 'timestamp': '2025-11-23 09:51:10.831050', '_unique_id': '1e186175852348c6b393bb6dfb97e480'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.833 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49d9d6eb-ff91-479e-b86d-e2008edc835c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.834138', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0efc1ce-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '789b10708935340e399dd8b339824807c5d50cc339d35d5c9a069be8ea592dd3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.834138', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0efd830-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'e6501661dc2c62852b6eb01e3f40f317bdc046ffa477c9e7e61bf17390fe501b'}]}, 'timestamp': '2025-11-23 09:51:10.864446', '_unique_id': '50cf2f7612cc4e4487dd264676fa5e22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain sudo[297212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:10 np0005532585.localdomain sudo[297212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:10 np0005532585.localdomain sudo[297212]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.887 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29d0cf6d-2c5f-49ae-9611-21d712cd0824', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:51:10.867554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f0f3673e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.064666941, 'message_signature': '5fd6a625a714766fb5af35958e01826ed84f1f66d715d703051d9f30272046c1'}]}, 'timestamp': '2025-11-23 09:51:10.887772', '_unique_id': '427437a78f2c4ba89f4226a2538b8217'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.890 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d3daf3d-02d0-4c94-9be7-9d09469f96e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.890500', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f3e5ba-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '3fd6da6fe57d1c2c205c5b41cfe50687265e5f574b8aa6c3a0542b8ecee1a28b'}]}, 'timestamp': '2025-11-23 09:51:10.891023', '_unique_id': 'f69ae16018224fb2b8000299f694c9e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5356448-ed75-4964-8d49-ffe452c81782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.893471', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f45982-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': 'a537bdf488741e5b91e5a9bdcdcd1298329f89e6edaa0e37edd66bb2c552e1ff'}]}, 'timestamp': '2025-11-23 09:51:10.893984', '_unique_id': '66e6e1302a014f9984d90b9e211c47b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.896 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.896 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65ab2133-5371-48ee-b85c-2e0c058ef6bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.896183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f4c4da-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'b584ea59c482d52b152a8dc7e4301b591172990599571d9f6885fb80ba3e5a28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.896183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f4d8da-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '856ee90b8dc8ecaef7626af316036de1c8638559e519919d37c451d3a0a5189d'}]}, 'timestamp': '2025-11-23 09:51:10.897249', '_unique_id': 'fe8a8f05fba4454793fb4d2a281e7767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69a42a68-b7c7-4dbe-953b-ad5d39a7221b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.899733', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f54f7c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': 'eba635c764846702664af2ef3f4456754261356fa082ca05b6ee245f85834123'}]}, 'timestamp': '2025-11-23 09:51:10.900245', '_unique_id': '38fd3d0b2cb743f0bf6d75f4c220bdc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '694340f5-9741-4816-a880-29e01367cf19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.902654', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f5c29a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '6f8acb5223d71c3aa7e3c7941c262417a32a2923bf6362e5a67aae096f77cae3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.902654', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f5d3de-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'd933e89c9767edf91c786bf178d9a5a9db4e09c3823ffb87f8c8e67fdac743df'}]}, 'timestamp': '2025-11-23 09:51:10.903603', '_unique_id': 'ebf351737b17480b844e05173bd963e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '747aa262-5fff-4914-8617-1f217c61f493', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.905854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f63e3c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '3a63689698f9ce479e5479887afcefaa7cbde51d5b6aba5f5d350d16db12301c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.905854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f65070-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '1e28cc3e66e5cca0fcb7d542fb5884412fbc3725ebe63e7ee2755b64b5f88ac6'}]}, 'timestamp': '2025-11-23 09:51:10.906795', '_unique_id': 'b7d9b2d155eb4457906c3b58c234b09e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '853c0b40-3b26-45c7-811c-6e6658257e4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.909177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f6c0b4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': 'a368c7833b3363dfee606bd9b78dca7f1feb4fa43b23b623586421307fa130b9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.909177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f6d2f2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '9b6e747a5f12ee6cb71dca749bd948efbe9f12fb3a1b719a3e9ab8351aa4f158'}]}, 'timestamp': '2025-11-23 09:51:10.910137', '_unique_id': '7bdcd83a05f442b7b5e6b37204f2f31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '334c423a-9315-49a9-8b63-140f9866ef52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.912736', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f74bec-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '8569f6e62fac87a2ca45ff54bd75a1d1b36174c7bc8480a845851847e6541b32'}]}, 'timestamp': '2025-11-23 09:51:10.913263', '_unique_id': '8d4cc1c93f5748f780d88cfe33040503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82d07b94-ed25-40fc-826a-cbe9c22234c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.915580', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f7ba28-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'c28477c3c047ff47ab1e446b08a83f7e6db2f025452158fc48f5ca76f0f7d9db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.915580', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f7d076-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'b8bb84872df4be6d64efb4d777e0ae0b87074f70cfbbeb3270972e7fefc10e8c'}]}, 'timestamp': '2025-11-23 09:51:10.916628', '_unique_id': '533c7cf35ceb49a0ba9d36b432359b2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 13490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eda64f0c-ba1f-4145-ba95-2a75926d0de0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13490000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:51:10.918052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f0f81608-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.064666941, 'message_signature': '2898c540613a23299f8a4e9d1d756a647616f0fccc667c191966b4e922575cc1'}]}, 'timestamp': '2025-11-23 09:51:10.918333', '_unique_id': '0bd755fdf3084fa0b607d77669d655c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5636759-62cb-45f6-9b44-33560dd6dde6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.919652', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f8547e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': 'a99d45e11a7d3ce264ac59c9f2198684d5d2e8594d2476a83c4cc579437fb17a'}]}, 'timestamp': '2025-11-23 09:51:10.919963', '_unique_id': '58af22b5b93c44f1bac61e5a4e969f5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a98e950-7e91-4287-85c9-f61cbdeb6147', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.921401', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f89916-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '5cb5fef50b3b39d5b09ec3b7de4bb9d67349a0c9e3c172ad8934308e2bfa3bf5'}]}, 'timestamp': '2025-11-23 09:51:10.921699', '_unique_id': '4e3596895c594787b17852b9bc6ca590'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fce0e00e-aaee-4404-9832-b73868dbcd48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.923160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f8dd90-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '1cd41788db4d2934381ac29e6a36403b52fcfbd3cd4e15e28fa4156682182869'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.923160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f8e8ee-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '27d1f57f8a4fef1e9e1119d9d98a97d50fee9b67e44417839f11723cbed93b29'}]}, 'timestamp': '2025-11-23 09:51:10.923727', '_unique_id': '329c3c20a5224545863157faf6e31f70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '262abf73-2b9b-46e3-ae77-9da8b3c7d347', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.925127', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f92a7a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '4210aff083d9660d31edfeb4552d779cbff2cbe6804337f134714eec3cbe3686'}]}, 'timestamp': '2025-11-23 09:51:10.925420', '_unique_id': '409135656b394b72b595640768bb073b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b046b14-a0cd-43fb-891d-0fe024ee23d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.927242', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f97d18-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'd7680ee9d4825b751f0ce32014eefb8d006370904ff9dde678f792f95587aaeb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.927242', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f98916-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '8fcd23080fe7bcb6af579d930944df7ba8b7efbf85f727aaf3cc82dc39efbd6b'}]}, 'timestamp': '2025-11-23 09:51:10.927828', '_unique_id': 'b182b52deb2c40c38ad006887e5556e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0040a4c1-897f-4bd4-b0ed-1ebff5575028', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.929273', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f9cca0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '7d6ed2c09aa486defc547aa4c1558ddde6c6aa6bee60ce7f4d408efd13bbecf5'}]}, 'timestamp': '2025-11-23 09:51:10.929575', '_unique_id': '8f02a1b991a04c888b810b4e7242ed4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf86e683-1b41-4e52-8293-902dbd2a0446', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.931010', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0fa1048-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '31efdb954757a95d8ca930947272468951e17ee11ab52e2015145a09e3ff0f67'}]}, 'timestamp': '2025-11-23 09:51:10.931304', '_unique_id': '70f03cf8dd9646fb8b1c599138f23d75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:51:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:51:10 np0005532585.localdomain sudo[297230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:10 np0005532585.localdomain sudo[297230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:10 np0005532585.localdomain sudo[297230]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:11 np0005532585.localdomain sudo[297264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297264]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:11 np0005532585.localdomain sudo[297282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297282]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain sudo[297300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297300]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:51:11 np0005532585.localdomain sudo[297318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297318]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:51:11 np0005532585.localdomain sudo[297336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297336]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:11 np0005532585.localdomain sudo[297354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297354]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: from='client.34349 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532582.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: Removed label mon from host np0005532582.localdomain
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain ceph-mon[289735]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:11 np0005532585.localdomain sudo[297372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:11 np0005532585.localdomain sudo[297372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297372]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:11 np0005532585.localdomain sudo[297390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297390]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:11 np0005532585.localdomain sudo[297424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297424]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain sudo[297442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:11 np0005532585.localdomain sudo[297442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297442]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:51:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:51:11 np0005532585.localdomain sudo[297460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:11 np0005532585.localdomain sudo[297460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:11 np0005532585.localdomain sudo[297460]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:51:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:51:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:51:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18698 "" "Go-http-client/1.1"
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='client.34386 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532582.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Removed label mgr from host np0005532582.localdomain
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:12 np0005532585.localdomain ceph-mon[289735]: Removing daemon mgr.np0005532582.gilwrz from np0005532582.localdomain -- ports [8765]
Nov 23 09:51:13 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:14 np0005532585.localdomain ceph-mon[289735]: from='client.34357 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532582.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:14 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:14 np0005532585.localdomain ceph-mon[289735]: Removed label _admin from host np0005532582.localdomain
Nov 23 09:51:14 np0005532585.localdomain ceph-mon[289735]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:14 np0005532585.localdomain sudo[297478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:51:14 np0005532585.localdomain sudo[297478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:14 np0005532585.localdomain sudo[297478]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "mgr.np0005532582.gilwrz"} : dispatch
Nov 23 09:51:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005532582.gilwrz"}]': finished
Nov 23 09:51:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:15 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:51:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:15.252 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:16 np0005532585.localdomain ceph-mon[289735]: Removing key for mgr.np0005532582.gilwrz
Nov 23 09:51:16 np0005532585.localdomain ceph-mon[289735]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:16 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:51:16 np0005532585.localdomain sudo[297496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:51:16 np0005532585.localdomain sudo[297496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:16 np0005532585.localdomain sudo[297496]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: Removing np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: Removing np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: Removing np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:51:17 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)...
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:18 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:51:19 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)...
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:51:20 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:20.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:20.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:20.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:51:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:20.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:20.256 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:20.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:21 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:51:21 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:51:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:51:21 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:51:22 np0005532585.localdomain systemd[1]: tmp-crun.ctg3Jb.mount: Deactivated successfully.
Nov 23 09:51:22 np0005532585.localdomain podman[297514]: 2025-11-23 09:51:22.02726085 +0000 UTC m=+0.083885619 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:51:22 np0005532585.localdomain podman[297514]: 2025-11-23 09:51:22.042289164 +0000 UTC m=+0.098913893 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 09:51:22 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:51:22 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:23 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:51:24 np0005532585.localdomain podman[297533]: 2025-11-23 09:51:24.017499841 +0000 UTC m=+0.073911702 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:51:24 np0005532585.localdomain podman[297533]: 2025-11-23 09:51:24.050397476 +0000 UTC m=+0.106809337 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:51:24 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:51:24 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:25.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='client.34390 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005532582.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: Added label _no_schedule to host np0005532582.localdomain
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532582.localdomain
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:51:25 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:26 np0005532585.localdomain sudo[297556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:26 np0005532585.localdomain sudo[297556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:26 np0005532585.localdomain sudo[297556]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:26 np0005532585.localdomain sudo[297574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:26 np0005532585.localdomain sudo[297574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 2025-11-23 09:51:26.874007595 +0000 UTC m=+0.074711626 container create 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64)
Nov 23 09:51:26 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:51:26 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:51:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:51:26 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:26 np0005532585.localdomain systemd[1]: Started libpod-conmon-0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b.scope.
Nov 23 09:51:26 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 2025-11-23 09:51:26.843518515 +0000 UTC m=+0.044222566 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 2025-11-23 09:51:26.943413236 +0000 UTC m=+0.144117267 container init 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, io.openshift.tags=rhceph ceph, version=7, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git)
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 2025-11-23 09:51:26.955715176 +0000 UTC m=+0.156419197 container start 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 2025-11-23 09:51:26.956114838 +0000 UTC m=+0.156818899 container attach 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, name=rhceph, release=553, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git)
Nov 23 09:51:26 np0005532585.localdomain jolly_black[297623]: 167 167
Nov 23 09:51:26 np0005532585.localdomain systemd[1]: libpod-0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b.scope: Deactivated successfully.
Nov 23 09:51:26 np0005532585.localdomain podman[297608]: 2025-11-23 09:51:26.972805613 +0000 UTC m=+0.173509604 container died 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:27 np0005532585.localdomain podman[297628]: 2025-11-23 09:51:27.055611517 +0000 UTC m=+0.071617969 container remove 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_CLEAN=True, name=rhceph, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:27 np0005532585.localdomain systemd[1]: libpod-conmon-0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b.scope: Deactivated successfully.
Nov 23 09:51:27 np0005532585.localdomain sudo[297574]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:27 np0005532585.localdomain sudo[297644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:27 np0005532585.localdomain sudo[297644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:27 np0005532585.localdomain sudo[297644]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:27 np0005532585.localdomain sudo[297662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:27 np0005532585.localdomain sudo[297662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 2025-11-23 09:51:27.65471325 +0000 UTC m=+0.052070116 container create f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:51:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc.scope.
Nov 23 09:51:27 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 2025-11-23 09:51:27.705014902 +0000 UTC m=+0.102371788 container init f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, vendor=Red Hat, Inc., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True)
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 2025-11-23 09:51:27.713985919 +0000 UTC m=+0.111342795 container start f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:51:27 np0005532585.localdomain jolly_hamilton[297712]: 167 167
Nov 23 09:51:27 np0005532585.localdomain systemd[1]: libpod-f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc.scope: Deactivated successfully.
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 2025-11-23 09:51:27.714201325 +0000 UTC m=+0.111558201 container attach f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, release=553, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 2025-11-23 09:51:27.719153088 +0000 UTC m=+0.116509994 container died f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, RELEASE=main, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55)
Nov 23 09:51:27 np0005532585.localdomain podman[297697]: 2025-11-23 09:51:27.631270978 +0000 UTC m=+0.028627834 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:27 np0005532585.localdomain podman[297717]: 2025-11-23 09:51:27.810432445 +0000 UTC m=+0.082604710 container remove f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55)
Nov 23 09:51:27 np0005532585.localdomain systemd[1]: libpod-conmon-f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc.scope: Deactivated successfully.
Nov 23 09:51:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-de045266c0b70436cf46a9f4dbb8c09f353a59902278a1e1bd6eb1745b0fd095-merged.mount: Deactivated successfully.
Nov 23 09:51:27 np0005532585.localdomain sudo[297662]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:28 np0005532585.localdomain sudo[297742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:28 np0005532585.localdomain sudo[297742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:28 np0005532585.localdomain sudo[297742]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='client.34394 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005532582.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain"} : dispatch
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain"}]': finished
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:28 np0005532585.localdomain sudo[297760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:28 np0005532585.localdomain sudo[297760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:28 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 2025-11-23 09:51:28.60927435 +0000 UTC m=+0.075881862 container create dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:51:28 np0005532585.localdomain systemd[1]: Started libpod-conmon-dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399.scope.
Nov 23 09:51:28 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 2025-11-23 09:51:28.677875046 +0000 UTC m=+0.144482528 container init dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 2025-11-23 09:51:28.578573862 +0000 UTC m=+0.045181424 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 2025-11-23 09:51:28.683409396 +0000 UTC m=+0.150016878 container start dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, name=rhceph, RELEASE=main)
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 2025-11-23 09:51:28.68351701 +0000 UTC m=+0.150124492 container attach dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:51:28 np0005532585.localdomain flamboyant_bose[297810]: 167 167
Nov 23 09:51:28 np0005532585.localdomain systemd[1]: libpod-dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399.scope: Deactivated successfully.
Nov 23 09:51:28 np0005532585.localdomain podman[297795]: 2025-11-23 09:51:28.687108631 +0000 UTC m=+0.153716173 container died dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553)
Nov 23 09:51:28 np0005532585.localdomain podman[297815]: 2025-11-23 09:51:28.787124816 +0000 UTC m=+0.086164599 container remove dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, GIT_BRANCH=main, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:51:28 np0005532585.localdomain systemd[1]: libpod-conmon-dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399.scope: Deactivated successfully.
Nov 23 09:51:28 np0005532585.localdomain systemd[1]: tmp-crun.TRFUem.mount: Deactivated successfully.
Nov 23 09:51:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-479a262231933904238feb63bcc0dc71b5c814c0bc6e19ad648ea609f66ddc6a-merged.mount: Deactivated successfully.
Nov 23 09:51:28 np0005532585.localdomain sudo[297760]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:29 np0005532585.localdomain sudo[297840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:29 np0005532585.localdomain sudo[297840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:29 np0005532585.localdomain sudo[297840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: from='client.26928 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005532582.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: Removed host np0005532582.localdomain
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:51:29 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:29 np0005532585.localdomain sudo[297858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:29 np0005532585.localdomain sudo[297858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 2025-11-23 09:51:29.596924529 +0000 UTC m=+0.066101870 container create 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: Started libpod-conmon-7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc.scope.
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 2025-11-23 09:51:29.657477797 +0000 UTC m=+0.126655138 container init 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.expose-services=)
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 2025-11-23 09:51:29.667371173 +0000 UTC m=+0.136548524 container start 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 2025-11-23 09:51:29.667971931 +0000 UTC m=+0.137149342 container attach 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:51:29 np0005532585.localdomain kind_kalam[297907]: 167 167
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: libpod-7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc.scope: Deactivated successfully.
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 2025-11-23 09:51:29.572194447 +0000 UTC m=+0.041371778 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:29 np0005532585.localdomain podman[297892]: 2025-11-23 09:51:29.67183865 +0000 UTC m=+0.141016011 container died 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 23 09:51:29 np0005532585.localdomain sshd[297913]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:51:29 np0005532585.localdomain podman[297914]: 2025-11-23 09:51:29.771699751 +0000 UTC m=+0.088338446 container remove 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.expose-services=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553)
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: libpod-conmon-7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc.scope: Deactivated successfully.
Nov 23 09:51:29 np0005532585.localdomain sshd[297913]: Accepted publickey for tripleo-admin from 192.168.122.11 port 44802 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 09:51:29 np0005532585.localdomain sudo[297858]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:29 np0005532585.localdomain systemd-logind[761]: New session 67 of user tripleo-admin.
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 23 09:51:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fe6ea3d5aaa49791904d8d0a71b2aeed2bf21b87520fe710a00ce138e37436f8-merged.mount: Deactivated successfully.
Nov 23 09:51:29 np0005532585.localdomain systemd[297934]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 23 09:51:29 np0005532585.localdomain sudo[297935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:29 np0005532585.localdomain sudo[297935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:29 np0005532585.localdomain sudo[297935]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:51:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:51:30 np0005532585.localdomain sudo[297959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:30 np0005532585.localdomain sudo[297959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Queued start job for default target Main User Target.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Created slice User Application Slice.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Reached target Paths.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Reached target Timers.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Starting D-Bus User Message Bus Socket...
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Starting Create User's Volatile Files and Directories...
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Listening on D-Bus User Message Bus Socket.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Reached target Sockets.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Finished Create User's Volatile Files and Directories.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Reached target Basic System.
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Reached target Main User Target.
Nov 23 09:51:30 np0005532585.localdomain systemd[297934]: Startup finished in 162ms.
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: Started Session 67 of User tripleo-admin.
Nov 23 09:51:30 np0005532585.localdomain sshd[297913]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:51:30 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:30.261 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 2025-11-23 09:51:30.478668022 +0000 UTC m=+0.076466970 container create 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, io.buildah.version=1.33.12)
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: Started libpod-conmon-8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704.scope.
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 2025-11-23 09:51:30.447790669 +0000 UTC m=+0.045589627 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 2025-11-23 09:51:30.577217712 +0000 UTC m=+0.175016660 container init 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 2025-11-23 09:51:30.587752797 +0000 UTC m=+0.185551745 container start 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, version=7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 2025-11-23 09:51:30.588084627 +0000 UTC m=+0.185883625 container attach 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=)
Nov 23 09:51:30 np0005532585.localdomain vigorous_chandrasekhar[298109]: 167 167
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: libpod-8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704.scope: Deactivated successfully.
Nov 23 09:51:30 np0005532585.localdomain podman[298077]: 2025-11-23 09:51:30.593012839 +0000 UTC m=+0.190811837 container died 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Nov 23 09:51:30 np0005532585.localdomain podman[298130]: 2025-11-23 09:51:30.700813226 +0000 UTC m=+0.094329852 container remove 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, distribution-scope=public, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: libpod-conmon-8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704.scope: Deactivated successfully.
Nov 23 09:51:30 np0005532585.localdomain sudo[298163]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aykzlabswpudgyxhozlatalsiyxphggx ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763891490.1944115-61749-239201748609965/AnsiballZ_lineinfile.py
Nov 23 09:51:30 np0005532585.localdomain sudo[298163]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 09:51:30 np0005532585.localdomain sudo[297959]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:30 np0005532585.localdomain python3[298167]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.104/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 09:51:30 np0005532585.localdomain sudo[298168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:30 np0005532585.localdomain sudo[298168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:30 np0005532585.localdomain sudo[298168]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0869e6f5bd71a32330c1fe830274780e13e8dc09eb2283d10474c8ffc37805ab-merged.mount: Deactivated successfully.
Nov 23 09:51:30 np0005532585.localdomain sudo[298163]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:30 np0005532585.localdomain sudo[298188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:30 np0005532585.localdomain sudo[298188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:51:31 np0005532585.localdomain ceph-mon[289735]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 2025-11-23 09:51:31.402355358 +0000 UTC m=+0.084679463 container create 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:51:31 np0005532585.localdomain systemd[1]: Started libpod-conmon-9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32.scope.
Nov 23 09:51:31 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 2025-11-23 09:51:31.455917011 +0000 UTC m=+0.138241146 container init 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc.)
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 2025-11-23 09:51:31.462932397 +0000 UTC m=+0.145256492 container start 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:51:31 np0005532585.localdomain busy_shtern[298338]: 167 167
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 2025-11-23 09:51:31.463156184 +0000 UTC m=+0.145480329 container attach 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, io.openshift.tags=rhceph ceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container)
Nov 23 09:51:31 np0005532585.localdomain systemd[1]: libpod-9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32.scope: Deactivated successfully.
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 2025-11-23 09:51:31.466193908 +0000 UTC m=+0.148518033 container died 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=553, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:31 np0005532585.localdomain podman[298295]: 2025-11-23 09:51:31.371324311 +0000 UTC m=+0.053648496 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:31 np0005532585.localdomain podman[298349]: 2025-11-23 09:51:31.531281976 +0000 UTC m=+0.061732206 container remove 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.component=rhceph-container)
Nov 23 09:51:31 np0005532585.localdomain systemd[1]: libpod-conmon-9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32.scope: Deactivated successfully.
Nov 23 09:51:31 np0005532585.localdomain sudo[298401]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnvtdeskoysswvaeginamlrfxjaeoelw ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763891491.1803975-61765-219150511570531/AnsiballZ_command.py
Nov 23 09:51:31 np0005532585.localdomain sudo[298188]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:31 np0005532585.localdomain sudo[298401]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 09:51:31 np0005532585.localdomain python3[298403]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.104/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:51:31 np0005532585.localdomain sudo[298401]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-01e53491aa1cafe287592c7c2fbb8b8ac2528f37f535f1d3c7046663c850de0c-merged.mount: Deactivated successfully.
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:51:32 np0005532585.localdomain podman[298511]: 2025-11-23 09:51:32.297905427 +0000 UTC m=+0.085681735 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:51:32 np0005532585.localdomain sudo[298585]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlokodjwhypwtrpqtrlbgqlukcwkuiey ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763891491.901428-61776-147974007797171/AnsiballZ_command.py
Nov 23 09:51:32 np0005532585.localdomain podman[298511]: 2025-11-23 09:51:32.336245569 +0000 UTC m=+0.124021827 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 09:51:32 np0005532585.localdomain sudo[298585]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:51:32 np0005532585.localdomain podman[298512]: 2025-11-23 09:51:32.35409701 +0000 UTC m=+0.138769202 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:51:32 np0005532585.localdomain podman[298512]: 2025-11-23 09:51:32.369223157 +0000 UTC m=+0.153895379 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:51:32 np0005532585.localdomain podman[298530]: 2025-11-23 09:51:32.413279106 +0000 UTC m=+0.184089910 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:51:32 np0005532585.localdomain podman[298530]: 2025-11-23 09:51:32.447347167 +0000 UTC m=+0.218157931 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:51:32 np0005532585.localdomain python3[298597]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.104 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.611693) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492611751, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2583, "num_deletes": 254, "total_data_size": 8106438, "memory_usage": 8660384, "flush_reason": "Manual Compaction"}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492631483, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4896659, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15522, "largest_seqno": 18100, "table_properties": {"data_size": 4885941, "index_size": 6583, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27662, "raw_average_key_size": 22, "raw_value_size": 4862418, "raw_average_value_size": 3969, "num_data_blocks": 286, "num_entries": 1225, "num_filter_entries": 1225, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891440, "oldest_key_time": 1763891440, "file_creation_time": 1763891492, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 19849 microseconds, and 9831 cpu microseconds.
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.631543) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4896659 bytes OK
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.631571) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.633076) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.633097) EVENT_LOG_v1 {"time_micros": 1763891492633090, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.633120) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8093887, prev total WAL file size 8093887, number of live WAL files 2.
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.634593) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4781KB)], [24(13MB)]
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492634656, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 19321946, "oldest_snapshot_seqno": -1}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10612 keys, 16121559 bytes, temperature: kUnknown
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492716461, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 16121559, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16058820, "index_size": 35118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 284598, "raw_average_key_size": 26, "raw_value_size": 15875185, "raw_average_value_size": 1495, "num_data_blocks": 1346, "num_entries": 10612, "num_filter_entries": 10612, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891492, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.716940) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 16121559 bytes
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.718580) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.7 rd, 196.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.7, 13.8 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(7.2) write-amplify(3.3) OK, records in: 11164, records dropped: 552 output_compression: NoCompression
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.718614) EVENT_LOG_v1 {"time_micros": 1763891492718600, "job": 12, "event": "compaction_finished", "compaction_time_micros": 81977, "compaction_time_cpu_micros": 49452, "output_level": 6, "num_output_files": 1, "total_output_size": 16121559, "num_input_records": 11164, "num_output_records": 10612, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492719489, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492721835, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.634526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:51:32 np0005532585.localdomain ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:51:32 np0005532585.localdomain systemd[1]: tmp-crun.vxo10U.mount: Deactivated successfully.
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:51:33 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:34 np0005532585.localdomain sudo[298585]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:34 np0005532585.localdomain ceph-mon[289735]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:51:34 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:51:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:35.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:35.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:51:35 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:36 np0005532585.localdomain sudo[298626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:51:36 np0005532585.localdomain sudo[298626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:36 np0005532585.localdomain sudo[298626]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='client.34402 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: Saving service mon spec with placement label:mon
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:36 np0005532585.localdomain ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:51:37 np0005532585.localdomain ceph-mon[289735]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:51:38 np0005532585.localdomain sshd[298644]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:51:38 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb159600 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Nov 23 09:51:38 np0005532585.localdomain ceph-mon[289735]: mon.np0005532585@2(peon) e11  removed from monmap, suicide.
Nov 23 09:51:38 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb159080 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Nov 23 09:51:38 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Nov 23 09:51:38 np0005532585.localdomain sudo[298645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:38 np0005532585.localdomain sudo[298645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:38 np0005532585.localdomain sudo[298645]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:38 np0005532585.localdomain podman[298659]: 2025-11-23 09:51:38.737587005 +0000 UTC m=+0.053740169 container died 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, ceph=True, name=rhceph, io.openshift.expose-services=, version=7, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:51:38 np0005532585.localdomain systemd[1]: tmp-crun.zznJAV.mount: Deactivated successfully.
Nov 23 09:51:38 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1-merged.mount: Deactivated successfully.
Nov 23 09:51:38 np0005532585.localdomain podman[298659]: 2025-11-23 09:51:38.777547819 +0000 UTC m=+0.093700943 container remove 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, io.buildah.version=1.33.12, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:51:38 np0005532585.localdomain sudo[298675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b --name mon.np0005532585 --force
Nov 23 09:51:38 np0005532585.localdomain sudo[298675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:51:39 np0005532585.localdomain podman[298760]: 2025-11-23 09:51:39.410081912 +0000 UTC m=+0.144365375 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:51:39 np0005532585.localdomain podman[298760]: 2025-11-23 09:51:39.420460532 +0000 UTC m=+0.154743955 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:51:39 np0005532585.localdomain podman[298758]: 2025-11-23 09:51:39.508125336 +0000 UTC m=+0.246102972 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 23 09:51:39 np0005532585.localdomain podman[298758]: 2025-11-23 09:51:39.525238545 +0000 UTC m=+0.263216181 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: tmp-crun.AMozBn.mount: Deactivated successfully.
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: ceph-46550e70-79cb-5f55-bf6d-1204b97e083b@mon.np0005532585.service: Deactivated successfully.
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: Stopped Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: ceph-46550e70-79cb-5f55-bf6d-1204b97e083b@mon.np0005532585.service: Consumed 8.465s CPU time.
Nov 23 09:51:39 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:51:40 np0005532585.localdomain systemd-rc-local-generator[298856]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:51:40 np0005532585.localdomain systemd-sysv-generator[298860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:40.265 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:40 np0005532585.localdomain sudo[298675]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:40 np0005532585.localdomain sudo[298870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:40 np0005532585.localdomain sudo[298870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:40 np0005532585.localdomain sudo[298870]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:40 np0005532585.localdomain sshd[298644]: Received disconnect from 107.172.15.139 port 54190:11: Bye Bye [preauth]
Nov 23 09:51:40 np0005532585.localdomain sshd[298644]: Disconnected from authenticating user root 107.172.15.139 port 54190 [preauth]
Nov 23 09:51:40 np0005532585.localdomain sudo[298888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:51:40 np0005532585.localdomain sudo[298888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:41 np0005532585.localdomain systemd[1]: tmp-crun.ygnu2R.mount: Deactivated successfully.
Nov 23 09:51:41 np0005532585.localdomain podman[298980]: 2025-11-23 09:51:41.371098551 +0000 UTC m=+0.096035604 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, name=rhceph, architecture=x86_64)
Nov 23 09:51:41 np0005532585.localdomain podman[298980]: 2025-11-23 09:51:41.506291292 +0000 UTC m=+0.231228345 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55)
Nov 23 09:51:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:51:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:51:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:51:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151667 "" "Go-http-client/1.1"
Nov 23 09:51:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:51:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18220 "" "Go-http-client/1.1"
Nov 23 09:51:42 np0005532585.localdomain sudo[298888]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:43 np0005532585.localdomain sudo[299078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:51:43 np0005532585.localdomain sudo[299078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:43 np0005532585.localdomain sudo[299078]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:43 np0005532585.localdomain sudo[299096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:51:43 np0005532585.localdomain sudo[299096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:43 np0005532585.localdomain sudo[299096]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299114]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:44 np0005532585.localdomain sudo[299132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299132]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299150]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299184]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299202]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:51:44 np0005532585.localdomain sudo[299220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299220]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:51:44 np0005532585.localdomain sudo[299238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299238]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:51:44 np0005532585.localdomain sudo[299256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299256]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299274]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:44 np0005532585.localdomain sudo[299292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299292]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299310]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299344]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:44 np0005532585.localdomain sudo[299362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:51:44 np0005532585.localdomain sudo[299362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:44 np0005532585.localdomain sudo[299362]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:45 np0005532585.localdomain sudo[299380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:51:45 np0005532585.localdomain sudo[299380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:45 np0005532585.localdomain sudo[299380]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:45.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:45 np0005532585.localdomain sudo[299398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:51:45 np0005532585.localdomain sudo[299398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:45 np0005532585.localdomain sudo[299398]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:50.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:50.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:50.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:51:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:50.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:50.281 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:51:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:50.282 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:51:51 np0005532585.localdomain sshd[299416]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:51:51 np0005532585.localdomain sshd[299416]: Received disconnect from 207.154.194.2 port 52168:11: Bye Bye [preauth]
Nov 23 09:51:51 np0005532585.localdomain sshd[299416]: Disconnected from authenticating user root 207.154.194.2 port 52168 [preauth]
Nov 23 09:51:51 np0005532585.localdomain sudo[299418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:51 np0005532585.localdomain sudo[299418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:51 np0005532585.localdomain sudo[299418]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:51 np0005532585.localdomain sudo[299436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:51 np0005532585.localdomain sudo[299436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: tmp-crun.13zmaT.mount: Deactivated successfully.
Nov 23 09:51:52 np0005532585.localdomain podman[299468]: 2025-11-23 09:51:52.441087047 +0000 UTC m=+0.099437898 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 23 09:51:52 np0005532585.localdomain podman[299468]: 2025-11-23 09:51:52.452302143 +0000 UTC m=+0.110652944 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 2025-11-23 09:51:52.473374404 +0000 UTC m=+0.098865402 container create ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: Started libpod-conmon-ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b.scope.
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 2025-11-23 09:51:52.436348631 +0000 UTC m=+0.061839639 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 2025-11-23 09:51:52.55367307 +0000 UTC m=+0.179164048 container init ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55)
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 2025-11-23 09:51:52.567562099 +0000 UTC m=+0.193053077 container start ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, release=553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, version=7)
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 2025-11-23 09:51:52.567858558 +0000 UTC m=+0.193349546 container attach ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, GIT_CLEAN=True, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, ceph=True, version=7, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12)
Nov 23 09:51:52 np0005532585.localdomain great_cartwright[299504]: 167 167
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: libpod-ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b.scope: Deactivated successfully.
Nov 23 09:51:52 np0005532585.localdomain podman[299477]: 2025-11-23 09:51:52.571648315 +0000 UTC m=+0.197139333 container died ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=)
Nov 23 09:51:52 np0005532585.localdomain podman[299509]: 2025-11-23 09:51:52.666390168 +0000 UTC m=+0.085973803 container remove ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:51:52 np0005532585.localdomain systemd[1]: libpod-conmon-ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b.scope: Deactivated successfully.
Nov 23 09:51:52 np0005532585.localdomain sudo[299436]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:52 np0005532585.localdomain sudo[299526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:52 np0005532585.localdomain sudo[299526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:52 np0005532585.localdomain sudo[299526]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:52 np0005532585.localdomain sudo[299544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:52 np0005532585.localdomain sudo[299544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 
Nov 23 09:51:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-23ae74c38d711b8b9a8ca0a30471e2bc5566d22040d2eaf932254cefbb30d7b4-merged.mount: Deactivated successfully.
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 2025-11-23 09:51:53.428224941 +0000 UTC m=+0.082959610 container create 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_BRANCH=main, RELEASE=main, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:53 np0005532585.localdomain systemd[1]: Started libpod-conmon-2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7.scope.
Nov 23 09:51:53 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 2025-11-23 09:51:53.390860498 +0000 UTC m=+0.045595247 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 2025-11-23 09:51:53.49269341 +0000 UTC m=+0.147428079 container init 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 2025-11-23 09:51:53.503806592 +0000 UTC m=+0.158541261 container start 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=)
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 2025-11-23 09:51:53.50406723 +0000 UTC m=+0.158801899 container attach 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, version=7, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:53 np0005532585.localdomain sleepy_goldstine[299594]: 167 167
Nov 23 09:51:53 np0005532585.localdomain systemd[1]: libpod-2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7.scope: Deactivated successfully.
Nov 23 09:51:53 np0005532585.localdomain podman[299579]: 2025-11-23 09:51:53.507367693 +0000 UTC m=+0.162102392 container died 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Nov 23 09:51:53 np0005532585.localdomain podman[299599]: 2025-11-23 09:51:53.600190447 +0000 UTC m=+0.085439458 container remove 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, release=553, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, distribution-scope=public, GIT_BRANCH=main)
Nov 23 09:51:53 np0005532585.localdomain systemd[1]: libpod-conmon-2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7.scope: Deactivated successfully.
Nov 23 09:51:53 np0005532585.localdomain sudo[299544]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:53 np0005532585.localdomain sudo[299624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:53 np0005532585.localdomain sudo[299624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:53 np0005532585.localdomain sudo[299624]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:53 np0005532585.localdomain sudo[299642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:53 np0005532585.localdomain sudo[299642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:51:54 np0005532585.localdomain podman[299660]: 2025-11-23 09:51:54.275723847 +0000 UTC m=+0.083065884 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:51:54 np0005532585.localdomain podman[299660]: 2025-11-23 09:51:54.291269616 +0000 UTC m=+0.098611603 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: tmp-crun.KnNkxO.mount: Deactivated successfully.
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8f101ae6276431bad24401ca6c7eed7501adeb83fe683350b44a57b67c6504b2-merged.mount: Deactivated successfully.
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 2025-11-23 09:51:54.427142608 +0000 UTC m=+0.078493702 container create c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public)
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: Started libpod-conmon-c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4.scope.
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 2025-11-23 09:51:54.394575193 +0000 UTC m=+0.045926357 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 2025-11-23 09:51:54.506327181 +0000 UTC m=+0.157678275 container init c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, vcs-type=git, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55)
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 2025-11-23 09:51:54.518613771 +0000 UTC m=+0.169964865 container start c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, com.redhat.component=rhceph-container, release=553, architecture=x86_64, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 2025-11-23 09:51:54.518884559 +0000 UTC m=+0.170235673 container attach c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., release=553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True)
Nov 23 09:51:54 np0005532585.localdomain clever_mcclintock[299714]: 167 167
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: libpod-c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4.scope: Deactivated successfully.
Nov 23 09:51:54 np0005532585.localdomain podman[299699]: 2025-11-23 09:51:54.521593873 +0000 UTC m=+0.172944967 container died c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph)
Nov 23 09:51:54 np0005532585.localdomain podman[299719]: 2025-11-23 09:51:54.618392908 +0000 UTC m=+0.084296131 container remove c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, GIT_CLEAN=True, name=rhceph, architecture=x86_64, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553)
Nov 23 09:51:54 np0005532585.localdomain systemd[1]: libpod-conmon-c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4.scope: Deactivated successfully.
Nov 23 09:51:54 np0005532585.localdomain sudo[299642]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:54 np0005532585.localdomain sudo[299743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:54 np0005532585.localdomain sudo[299743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:54 np0005532585.localdomain sudo[299743]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:54 np0005532585.localdomain sudo[299761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:54 np0005532585.localdomain sudo[299762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:55 np0005532585.localdomain sudo[299761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:55 np0005532585.localdomain sudo[299762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:55 np0005532585.localdomain sudo[299761]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:55 np0005532585.localdomain sudo[299797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:55 np0005532585.localdomain sudo[299797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:51:55.283 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0a686a22fcbc80824f12418328d51472afb4a2c25dbb4a8c0bba9f01058edc01-merged.mount: Deactivated successfully.
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 2025-11-23 09:51:55.423612151 +0000 UTC m=+0.067848995 container create 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64)
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: Started libpod-conmon-65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593.scope.
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 2025-11-23 09:51:55.482393774 +0000 UTC m=+0.126630648 container init 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12)
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 2025-11-23 09:51:55.495192828 +0000 UTC m=+0.139429702 container start 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph)
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 2025-11-23 09:51:55.495514718 +0000 UTC m=+0.139751592 container attach 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, version=7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Nov 23 09:51:55 np0005532585.localdomain zealous_noyce[299859]: 167 167
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: libpod-65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593.scope: Deactivated successfully.
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 2025-11-23 09:51:55.398556138 +0000 UTC m=+0.042793082 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:55 np0005532585.localdomain podman[299838]: 2025-11-23 09:51:55.498072327 +0000 UTC m=+0.142309231 container died 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=)
Nov 23 09:51:55 np0005532585.localdomain podman[299864]: 2025-11-23 09:51:55.588979002 +0000 UTC m=+0.078344158 container remove 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_CLEAN=True)
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: libpod-conmon-65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593.scope: Deactivated successfully.
Nov 23 09:51:55 np0005532585.localdomain sudo[299762]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:55 np0005532585.localdomain sudo[299887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:51:55 np0005532585.localdomain sudo[299887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:55 np0005532585.localdomain sudo[299887]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:55 np0005532585.localdomain sudo[299924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:55 np0005532585.localdomain sudo[299924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 2025-11-23 09:51:55.922405249 +0000 UTC m=+0.077471042 container create ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main)
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: Started libpod-conmon-ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236.scope.
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 2025-11-23 09:51:55.977458676 +0000 UTC m=+0.132524479 container init ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, io.buildah.version=1.33.12, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph)
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 2025-11-23 09:51:55.986473155 +0000 UTC m=+0.141538948 container start ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main)
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 2025-11-23 09:51:55.986806735 +0000 UTC m=+0.141872548 container attach ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, release=553, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Nov 23 09:51:55 np0005532585.localdomain priceless_nightingale[299964]: 167 167
Nov 23 09:51:55 np0005532585.localdomain systemd[1]: libpod-ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236.scope: Deactivated successfully.
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 2025-11-23 09:51:55.990092017 +0000 UTC m=+0.145157860 container died ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, version=7, release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:51:55 np0005532585.localdomain podman[299948]: 2025-11-23 09:51:55.892287839 +0000 UTC m=+0.047353652 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:56 np0005532585.localdomain podman[299969]: 2025-11-23 09:51:56.089189183 +0000 UTC m=+0.091948077 container remove ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, version=7, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True)
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: libpod-conmon-ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236.scope: Deactivated successfully.
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 2025-11-23 09:51:56.197153295 +0000 UTC m=+0.070319491 container create 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9.scope.
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 2025-11-23 09:51:56.252194112 +0000 UTC m=+0.125360328 container init 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, Inc., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True)
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 2025-11-23 09:51:56.26022199 +0000 UTC m=+0.133388186 container start 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, GIT_CLEAN=True, ceph=True, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 2025-11-23 09:51:56.260418996 +0000 UTC m=+0.133585212 container attach 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, architecture=x86_64, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, release=553)
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 2025-11-23 09:51:56.172556666 +0000 UTC m=+0.045722892 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: libpod-8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9.scope: Deactivated successfully.
Nov 23 09:51:56 np0005532585.localdomain podman[299985]: 2025-11-23 09:51:56.359742981 +0000 UTC m=+0.232909207 container died 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: tmp-crun.7n1Igb.mount: Deactivated successfully.
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0b66821f6b0214633642870aee82d6d400b109c83764b6fa0093be26ceba0836-merged.mount: Deactivated successfully.
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e-merged.mount: Deactivated successfully.
Nov 23 09:51:56 np0005532585.localdomain podman[300036]: 2025-11-23 09:51:56.465405201 +0000 UTC m=+0.090573536 container remove 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: libpod-conmon-8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9.scope: Deactivated successfully.
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:51:56 np0005532585.localdomain systemd-rc-local-generator[300076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:51:56 np0005532585.localdomain systemd-sysv-generator[300080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:56 np0005532585.localdomain systemd[1]: Reloading.
Nov 23 09:51:57 np0005532585.localdomain systemd-rc-local-generator[300116]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 09:51:57 np0005532585.localdomain systemd-sysv-generator[300124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: Starting Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 09:51:57 np0005532585.localdomain podman[300181]: 
Nov 23 09:51:57 np0005532585.localdomain podman[300181]: 2025-11-23 09:51:57.671227641 +0000 UTC m=+0.078558655 container create 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: tmp-crun.lXn0m3.mount: Deactivated successfully.
Nov 23 09:51:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff)
Nov 23 09:51:57 np0005532585.localdomain podman[300181]: 2025-11-23 09:51:57.637949784 +0000 UTC m=+0.045280788 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:57 np0005532585.localdomain podman[300181]: 2025-11-23 09:51:57.737869607 +0000 UTC m=+0.145200611 container init 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, version=7, release=553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: tmp-crun.UQmXQ3.mount: Deactivated successfully.
Nov 23 09:51:57 np0005532585.localdomain podman[300181]: 2025-11-23 09:51:57.753285162 +0000 UTC m=+0.160616176 container start 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12)
Nov 23 09:51:57 np0005532585.localdomain bash[300181]: 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf
Nov 23 09:51:57 np0005532585.localdomain systemd[1]: Started Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: pidfile_write: ignore empty --pid-file
Nov 23 09:51:57 np0005532585.localdomain sudo[299797]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: load: jerasure load: lrc 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: RocksDB version: 7.9.2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Git sha 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: DB SUMMARY
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: DB Session ID:  R30MDH64VRAWCJ1C6PRG
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: CURRENT file:  CURRENT
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005532585/store.db dir, Total Num: 0, files: 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005532585/store.db: 000004.log size: 761 ; 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                         Options.error_if_exists: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.create_if_missing: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                                     Options.env: 0x56515fa7d9e0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                                Options.info_log: 0x5651615bcd20
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                              Options.statistics: (nil)
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                               Options.use_fsync: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                              Options.db_log_dir: 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                                 Options.wal_dir: 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                    Options.write_buffer_manager: 0x5651615cd540
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.unordered_write: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                               Options.row_cache: None
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                              Options.wal_filter: None
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.two_write_queues: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.wal_compression: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.atomic_flush: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.max_background_jobs: 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.max_background_compactions: -1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.max_subcompactions: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.max_total_wal_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                          Options.max_open_files: -1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:       Options.compaction_readahead_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Compression algorithms supported:
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kZSTD supported: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kXpressCompression supported: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kBZip2Compression supported: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kLZ4Compression supported: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kZlibCompression supported: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kLZ4HCCompression supported: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         kSnappyCompression supported: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:           Options.merge_operator: 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:        Options.compaction_filter: None
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5651615bc980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x5651615b9350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:        Options.write_buffer_size: 33554432
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:  Options.max_write_buffer_number: 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.compression: NoCompression
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.num_levels: 7
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.table_properties_collectors: 
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                           Options.bloom_locality: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                               Options.ttl: 2592000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                       Options.enable_blob_files: false
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                           Options.min_blob_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4d2c9233-e977-47c6-b4f9-0c301abf625f
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891517810836, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891517813616, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891517813735, "job": 1, "event": "recovery_finished"}
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5651615e0e00
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: DB pointer 0x5651616d6000
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 does not exist in monmap, will attempt to join an existing cluster
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5651615b9350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 6.6e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: using public_addr v2:172.18.0.104:0/0 -> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: starting mon.np0005532585 rank -1 at public addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] at bind addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005532585 fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(???) e0 preinit fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing) e11 sync_obtain_latest_monmap
Nov 23 09:51:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing) e11 sync_obtain_latest_monmap obtained monmap e11
Nov 23 09:51:57 np0005532585.localdomain podman[300242]: 
Nov 23 09:51:57 np0005532585.localdomain podman[300242]: 2025-11-23 09:51:57.952816429 +0000 UTC m=+0.091043311 container create 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55)
Nov 23 09:51:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd.scope.
Nov 23 09:51:58 np0005532585.localdomain podman[300242]: 2025-11-23 09:51:57.914255799 +0000 UTC m=+0.052482711 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:51:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:51:58 np0005532585.localdomain podman[300242]: 2025-11-23 09:51:58.040690989 +0000 UTC m=+0.178917861 container init 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main)
Nov 23 09:51:58 np0005532585.localdomain podman[300242]: 2025-11-23 09:51:58.053296658 +0000 UTC m=+0.191523570 container start 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, RELEASE=main)
Nov 23 09:51:58 np0005532585.localdomain podman[300242]: 2025-11-23 09:51:58.053760683 +0000 UTC m=+0.191987605 container attach 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, distribution-scope=public, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:51:58 np0005532585.localdomain eager_thompson[300256]: 167 167
Nov 23 09:51:58 np0005532585.localdomain systemd[1]: libpod-67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd.scope: Deactivated successfully.
Nov 23 09:51:58 np0005532585.localdomain podman[300242]: 2025-11-23 09:51:58.061120839 +0000 UTC m=+0.199347791 container died 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:51:58 np0005532585.localdomain podman[300261]: 2025-11-23 09:51:58.135782053 +0000 UTC m=+0.065805412 container remove 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, io.buildah.version=1.33.12)
Nov 23 09:51:58 np0005532585.localdomain systemd[1]: libpod-conmon-67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd.scope: Deactivated successfully.
Nov 23 09:51:58 np0005532585.localdomain sudo[299924]: pam_unix(sudo:session): session closed for user root
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).mds e16 new map
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        14
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-11-23T08:00:26.486221+0000
                                                           modified        2025-11-23T09:47:19.846415+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        79
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26392}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26392 members: 26392
                                                           [mds.mds.np0005532586.mfohsb{0:26392} state up:active seq 12 addr [v2:172.18.0.108:6808/2718449296,v1:172.18.0.108:6809/2718449296] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005532585.jcltnl{-1:17133} state up:standby seq 1 addr [v2:172.18.0.107:6808/563301557,v1:172.18.0.107:6809/563301557] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005532584.aoxjmw{-1:17139} state up:standby seq 1 addr [v2:172.18.0.106:6808/2261302276,v1:172.18.0.106:6809/2261302276] compat {c=[1],r=[1],i=[17ff]}]
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 3314933000852226048, adjusting msgr requires
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:51:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).paxosservice(auth 1..38) refresh upgraded, format 0 -> 3
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:51:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:51:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.286 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.289 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:00.292 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:52:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 4944 writes, 22K keys, 4944 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4944 writes, 665 syncs, 7.43 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 100 writes, 292 keys, 100 commit groups, 1.0 writes per commit group, ingest: 0.32 MB, 0.00 MB/s
                                                          Interval WAL: 100 writes, 47 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/392405253' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/392405253' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:52:02 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:52:02 np0005532585.localdomain sudo[300277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:02 np0005532585.localdomain sudo[300277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:52:02 np0005532585.localdomain sudo[300277]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:52:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:52:03 np0005532585.localdomain systemd[1]: tmp-crun.zvmW8A.mount: Deactivated successfully.
Nov 23 09:52:03 np0005532585.localdomain sudo[300308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:52:03 np0005532585.localdomain sudo[300308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:03 np0005532585.localdomain podman[300295]: 2025-11-23 09:52:03.01630571 +0000 UTC m=+0.077751750 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:52:03 np0005532585.localdomain podman[300302]: 2025-11-23 09:52:03.0341549 +0000 UTC m=+0.085382355 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:52:03 np0005532585.localdomain podman[300302]: 2025-11-23 09:52:03.069713278 +0000 UTC m=+0.120940752 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 23 09:52:03 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:52:03 np0005532585.localdomain podman[300295]: 2025-11-23 09:52:03.103008755 +0000 UTC m=+0.164454805 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 23 09:52:03 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:52:03 np0005532585.localdomain podman[300296]: 2025-11-23 09:52:03.071252856 +0000 UTC m=+0.127817025 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:52:03 np0005532585.localdomain podman[300296]: 2025-11-23 09:52:03.154237175 +0000 UTC m=+0.210801314 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 23 09:52:03 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:52:03 np0005532585.localdomain podman[300444]: 2025-11-23 09:52:03.859178113 +0000 UTC m=+0.100845783 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:52:03 np0005532585.localdomain podman[300444]: 2025-11-23 09:52:03.975492921 +0000 UTC m=+0.217160580 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Nov 23 09:52:04 np0005532585.localdomain sudo[300308]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:04 np0005532585.localdomain sudo[300563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:04 np0005532585.localdomain sudo[300563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:04 np0005532585.localdomain sudo[300563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:04 np0005532585.localdomain sudo[300581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:52:04 np0005532585.localdomain sudo[300581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 09:52:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5910 writes, 25K keys, 5910 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5910 writes, 864 syncs, 6.84 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 174 writes, 477 keys, 174 commit groups, 1.0 writes per commit group, ingest: 0.65 MB, 0.00 MB/s
                                                          Interval WAL: 174 writes, 76 syncs, 2.29 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 09:52:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:05.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:05.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:05.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:05 np0005532585.localdomain sudo[300581]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:05 np0005532585.localdomain sudo[300631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:52:05 np0005532585.localdomain sudo[300631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:05 np0005532585.localdomain sudo[300631]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2486142759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3793959948' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:52:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.443 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.444 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.445 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.445 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.780 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.797 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.798 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.798 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.799 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:06.799 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.267 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.268 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.268 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.268 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.269 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:52:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.729 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.795 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.795 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.966 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.967 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11705MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:52:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:07.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.221 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.222 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.222 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.265 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:52:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.687 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.693 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.712 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.714 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:52:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:08.715 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:52:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:52:09.289 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:52:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:52:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:52:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:52:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:52:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:09.715 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:09.715 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:52:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:52:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:52:10 np0005532585.localdomain systemd[1]: tmp-crun.L8BmlU.mount: Deactivated successfully.
Nov 23 09:52:10 np0005532585.localdomain podman[300694]: 2025-11-23 09:52:10.0294073 +0000 UTC m=+0.084701704 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:52:10 np0005532585.localdomain podman[300694]: 2025-11-23 09:52:10.044155956 +0000 UTC m=+0.099450400 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:52:10 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:52:10 np0005532585.localdomain podman[300693]: 2025-11-23 09:52:10.127802006 +0000 UTC m=+0.183858583 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:52:10 np0005532585.localdomain podman[300693]: 2025-11-23 09:52:10.168352697 +0000 UTC m=+0.224409244 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 23 09:52:10 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:52:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:10.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:10.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:10 np0005532585.localdomain sudo[300735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:52:10 np0005532585.localdomain sudo[300735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:10 np0005532585.localdomain sudo[300735]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:52:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:52:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:52:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:52:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:52:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18701 "" "Go-http-client/1.1"
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/230438963' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2283958204' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/4261675563' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2756094432' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/1611615758' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='client.26985 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: Reconfig service osd.default_drive_group
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr handle_mgr_map Activating!
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr handle_mgr_map I am now activating
Nov 23 09:52:12 np0005532585.localdomain sshd[295958]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:52:12 np0005532585.localdomain systemd[1]: session-66.scope: Deactivated successfully.
Nov 23 09:52:12 np0005532585.localdomain systemd[1]: session-66.scope: Consumed 23.454s CPU time.
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: balancer
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] Starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain systemd-logind[761]: Session 66 logged out. Waiting for processes to exit.
Nov 23 09:52:12 np0005532585.localdomain systemd-logind[761]: Removed session 66.
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] Optimize plan auto_2025-11-23_09:52:12
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Nov 23 09:52:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [cephadm WARNING root] removing stray HostCache host record np0005532582.localdomain.devices.0
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005532582.localdomain.devices.0
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: cephadm
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: crash
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: devicehealth
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: iostat
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: nfs
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: orchestrator
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [devicehealth INFO root] Starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: pg_autoscaler
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: progress
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Loading...
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f5eaa22c7f0>, <progress.module.GhostEvent object at 0x7f5eaa22c820>, <progress.module.GhostEvent object at 0x7f5eaa22cdc0>, <progress.module.GhostEvent object at 0x7f5eaa22cfd0>, <progress.module.GhostEvent object at 0x7f5eaa237040>, <progress.module.GhostEvent object at 0x7f5eaa237070>, <progress.module.GhostEvent object at 0x7f5eaa2370a0>, <progress.module.GhostEvent object at 0x7f5eaa2370d0>, <progress.module.GhostEvent object at 0x7f5eaa237100>, <progress.module.GhostEvent object at 0x7f5eaa237130>, <progress.module.GhostEvent object at 0x7f5eaa237160>, <progress.module.GhostEvent object at 0x7f5eaa237190>, <progress.module.GhostEvent object at 0x7f5eaa2371c0>, <progress.module.GhostEvent object at 0x7f5eaa2371f0>, <progress.module.GhostEvent object at 0x7f5eaa237220>, <progress.module.GhostEvent object at 0x7f5eaa237250>, <progress.module.GhostEvent object at 0x7f5eaa237280>, <progress.module.GhostEvent object at 0x7f5eaa2372b0>, <progress.module.GhostEvent object at 0x7f5eaa2372e0>, <progress.module.GhostEvent object at 0x7f5eaa237310>, <progress.module.GhostEvent object at 0x7f5eaa237340>, <progress.module.GhostEvent object at 0x7f5eaa237370>, <progress.module.GhostEvent object at 0x7f5eaa2373a0>, <progress.module.GhostEvent object at 0x7f5eaa2373d0>, <progress.module.GhostEvent object at 0x7f5eaa237400>, <progress.module.GhostEvent object at 0x7f5eaa237430>, <progress.module.GhostEvent object at 0x7f5eaa237460>, <progress.module.GhostEvent object at 0x7f5eaa237490>, <progress.module.GhostEvent object at 0x7f5eaa2374c0>, <progress.module.GhostEvent object at 0x7f5eaa2374f0>, <progress.module.GhostEvent object at 0x7f5eaa237520>, <progress.module.GhostEvent object at 0x7f5eaa237550>, <progress.module.GhostEvent object at 0x7f5eaa237580>, <progress.module.GhostEvent object at 0x7f5eaa2375b0>, <progress.module.GhostEvent object at 0x7f5eaa2375e0>, <progress.module.GhostEvent object at 0x7f5eaa237610>, <progress.module.GhostEvent object at 0x7f5eaa237640>, <progress.module.GhostEvent object at 0x7f5eaa237670>, <progress.module.GhostEvent object at 0x7f5eaa2376a0>, <progress.module.GhostEvent object at 0x7f5eaa2376d0>, <progress.module.GhostEvent object at 0x7f5eaa237700>, <progress.module.GhostEvent object at 0x7f5eaa237730>, <progress.module.GhostEvent object at 0x7f5eaa237760>, <progress.module.GhostEvent object at 0x7f5eaa237790>, <progress.module.GhostEvent object at 0x7f5eaa2377c0>, <progress.module.GhostEvent object at 0x7f5eaa2377f0>, <progress.module.GhostEvent object at 0x7f5eaa237820>, <progress.module.GhostEvent object at 0x7f5eaa237850>, <progress.module.GhostEvent object at 0x7f5eaa237880>, <progress.module.GhostEvent object at 0x7f5eaa2378b0>] historic events
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] _maybe_adjust
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Loaded OSDMap, ready.
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] recovery thread starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] starting setup
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: rbd_support
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: restful
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [restful INFO root] server_addr: :: server_port: 8003
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: status
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: telemetry
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [restful WARNING root] server not running: no certificate configured
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: mgr load Constructed class from module: volumes
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.916+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.917+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.917+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.917+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] PerfHandler: starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_task_task: vms, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_task_task: volumes, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_task_task: images, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_task_task: backups, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] TaskHandler: starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Nov 23 09:52:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] setup complete
Nov 23 09:52:13 np0005532585.localdomain sshd[300894]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:52:13 np0005532585.localdomain sshd[300894]: Accepted publickey for ceph-admin from 192.168.122.107 port 44862 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 09:52:13 np0005532585.localdomain systemd-logind[761]: New session 69 of user ceph-admin.
Nov 23 09:52:13 np0005532585.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Nov 23 09:52:13 np0005532585.localdomain sshd[300894]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 09:52:13 np0005532585.localdomain sudo[300898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:13 np0005532585.localdomain sudo[300898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:13 np0005532585.localdomain sudo[300898]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:13 np0005532585.localdomain sudo[300916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:52:13 np0005532585.localdomain sudo[300916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:13 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Bus STARTING
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Bus STARTING
Nov 23 09:52:14 np0005532585.localdomain podman[301015]: 2025-11-23 09:52:14.112464576 +0000 UTC m=+0.079393820 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=553, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Serving on http://172.18.0.107:8765
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Serving on http://172.18.0.107:8765
Nov 23 09:52:14 np0005532585.localdomain podman[301015]: 2025-11-23 09:52:14.217063602 +0000 UTC m=+0.183992856 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, release=553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Serving on https://172.18.0.107:7150
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Serving on https://172.18.0.107:7150
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Client ('172.18.0.107', 35764) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Bus STARTED
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Client ('172.18.0.107', 35764) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Bus STARTED
Nov 23 09:52:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:14 np0005532585.localdomain sudo[300916]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:14 np0005532585.localdomain sudo[301148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:14 np0005532585.localdomain sudo[301148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:14 np0005532585.localdomain sudo[301148]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:15 np0005532585.localdomain ceph-mgr[288287]: [devicehealth INFO root] Check health
Nov 23 09:52:15 np0005532585.localdomain sudo[301175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:52:15 np0005532585.localdomain sudo[301175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:15.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:15.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:15 np0005532585.localdomain sudo[301175]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:15 np0005532585.localdomain sudo[301225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:15 np0005532585.localdomain sudo[301225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:15 np0005532585.localdomain sudo[301225]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:15 np0005532585.localdomain sudo[301243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:52:15 np0005532585.localdomain sudo[301243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:16 np0005532585.localdomain sudo[301243]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e85 e85: 6 total, 6 up, 6 in
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/885747258' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: Activating manager daemon np0005532585.gzafiw
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: mgrmap e29: np0005532585.gzafiw(active, starting, since 0.0443887s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532582.gilwrz
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mds metadata", "who": "mds.np0005532585.jcltnl"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mds metadata", "who": "mds.np0005532584.aoxjmw"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mds metadata", "who": "mds.np0005532586.mfohsb"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr metadata", "who": "np0005532583.orhywt", "id": "np0005532583.orhywt"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr metadata", "who": "np0005532582.gilwrz", "id": "np0005532582.gilwrz"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mds metadata"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: Manager daemon np0005532585.gzafiw is now available
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"}]': finished
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"}]': finished
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532585.gzafiw/mirror_snapshot_schedule"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532585.gzafiw/trash_purge_schedule"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: removing stray HostCache host record np0005532582.localdomain.devices.0
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: mgrmap e30: np0005532585.gzafiw(active, since 1.06726s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532582.gilwrz
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Bus STARTING
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Serving on http://172.18.0.107:8765
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Serving on https://172.18.0.107:7150
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Client ('172.18.0.107', 35764) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Bus STARTED
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: mgrmap e31: np0005532585.gzafiw(active, since 2s), standbys: np0005532586.thmvqb, np0005532583.orhywt
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:16 np0005532585.localdomain sudo[301279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:52:16 np0005532585.localdomain sudo[301279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:16 np0005532585.localdomain sudo[301279]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:16 np0005532585.localdomain sudo[301297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:52:16 np0005532585.localdomain sudo[301297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:16 np0005532585.localdomain sudo[301297]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:16 np0005532585.localdomain sudo[301315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:16 np0005532585.localdomain sudo[301315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:16 np0005532585.localdomain sudo[301315]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:16 np0005532585.localdomain sudo[301333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:16 np0005532585.localdomain sudo[301333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:16 np0005532585.localdomain sudo[301333]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301351]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301385]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301403]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain sudo[301421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301421]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:17 np0005532585.localdomain sudo[301439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:17 np0005532585.localdomain sudo[301439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301439]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:17 np0005532585.localdomain sudo[301457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301457]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mgr.np0005532584.naxwxy 172.18.0.106:0/4210916137; not ready for session (expect reconnect)
Nov 23 09:52:17 np0005532585.localdomain sudo[301475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301475]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:17 np0005532585.localdomain sudo[301493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301493]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301511]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301545]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:17 np0005532585.localdomain sudo[301563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:17 np0005532585.localdomain sudo[301563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:17 np0005532585.localdomain sudo[301563]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain sudo[301581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:18 np0005532585.localdomain sudo[301581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301581]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain sudo[301599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:52:18 np0005532585.localdomain sudo[301599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301599]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain sudo[301617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:52:18 np0005532585.localdomain sudo[301617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301617]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain sudo[301635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:52:18 np0005532585.localdomain sudo[301635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301635]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain sudo[301653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:18 np0005532585.localdomain sudo[301653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301653]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:18 np0005532585.localdomain sudo[301671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:18 np0005532585.localdomain sudo[301671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301671]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain sudo[301705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:52:18 np0005532585.localdomain sudo[301705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301705]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain sudo[301723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:52:18 np0005532585.localdomain sudo[301723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301723]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain sudo[301741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain sudo[301741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301741]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain sudo[301759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:18 np0005532585.localdomain sudo[301759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301759]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:18 np0005532585.localdomain sudo[301777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:18 np0005532585.localdomain sudo[301777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301777]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:18 np0005532585.localdomain sudo[301795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:52:18 np0005532585.localdomain sudo[301795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:18 np0005532585.localdomain sudo[301795]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain sudo[301813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:19 np0005532585.localdomain sudo[301813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:19 np0005532585.localdomain sudo[301813]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain sudo[301831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:52:19 np0005532585.localdomain sudo[301831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:19 np0005532585.localdomain sudo[301831]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain sudo[301865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:52:19 np0005532585.localdomain sudo[301865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:19 np0005532585.localdomain sudo[301865]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain sudo[301883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:52:19 np0005532585.localdomain sudo[301883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:19 np0005532585.localdomain sudo[301883]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain sudo[301901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:19 np0005532585.localdomain sudo[301901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:19 np0005532585.localdomain sudo[301901]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 521307b1-a7a2-414b-b092-fbb546dc0500 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 521307b1-a7a2-414b-b092-fbb546dc0500 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 521307b1-a7a2-414b-b092-fbb546dc0500 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 09:52:19 np0005532585.localdomain sudo[301919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:52:19 np0005532585.localdomain sudo[301919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:19 np0005532585.localdomain sudo[301919]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:52:19 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:52:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:20.300 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:20.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:20.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:52:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:20.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:20.302 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:20.304 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:20 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:20 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: mgrmap e32: np0005532585.gzafiw(active, since 4s), standbys: np0005532586.thmvqb, np0005532583.orhywt
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Standby manager daemon np0005532584.naxwxy started
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 5s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 23 09:52:20 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:52:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:52:20 np0005532585.localdomain sudo[301937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:20 np0005532585.localdomain sudo[301937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:20 np0005532585.localdomain sudo[301937]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:21 np0005532585.localdomain sudo[301955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:21 np0005532585.localdomain sudo[301955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:21 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:21 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 2025-11-23 09:52:21.485597713 +0000 UTC m=+0.065055498 container create ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=553)
Nov 23 09:52:21 np0005532585.localdomain systemd[1]: Started libpod-conmon-ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697.scope.
Nov 23 09:52:21 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 2025-11-23 09:52:21.540357112 +0000 UTC m=+0.119814887 container init ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_CLEAN=True)
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 2025-11-23 09:52:21.548048639 +0000 UTC m=+0.127506384 container start ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container)
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 2025-11-23 09:52:21.548171283 +0000 UTC m=+0.127629108 container attach ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, name=rhceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64)
Nov 23 09:52:21 np0005532585.localdomain pedantic_engelbart[302004]: 167 167
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 2025-11-23 09:52:21.564797806 +0000 UTC m=+0.144255581 container died ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:52:21 np0005532585.localdomain systemd[1]: libpod-ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697.scope: Deactivated successfully.
Nov 23 09:52:21 np0005532585.localdomain podman[301989]: 2025-11-23 09:52:21.466409981 +0000 UTC m=+0.045867766 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:52:21 np0005532585.localdomain podman[302009]: 2025-11-23 09:52:21.627273143 +0000 UTC m=+0.054162102 container remove ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7)
Nov 23 09:52:21 np0005532585.localdomain systemd[1]: libpod-conmon-ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697.scope: Deactivated successfully.
Nov 23 09:52:21 np0005532585.localdomain sudo[301955]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:21 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:52:21 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:52:21 np0005532585.localdomain sudo[302033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:52:21 np0005532585.localdomain sudo[302033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:21 np0005532585.localdomain sudo[302033]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:22 np0005532585.localdomain sudo[302051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:22 np0005532585.localdomain sudo[302051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:22 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:22 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 2025-11-23 09:52:22.457657852 +0000 UTC m=+0.077853554 container create 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True)
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: Started libpod-conmon-99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d.scope.
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9752fa5865112c512d3f7a5f95ecd2cb8913bae0943175cbf5dcbefa602d9cf4-merged.mount: Deactivated successfully.
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 2025-11-23 09:52:22.42615442 +0000 UTC m=+0.046350172 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 2025-11-23 09:52:22.556843821 +0000 UTC m=+0.177039533 container init 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 2025-11-23 09:52:22.569170612 +0000 UTC m=+0.189366324 container start 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., release=553, version=7)
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 2025-11-23 09:52:22.569562444 +0000 UTC m=+0.189758206 container attach 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, version=7)
Nov 23 09:52:22 np0005532585.localdomain funny_mendeleev[302101]: 167 167
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: libpod-99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d.scope: Deactivated successfully.
Nov 23 09:52:22 np0005532585.localdomain podman[302102]: 2025-11-23 09:52:22.609791025 +0000 UTC m=+0.092845236 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:52:22 np0005532585.localdomain podman[302086]: 2025-11-23 09:52:22.622686393 +0000 UTC m=+0.242882145 container died 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7)
Nov 23 09:52:22 np0005532585.localdomain podman[302102]: 2025-11-23 09:52:22.677182924 +0000 UTC m=+0.160237095 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:52:22 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 23 09:52:22 np0005532585.localdomain podman[302117]: 2025-11-23 09:52:22.730067345 +0000 UTC m=+0.142774815 container remove 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, version=7, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Nov 23 09:52:22 np0005532585.localdomain systemd[1]: libpod-conmon-99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d.scope: Deactivated successfully.
Nov 23 09:52:22 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:52:22 np0005532585.localdomain sudo[302051]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:22 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:52:22 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:52:23 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:23 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ad075ef5d574535aa6ab2256e568bae2affa1278efe97261b9789d7ad7244932-merged.mount: Deactivated successfully.
Nov 23 09:52:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:52:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.44396 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:52:24 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:24 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:52:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:52:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:52:25 np0005532585.localdomain podman[302148]: 2025-11-23 09:52:25.0270715 +0000 UTC m=+0.080652010 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:52:25 np0005532585.localdomain podman[302148]: 2025-11-23 09:52:25.038558964 +0000 UTC m=+0.092139454 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:52:25 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:52:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 2ad7aaa5-10dd-4472-80a4-9813541370c6 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 2ad7aaa5-10dd-4472-80a4-9813541370c6 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 2ad7aaa5-10dd-4472-80a4-9813541370c6 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 09:52:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:25.306 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:25.308 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:25.308 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:52:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:25.308 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:25.337 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:25.338 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:25 np0005532585.localdomain sudo[302171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:52:25 np0005532585.localdomain sudo[302171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:25 np0005532585.localdomain sudo[302171]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:25 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:25 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:26 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:26 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:52:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.44402 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:26 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 23 09:52:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 9beaf830-fe6f-4101-965a-6cb54d294b00 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 9beaf830-fe6f-4101-965a-6cb54d294b00 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 9beaf830-fe6f-4101-965a-6cb54d294b00 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 09:52:27 np0005532585.localdomain sudo[302189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:52:27 np0005532585.localdomain sudo[302189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:27 np0005532585.localdomain sudo[302189]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:27 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='client.44396 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='client.44402 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: Saving service mon spec with placement label:mon
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532583 (monmap changed)...
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.524024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548524149, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 12105, "num_deletes": 294, "total_data_size": 21098157, "memory_usage": 22079120, "flush_reason": "Manual Compaction"}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548594790, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 19536601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 12110, "table_properties": {"data_size": 19467576, "index_size": 39159, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28805, "raw_key_size": 305052, "raw_average_key_size": 26, "raw_value_size": 19268000, "raw_average_value_size": 1673, "num_data_blocks": 1501, "num_entries": 11513, "num_filter_entries": 11513, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 1763891517, "file_creation_time": 1763891548, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 70907 microseconds, and 36020 cpu microseconds.
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.594880) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 19536601 bytes OK
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.594971) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.597460) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.597493) EVENT_LOG_v1 {"time_micros": 1763891548597482, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.597518) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 21013716, prev total WAL file size 21013716, number of live WAL files 2.
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.601868) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323634' seq:72057594037927935, type:22 .. '6B760031353238' seq:0, type:0; will stop at (end)
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(18MB) 8(1887B)]
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548602007, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 19538488, "oldest_snapshot_seqno": -1}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 11250 keys, 19533230 bytes, temperature: kUnknown
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548683464, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 19533230, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19464858, "index_size": 39151, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28165, "raw_key_size": 301020, "raw_average_key_size": 26, "raw_value_size": 19268651, "raw_average_value_size": 1712, "num_data_blocks": 1500, "num_entries": 11250, "num_filter_entries": 11250, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891548, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.683850) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 19533230 bytes
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.685711) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 239.4 rd, 239.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(18.6, 0.0 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 11518, records dropped: 268 output_compression: NoCompression
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.685743) EVENT_LOG_v1 {"time_micros": 1763891548685728, "job": 4, "event": "compaction_finished", "compaction_time_micros": 81608, "compaction_time_cpu_micros": 45489, "output_level": 6, "num_output_files": 1, "total_output_size": 19533230, "num_input_records": 11518, "num_output_records": 11250, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548689107, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548689183, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.601731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:52:29 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532586 (monmap changed)...
Nov 23 09:52:29 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532586 (monmap changed)...
Nov 23 09:52:29 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:52:29 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:52:29 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:29 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:52:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:52:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:30.339 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:30 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:30 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:30 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:52:31 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.27030 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532585", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:52:31 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:31 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:32 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:32 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532586 (monmap changed)...
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mon[300199]: from='client.27030 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532585", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:52:32 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:33 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:33 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:52:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:52:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:52:34 np0005532585.localdomain podman[302209]: 2025-11-23 09:52:34.021816395 +0000 UTC m=+0.076825752 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 09:52:34 np0005532585.localdomain podman[302209]: 2025-11-23 09:52:34.031267876 +0000 UTC m=+0.086277253 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 09:52:34 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:52:34 np0005532585.localdomain podman[302208]: 2025-11-23 09:52:34.073974024 +0000 UTC m=+0.130175897 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 23 09:52:34 np0005532585.localdomain podman[302210]: 2025-11-23 09:52:34.137631317 +0000 UTC m=+0.187186966 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41)
Nov 23 09:52:34 np0005532585.localdomain podman[302208]: 2025-11-23 09:52:34.150380851 +0000 UTC m=+0.206582784 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 23 09:52:34 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:52:34 np0005532585.localdomain podman[302210]: 2025-11-23 09:52:34.171725659 +0000 UTC m=+0.221281358 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 09:52:34 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:52:34 np0005532585.localdomain sshd[297986]: Received disconnect from 192.168.122.11 port 44802:11: disconnected by user
Nov 23 09:52:34 np0005532585.localdomain sshd[297986]: Disconnected from user tripleo-admin 192.168.122.11 port 44802
Nov 23 09:52:34 np0005532585.localdomain sshd[297913]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 23 09:52:34 np0005532585.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Nov 23 09:52:34 np0005532585.localdomain systemd[1]: session-67.scope: Consumed 1.543s CPU time.
Nov 23 09:52:34 np0005532585.localdomain systemd-logind[761]: Session 67 logged out. Waiting for processes to exit.
Nov 23 09:52:34 np0005532585.localdomain systemd-logind[761]: Removed session 67.
Nov 23 09:52:34 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:34 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:34 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:35.344 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:35.346 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:35.346 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:52:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:35.346 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:35.372 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:35.373 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:35 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:35 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:36 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:36 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:36 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:37 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:37 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:38 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:38 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:38 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:39 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:39 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 09:52:39 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.27051 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532583", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:52:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:40.374 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:40 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:40 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.562670) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560562739, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 391, "num_deletes": 254, "total_data_size": 231811, "memory_usage": 240056, "flush_reason": "Manual Compaction"}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560567549, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 230728, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12111, "largest_seqno": 12501, "table_properties": {"data_size": 228059, "index_size": 716, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7357, "raw_average_key_size": 20, "raw_value_size": 222415, "raw_average_value_size": 624, "num_data_blocks": 28, "num_entries": 356, "num_filter_entries": 356, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891552, "oldest_key_time": 1763891552, "file_creation_time": 1763891560, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 4904 microseconds, and 1295 cpu microseconds.
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/2656055149' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/1137795569' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='client.27051 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532583", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.567587) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 230728 bytes OK
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.567605) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569290) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569302) EVENT_LOG_v1 {"time_micros": 1763891560569299, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569316) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 228880, prev total WAL file size 228962, number of live WAL files 2.
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569742) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(225KB)], [15(18MB)]
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560569767, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 19763958, "oldest_snapshot_seqno": -1}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11086 keys, 16905942 bytes, temperature: kUnknown
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560635393, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16905942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16840908, "index_size": 36197, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 298202, "raw_average_key_size": 26, "raw_value_size": 16649857, "raw_average_value_size": 1501, "num_data_blocks": 1373, "num_entries": 11086, "num_filter_entries": 11086, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891560, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.639686) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16905942 bytes
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.641185) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 300.7 rd, 257.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.6 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(158.9) write-amplify(73.3) OK, records in: 11606, records dropped: 520 output_compression: NoCompression
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.641207) EVENT_LOG_v1 {"time_micros": 1763891560641197, "job": 6, "event": "compaction_finished", "compaction_time_micros": 65732, "compaction_time_cpu_micros": 24383, "output_level": 6, "num_output_files": 1, "total_output_size": 16905942, "num_input_records": 11606, "num_output_records": 11086, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560641328, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560643013, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569687) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:40 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:52:40 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:52:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:52:41 np0005532585.localdomain podman[302269]: 2025-11-23 09:52:41.042140145 +0000 UTC m=+0.096822448 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:52:41 np0005532585.localdomain podman[302269]: 2025-11-23 09:52:41.052818505 +0000 UTC m=+0.107500778 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3)
Nov 23 09:52:41 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:52:41 np0005532585.localdomain systemd[1]: tmp-crun.qxhc1i.mount: Deactivated successfully.
Nov 23 09:52:41 np0005532585.localdomain podman[302270]: 2025-11-23 09:52:41.146514055 +0000 UTC m=+0.196652798 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:52:41 np0005532585.localdomain podman[302270]: 2025-11-23 09:52:41.161238339 +0000 UTC m=+0.211377082 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:52:41 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.27057 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532583"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Remove daemons mon.np0005532583
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005532583
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584'] (from ['np0005532586', 'np0005532584'])
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584'] (from ['np0005532586', 'np0005532584'])
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005532583 from monmap...
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing monitor np0005532583 from monmap...
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/1518513680 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55eed5af7c00 0x55eed596e680 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:52:41 np0005532585.localdomain ceph-osd[31905]: --2- [v2:172.18.0.107:6800/1293390152,v1:172.18.0.107:6801/1293390152] >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55d7b9741000 0x55d7bb6a2100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/1450606455 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55eed509d800 0x55eed48db700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/487837919 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x55eed537a400 0x55eed5428b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: client.27018 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/1151827140 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x55eed51f6800 0x55eed51fc580 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/1334229557 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x55eed537a800 0x55eed5429b80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: client.27015 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:52:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:52:41 np0005532585.localdomain sudo[302311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:52:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:52:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:52:41 np0005532585.localdomain sudo[302311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:41 np0005532585.localdomain sudo[302311]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:52:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18718 "" "Go-http-client/1.1"
Nov 23 09:52:42 np0005532585.localdomain sudo[302329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:52:42 np0005532585.localdomain sudo[302329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302329]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain sudo[302347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:42 np0005532585.localdomain sudo[302347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302347]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain sudo[302365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:42 np0005532585.localdomain sudo[302365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302365]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain sudo[302383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:42 np0005532585.localdomain sudo[302383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302383]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain sudo[302417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:42 np0005532585.localdomain sudo[302417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302417]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 09:52:42 np0005532585.localdomain sudo[302435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:42 np0005532585.localdomain sudo[302435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302435]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain sudo[302453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain sudo[302453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302453]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:42 np0005532585.localdomain sudo[302471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:42 np0005532585.localdomain sudo[302471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302471]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain sudo[302489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:42 np0005532585.localdomain sudo[302489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302489]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:42 np0005532585.localdomain sudo[302507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:42 np0005532585.localdomain sudo[302507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302507]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain sudo[302525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:42 np0005532585.localdomain sudo[302525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302525]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:52:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:52:42 np0005532585.localdomain sudo[302543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:42 np0005532585.localdomain sudo[302543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:42 np0005532585.localdomain sudo[302543]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:43 np0005532585.localdomain sudo[302577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:43 np0005532585.localdomain sudo[302577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:43 np0005532585.localdomain sudo[302577]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:43 np0005532585.localdomain sudo[302595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:43 np0005532585.localdomain sudo[302595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:43 np0005532585.localdomain sudo[302595]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:43 np0005532585.localdomain sudo[302613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:43 np0005532585.localdomain sudo[302613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:43 np0005532585.localdomain sudo[302613]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:43 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:43 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Activating special unit Exit the Session...
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped target Main User Target.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped target Basic System.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped target Paths.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped target Sockets.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped target Timers.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Closed D-Bus User Message Bus Socket.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Stopped Create User's Volatile Files and Directories.
Nov 23 09:52:44 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Removed slice User Application Slice.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Reached target Shutdown.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Finished Exit the Session.
Nov 23 09:52:44 np0005532585.localdomain systemd[297934]: Reached target Exit the Session.
Nov 23 09:52:44 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 23 09:52:44 np0005532585.localdomain systemd[1]: user-1003.slice: Consumed 2.080s CPU time.
Nov 23 09:52:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@-1(probing) e13  my rank is now 2 (was -1)
Nov 23 09:52:44 np0005532585.localdomain ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:52:44 np0005532585.localdomain ceph-mon[300199]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 23 09:52:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:44 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:45.375 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:45.378 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:52:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:45.378 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:52:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:45.378 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:45.379 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:52:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:45.382 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:45 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:45 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:46 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:46 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:46 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e13 handle_auth_request failed to assign global_id
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 9e93c160-06b9-44cd-a955-a2657d2a9665 (Updating mon deployment (+1 -> 4))
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.34482 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Removed label mon from host np0005532583.localdomain
Nov 23 09:52:47 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed label mon from host np0005532583.localdomain
Nov 23 09:52:48 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:48 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:48 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:49 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:49 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Remove daemons mon.np0005532583
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584'] (from ['np0005532586', 'np0005532584'])
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Removing monitor np0005532583 from monmap...
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584 in quorum (ranks 0,1)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: monmap epoch 12
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:52:41.580241+0000
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 29s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584 in quorum (ranks 0,1)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: monmap epoch 13
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:52:42.566799+0000
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 34s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Health check failed: 1/3 mons down, quorum np0005532586,np0005532584 (MON_DOWN)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005532586,np0005532584
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005532586,np0005532584
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]:     mon.np0005532585 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Deploying daemon mon.np0005532583 on np0005532583.localdomain
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='client.34482 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Removed label mon from host np0005532583.localdomain
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e13 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e13 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mgrc update_daemon_metadata mon.np0005532585 metadata {addrs=[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005532585.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005532585.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584,np0005532585 in quorum (ranks 0,1,2)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: monmap epoch 13
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:52:42.566799+0000
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 37s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005532586,np0005532584)
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: Cluster is now healthy
Nov 23 09:52:49 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 09:52:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Nov 23 09:52:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 9e93c160-06b9-44cd-a955-a2657d2a9665 (Updating mon deployment (+1 -> 4))
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 9e93c160-06b9-44cd-a955-a2657d2a9665 (Updating mon deployment (+1 -> 4)) in 3 seconds
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 789950ff-e8d9-41e1-82ae-3a055add8a86 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 789950ff-e8d9-41e1-82ae-3a055add8a86 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 789950ff-e8d9-41e1-82ae-3a055add8a86 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 09:52:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:50.381 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 09:52:50 np0005532585.localdomain sudo[302632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:52:50 np0005532585.localdomain sudo[302632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:50 np0005532585.localdomain sudo[302632]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (2) No such file or directory
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 09:52:50 np0005532585.localdomain ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:52:50 np0005532585.localdomain ceph-mon[300199]: paxos.2).electionLogic(54) init, last seen epoch 54
Nov 23 09:52:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:50 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:51 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_report got status from non-daemon mon.np0005532585
Nov 23 09:52:51 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:51.428+0000 7f5ec5b52640 -1 mgr.server handle_report got status from non-daemon mon.np0005532585
Nov 23 09:52:51 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:51 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 09:52:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 09:52:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 09:52:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 09:52:52 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:52 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 09:52:52 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:52 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:52:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:52:53 np0005532585.localdomain podman[302650]: 2025-11-23 09:52:53.032188055 +0000 UTC m=+0.082304200 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 09:52:53 np0005532585.localdomain podman[302650]: 2025-11-23 09:52:53.041204183 +0000 UTC m=+0.091320368 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:52:53 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:52:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 09:52:53 np0005532585.localdomain sshd[302670]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:52:53 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:53 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 09:52:54 np0005532585.localdomain sshd[302670]: Received disconnect from 207.154.194.2 port 52230:11: Bye Bye [preauth]
Nov 23 09:52:54 np0005532585.localdomain sshd[302670]: Disconnected from authenticating user root 207.154.194.2 port 52230 [preauth]
Nov 23 09:52:54 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:54 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 09:52:54 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:52:55.385 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532583 calling monitor election
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584,np0005532585,np0005532583 in quorum (ranks 0,1,2,3)
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: monmap epoch 14
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:52:50.591476+0000
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 43s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:52:55 np0005532585.localdomain sudo[302672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:52:55 np0005532585.localdomain sudo[302672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:52:55 np0005532585.localdomain sudo[302672]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:55 np0005532585.localdomain sudo[302696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:52:55 np0005532585.localdomain sudo[302696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:55 np0005532585.localdomain sudo[302696]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:55 np0005532585.localdomain podman[302690]: 2025-11-23 09:52:55.907456407 +0000 UTC m=+0.087692606 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:52:55 np0005532585.localdomain podman[302690]: 2025-11-23 09:52:55.923950116 +0000 UTC m=+0.104186305 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:52:55 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:52:55 np0005532585.localdomain sudo[302724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:55 np0005532585.localdomain sudo[302724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:55 np0005532585.localdomain sudo[302724]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain sudo[302749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:56 np0005532585.localdomain sudo[302749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302749]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain sudo[302767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:56 np0005532585.localdomain sudo[302767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302767]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain sudo[302801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:56 np0005532585.localdomain sudo[302801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain sudo[302801]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e14 handle_auth_request failed to assign global_id
Nov 23 09:52:56 np0005532585.localdomain sudo[302819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:52:56 np0005532585.localdomain sudo[302819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302819]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain sudo[302837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain sudo[302837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302837]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain sudo[302855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:56 np0005532585.localdomain sudo[302855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302855]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.44437 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:56 np0005532585.localdomain sudo[302873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 09:52:56 np0005532585.localdomain sudo[302873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Removed label mgr from host np0005532583.localdomain
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005532583.localdomain
Nov 23 09:52:56 np0005532585.localdomain sudo[302873]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain sudo[302891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:56 np0005532585.localdomain sudo[302891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302891]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532583"} : dispatch
Nov 23 09:52:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:56 np0005532585.localdomain sudo[302909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:52:56 np0005532585.localdomain sudo[302909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302909]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain sudo[302927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:56 np0005532585.localdomain sudo[302927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302927]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:56 np0005532585.localdomain sudo[302961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:56 np0005532585.localdomain sudo[302961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:56 np0005532585.localdomain sudo[302961]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:57 np0005532585.localdomain sudo[302979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:52:57 np0005532585.localdomain sudo[302979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:57 np0005532585.localdomain sudo[302979]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:57 np0005532585.localdomain sudo[302997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:57 np0005532585.localdomain sudo[302997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:52:57 np0005532585.localdomain sudo[302997]: pam_unix(sudo:session): session closed for user root
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev b9da0499-3f45-4785-8af4-dab2f930c5d1 (Updating mgr deployment (-1 -> 3))
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005532583.orhywt from np0005532583.localdomain -- ports [8765]
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005532583.orhywt from np0005532583.localdomain -- ports [8765]
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_report got status from non-daemon mon.np0005532583
Nov 23 09:52:57 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:57.603+0000 7f5ec5b52640 -1 mgr.server handle_report got status from non-daemon mon.np0005532583
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1019640621 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54102 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Removed label _admin from host np0005532583.localdomain
Nov 23 09:52:57 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005532583.localdomain
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='client.44437 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: Removed label mgr from host np0005532583.localdomain
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:57 np0005532585.localdomain ceph-mon[300199]: Removing daemon mgr.np0005532583.orhywt from np0005532583.localdomain -- ports [8765]
Nov 23 09:52:58 np0005532585.localdomain sshd[303015]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:52:58 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:52:58 np0005532585.localdomain ceph-mon[300199]: from='client.54102 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:52:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:52:58 np0005532585.localdomain ceph-mon[300199]: Removed label _admin from host np0005532583.localdomain
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005532583.orhywt
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing key for mgr.np0005532583.orhywt
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev b9da0499-3f45-4785-8af4-dab2f930c5d1 (Updating mgr deployment (-1 -> 3))
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event b9da0499-3f45-4785-8af4-dab2f930c5d1 (Updating mgr deployment (-1 -> 3)) in 2 seconds
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 577fc77b-eab4-40d2-9e0b-126d92a62b09 (Updating mon deployment (-1 -> 3))
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584', 'np0005532585'] (from ['np0005532586', 'np0005532584', 'np0005532585'])
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584', 'np0005532585'] (from ['np0005532586', 'np0005532584', 'np0005532585'])
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005532583 from monmap...
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing monitor np0005532583 from monmap...
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 09:52:59 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 09:52:59 np0005532585.localdomain ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:52:59 np0005532585.localdomain ceph-mon[300199]: paxos.2).electionLogic(58) init, last seen epoch 58
Nov 23 09:52:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:52:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:52:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:52:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:52:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:52:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:52:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:53:00 np0005532585.localdomain sshd[303015]: Received disconnect from 107.172.15.139 port 56954:11: Bye Bye [preauth]
Nov 23 09:53:00 np0005532585.localdomain sshd[303015]: Disconnected from authenticating user root 107.172.15.139 port 56954 [preauth]
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/636913553' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/636913553' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:00.387 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: Removing key for mgr.np0005532583.orhywt
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584', 'np0005532585'] (from ['np0005532586', 'np0005532584', 'np0005532585'])
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: Removing monitor np0005532583 from monmap...
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584,np0005532585 in quorum (ranks 0,1,2)
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: monmap epoch 15
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:52:59.410256+0000
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 46s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/636913553' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/636913553' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:53:00 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:53:00 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:01 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 577fc77b-eab4-40d2-9e0b-126d92a62b09 (Updating mon deployment (-1 -> 3))
Nov 23 09:53:01 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 577fc77b-eab4-40d2-9e0b-126d92a62b09 (Updating mon deployment (-1 -> 3)) in 2 seconds
Nov 23 09:53:01 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 29a52c30-22cc-4956-a3b3-845bf23eb769 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:53:01 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 29a52c30-22cc-4956-a3b3-845bf23eb769 (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:53:01 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 29a52c30-22cc-4956-a3b3-845bf23eb769 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 09:53:01 np0005532585.localdomain sudo[303017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:53:01 np0005532585.localdomain sudo[303017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:01 np0005532585.localdomain sudo[303017]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:01 np0005532585.localdomain ceph-mon[300199]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020047544 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:02 np0005532585.localdomain sudo[303035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:53:02 np0005532585.localdomain sudo[303035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:02 np0005532585.localdomain sudo[303035]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:53:02 np0005532585.localdomain sudo[303053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:53:02 np0005532585.localdomain sudo[303053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:02 np0005532585.localdomain sudo[303053]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:53:02 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:53:03 np0005532585.localdomain sudo[303071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303071]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:03 np0005532585.localdomain sudo[303089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303089]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303107]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303141]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303159]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain sudo[303177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303177]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain sudo[303195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:53:03 np0005532585.localdomain sudo[303195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303195]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain sudo[303213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:53:03 np0005532585.localdomain sudo[303213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303213]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303231]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:03 np0005532585.localdomain sudo[303249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303249]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: Removing np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:03 np0005532585.localdomain sudo[303267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303267]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:03 np0005532585.localdomain sudo[303301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:03 np0005532585.localdomain sudo[303301]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:03 np0005532585.localdomain sudo[303319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:04 np0005532585.localdomain sudo[303319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:04 np0005532585.localdomain sudo[303319]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:04 np0005532585.localdomain sudo[303337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:04 np0005532585.localdomain sudo[303337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:53:04 np0005532585.localdomain sudo[303337]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:04 np0005532585.localdomain podman[303355]: 2025-11-23 09:53:04.19034269 +0000 UTC m=+0.090574585 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:53:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:04.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:04 np0005532585.localdomain podman[303355]: 2025-11-23 09:53:04.226227078 +0000 UTC m=+0.126459033 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev 24c6c7df-b5c3-4b98-be65-f132a49e64bd (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev 24c6c7df-b5c3-4b98-be65-f132a49e64bd (Updating node-proxy deployment (+4 -> 4))
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event 24c6c7df-b5c3-4b98-be65-f132a49e64bd (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 09:53:04 np0005532585.localdomain podman[303372]: 2025-11-23 09:53:04.309327081 +0000 UTC m=+0.101598045 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: tmp-crun.2XLYxt.mount: Deactivated successfully.
Nov 23 09:53:04 np0005532585.localdomain podman[303373]: 2025-11-23 09:53:04.394513299 +0000 UTC m=+0.183919225 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 09:53:04 np0005532585.localdomain podman[303372]: 2025-11-23 09:53:04.400861935 +0000 UTC m=+0.193132859 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:53:04 np0005532585.localdomain podman[303373]: 2025-11-23 09:53:04.411324668 +0000 UTC m=+0.200730564 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:53:04 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:53:04 np0005532585.localdomain sudo[303419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:53:04 np0005532585.localdomain sudo[303419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:04 np0005532585.localdomain sudo[303419]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:53:04 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:05.230 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:05.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:05 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:05 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:05 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:05 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:05 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532583 (monmap changed)...
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.500 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.501 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.501 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.502 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:53:06 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:06 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:06 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:06 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:06 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:06.986 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.001 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.002 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.003 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.003 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.003 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.004 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 09:53:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:07.020 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3407580757' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:07 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:07 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:07 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:07 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054607 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:08.016 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2726720879' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3513795904' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:08 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:08 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:08 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:08 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:08 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.241 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.241 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:53:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:53:09.291 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:53:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:53:09.291 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:53:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:53:09.292 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3226986166' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.34497 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005532583.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Added label _no_schedule to host np0005532583.localdomain
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005532583.localdomain
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532583.localdomain
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532583.localdomain
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:53:09 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1028998822' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:09 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.706 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.777 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.778 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.992 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.993 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11665MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.994 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:53:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:09.994 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.166 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.167 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.167 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.376 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.392 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1028998822' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:10 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.807 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.808 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.811 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04f4ed9c-0995-4b03-b3ea-dc7fd153d98f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.808442', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '386e83a0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '30bf39fa25edec39fdabe77d5901037814038b1c81939aabcdd801b063df3823'}]}, 'timestamp': '2025-11-23 09:53:10.812847', '_unique_id': '4622551b5cc94ac1945f250f70eb4fea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:53:10 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3035427830' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.834 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.841 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.844 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e78e6cc1-314c-4b5e-8f81-cb983af9867c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.815757', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38736eba-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'a324e1610bb2f64bc18f4bcafa9abad6db1d2276d7dc92e70f0aef1220c7aa9b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.815757', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38738422-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '0e5ffe4a9e30b10c9ca688bddfad946d57b0798b7639e0a48d41e89fc64e0b72'}]}, 'timestamp': '2025-11-23 09:53:10.845563', '_unique_id': '1e34a06bae194c149e7968b3f478603d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd873577e-aa58-4889-8c1b-31101ab82b73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.848970', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38741be4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '2884c44b5650ed78043411fceab1611b1206c7b85e63b6f18e6877475a2aee39'}]}, 'timestamp': '2025-11-23 09:53:10.849486', '_unique_id': 'ca0943d2368d4d06ac8c8d7f222f7757'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9f8b86b-1f7d-4570-9ae2-3f03e988ac02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.851729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38748796-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '2cdb915e3f9ee9ab02c446481959ea4a964226ba54fdbab5f7701111b2ba4f4f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.851729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38749812-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'd5c74cb75117085aad28392c3dbd3744881bb85b7d681605a4024dbb854fa40f'}]}, 'timestamp': '2025-11-23 09:53:10.852613', '_unique_id': '19cdce8e606340f8a9357607810e681f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.854 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.862 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.865 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:53:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:10.865 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5188e5d0-3be0-4d87-94c5-6bf5b9188149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.854837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38767ed4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'dd7e5a02baa997ce9ad71a9f9145f435ce7ae3fa488637c20d9414771bc269cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.854837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38769216-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'bd91e5596d4b776b4cdd28eb20f795fedfb3ce90bce9d957fac116697c9165d6'}]}, 'timestamp': '2025-11-23 09:53:10.865583', '_unique_id': '11b98da210a84f1d877100b003b0f3ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db6ad8e4-f74e-446d-9194-d5a369c92b8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.868302', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38770f7a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'b1094e5c30b8174733a73694c88f85b793d02ccfbc0b32c07ce66f422695e675'}]}, 'timestamp': '2025-11-23 09:53:10.868807', '_unique_id': '32c575f10aed43c9969909ef95707d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da3891a4-b3cf-4897-850b-949f77faf6ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.871119', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38777cda-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'a03d46fd8c0e10440c0a0cf46f9152f46174a0c52479932fa297e7df9a08ed06'}]}, 'timestamp': '2025-11-23 09:53:10.871602', '_unique_id': '8b45316ecba442c7b38b7d1aebb973cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a78165c-df71-4b49-84d5-9be1eaf1c52c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.873808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3877e698-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'a77557c3bf8eaefadcfdbb32cc33d93e30b04f9d8a09a044b4efe07dac290722'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.873808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3877f8ea-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'd2c613e824e6353dcb84c9d6961b1cb7d247d986366cd050eb1a8a3cad79dc62'}]}, 'timestamp': '2025-11-23 09:53:10.874760', '_unique_id': '25c0933b1bb1420583145896ae86dd2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89af889e-66bc-4b5d-8550-3b0325883607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.876959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387860d2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '3dc4dbd56859afcb267a38dae79a6971b9bca4e1920d7b85161b64a3231aa587'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.876959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387870fe-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '030d0b40687c5fd38eab705a02f520a24124be9c57d0b051ccf45adf30aa04ca'}]}, 'timestamp': '2025-11-23 09:53:10.877852', '_unique_id': '7fdd844d51cb4ec48e2c0548b4374674'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8975f1a-a5c4-4da6-8569-45066d91c3ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.880104', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '3878dbd4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'c9d65427ceb7ead57f433076989073ad31ac31a153aa987c2b2f2568ade5c4e4'}]}, 'timestamp': '2025-11-23 09:53:10.880588', '_unique_id': '39fe9c702c394858adc95a281a3f088c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4208e150-fa33-4668-a835-e915dcd52b8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.882973', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38794d1c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '9022ab766ab6839248f677060feabcbbbab11396a02d347de317f6d3fe416951'}]}, 'timestamp': '2025-11-23 09:53:10.883515', '_unique_id': 'e7c79dba9b1e47aca0121700f05ee50b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52176427-62b2-4abf-ae33-875053584f88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.885675', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '3879b57c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'a86b9f69a46739d523a37178a5dc9a25abd2205b3bbea2e71115b7fb317183fc'}]}, 'timestamp': '2025-11-23 09:53:10.886158', '_unique_id': '460154ce56a2465092033192f157c684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c3931f7-5f0a-41cc-9034-b6d4e061a65f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.888296', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '387a1bfc-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '6f63ee8b083afc942a2a141a33d0135b6b2263b787d7920f9cf2da7571f0a654'}]}, 'timestamp': '2025-11-23 09:53:10.888781', '_unique_id': '28312eb738f943b5909f82ca902fbecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.890 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b226306-9d03-4eef-a61b-a6941107c0b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.890936', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '387a8236-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '84309b21b360080aab6adffa05ceaf7b2049dc0ba533f46cf18248a73c4bca76'}]}, 'timestamp': '2025-11-23 09:53:10.891395', '_unique_id': '4c9dee7978744d19b70e585550edece1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ead07db-8d23-4ead-845c-db5e118f49fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:53:10.893475', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '387d4cb4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.086825964, 'message_signature': '941cc42abe6d75bcc2fd9c13dbf8277a71f1892fd51a4ad4d864338b8b7d33de'}]}, 'timestamp': '2025-11-23 09:53:10.909599', '_unique_id': 'dfbbeb39d83d49bb91d5b1a0eeb8447f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a280f66-1321-4f4a-9775-f9c4517b61f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.911099', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '387d9232-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'cd0b542f3df869fee767a005836d67dbaa038ee43b60dad3990a8c8a7fd3cec4'}]}, 'timestamp': '2025-11-23 09:53:10.911381', '_unique_id': '3463bded98834fa5a56cb87ba34b369b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec3ecdbe-dac5-4537-a45d-519cc4c7fc3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.912673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387dd008-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'b950bcd03f8ce95fabe1ef444242d8892ab03af0582cd435622e0f58ec350cf5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.912673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387ddc2e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'c5d9dabd5b36c933bea6c6d9bf58a174c6eb267df893c326cf1bda1fd423b1b4'}]}, 'timestamp': '2025-11-23 09:53:10.913259', '_unique_id': 'e716de5561de4b77b974b48ef82ee461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97ed12af-b875-4388-9ec5-878488df66a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.914600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387e1afe-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'badd910b44b6e7349a0d24c6b1d6dc925906407f1cb068a38d93337b1c3a68a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.914600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387e259e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '1f88be1e0a0461a2edfc240a645870a05d3b106d73ba20a8011d327568c12dc4'}]}, 'timestamp': '2025-11-23 09:53:10.915138', '_unique_id': '93de40822e1541ebb93c787f88bde087'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.916 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 14100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dad0ca47-560b-4bba-a964-c792ea69964a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14100000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:53:10.916714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '387e6d7e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.086825964, 'message_signature': '0d810f7dddd362c9fd89e83c1483d41f816c08a04513bb1096f6eb4e8dce92d7'}]}, 'timestamp': '2025-11-23 09:53:10.917010', '_unique_id': 'f076422e836e4d76b2a5eb45f40ab351'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6df4f655-e0b2-4903-ae58-e8c45cd810ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.918308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387eabd6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '4331a7b69115f97aab11b3c2e8068def4441360e356eeccc24d622512e3dda3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.918308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387eb6da-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '29e3ace2baaaa34888bfc9f4637128749c1288f69becd8d51cf91fb671525248'}]}, 'timestamp': '2025-11-23 09:53:10.918857', '_unique_id': '094f4724c7974efb94217bbccda9909a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51444e12-bf01-4639-8a34-2051cb59f3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.920316', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387efa3c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': '699380721f6945cdcd48803a1929dde43f49bd3eddf1b184ea6bc739d7e68d4a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.920316', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387f0428-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'bd67b959a5ef882798780e3de686fedd929301a03cfd66dabadb4559761a0def'}]}, 'timestamp': '2025-11-23 09:53:10.920833', '_unique_id': 'd7e208350c1243c285a48714ace55e1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:53:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54137 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005532583.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='client.34497 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005532583.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: Added label _no_schedule to host np0005532583.localdomain
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532583.localdomain
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3035427830' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:53:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:11.866 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:11.867 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:53:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:53:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:53:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:53:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:53:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:53:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:53:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18721 "" "Go-http-client/1.1"
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:11 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:12 np0005532585.localdomain podman[303481]: 2025-11-23 09:53:12.016294217 +0000 UTC m=+0.124502113 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 23 09:53:12 np0005532585.localdomain podman[303481]: 2025-11-23 09:53:12.027387729 +0000 UTC m=+0.135595665 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 23 09:53:12 np0005532585.localdomain sudo[303505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:12 np0005532585.localdomain sudo[303505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:12 np0005532585.localdomain sudo[303505]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:53:12 np0005532585.localdomain podman[303482]: 2025-11-23 09:53:11.9813994 +0000 UTC m=+0.089595525 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:53:12 np0005532585.localdomain sudo[303539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:12 np0005532585.localdomain sudo[303539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:12 np0005532585.localdomain podman[303482]: 2025-11-23 09:53:12.111194654 +0000 UTC m=+0.219390789 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54143 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005532583.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Removed host np0005532583.localdomain
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed host np0005532583.localdomain
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='client.54137 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005532583.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain"} : dispatch
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 2025-11-23 09:53:12.556300276 +0000 UTC m=+0.070103014 container create c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925.scope.
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 2025-11-23 09:53:12.53049302 +0000 UTC m=+0.044295788 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 2025-11-23 09:53:12.633394234 +0000 UTC m=+0.147196982 container init c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12)
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 2025-11-23 09:53:12.644509837 +0000 UTC m=+0.158312575 container start c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64)
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 2025-11-23 09:53:12.64492295 +0000 UTC m=+0.158725718 container attach c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=553, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, version=7)
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: libpod-c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925.scope: Deactivated successfully.
Nov 23 09:53:12 np0005532585.localdomain agitated_fermi[303588]: 167 167
Nov 23 09:53:12 np0005532585.localdomain podman[303573]: 2025-11-23 09:53:12.650504092 +0000 UTC m=+0.164306860 container died c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., ceph=True, RELEASE=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] Optimize plan auto_2025-11-23_09:53:12
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] do_upmap
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] pools ['backups', 'vms', 'volumes', 'images', 'manila_metadata', '.mgr', 'manila_data']
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [balancer INFO root] prepared 0/10 changes
Nov 23 09:53:12 np0005532585.localdomain podman[303593]: 2025-11-23 09:53:12.754935384 +0000 UTC m=+0.091559185 container remove c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: libpod-conmon-c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925.scope: Deactivated successfully.
Nov 23 09:53:12 np0005532585.localdomain sudo[303539]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] _maybe_adjust
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033244564838079286 of space, bias 1.0, pg target 0.6648912967615858 quantized to 32 (current 32)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 23 09:53:12 np0005532585.localdomain sudo[303611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:12 np0005532585.localdomain sudo[303611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:12 np0005532585.localdomain sudo[303611]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 23 09:53:12 np0005532585.localdomain ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 23 09:53:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b937137decf7b11ca88ddfa440c5cceb6d2d6afe5a265f025f5d493d2b31e3e4-merged.mount: Deactivated successfully.
Nov 23 09:53:13 np0005532585.localdomain sudo[303629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:13 np0005532585.localdomain sudo[303629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:13.210 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:13.231 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:13.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: from='client.54143 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005532583.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain"}]': finished
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: Removed host np0005532583.localdomain
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:53:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 2025-11-23 09:53:13.462164933 +0000 UTC m=+0.080995040 container create 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:53:13 np0005532585.localdomain systemd[1]: Started libpod-conmon-89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05.scope.
Nov 23 09:53:13 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 2025-11-23 09:53:13.429370201 +0000 UTC m=+0.048200388 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 2025-11-23 09:53:13.534939898 +0000 UTC m=+0.153770005 container init 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 2025-11-23 09:53:13.54085376 +0000 UTC m=+0.159683837 container start 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 2025-11-23 09:53:13.541147409 +0000 UTC m=+0.159977556 container attach 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:13 np0005532585.localdomain heuristic_napier[303679]: 167 167
Nov 23 09:53:13 np0005532585.localdomain systemd[1]: libpod-89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05.scope: Deactivated successfully.
Nov 23 09:53:13 np0005532585.localdomain podman[303664]: 2025-11-23 09:53:13.543815522 +0000 UTC m=+0.162645639 container died 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:53:13 np0005532585.localdomain podman[303684]: 2025-11-23 09:53:13.637852392 +0000 UTC m=+0.081366851 container remove 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:53:13 np0005532585.localdomain systemd[1]: libpod-conmon-89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05.scope: Deactivated successfully.
Nov 23 09:53:13 np0005532585.localdomain sudo[303629]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:13 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:13 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:13 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:13 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:13 np0005532585.localdomain sudo[303708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:13 np0005532585.localdomain sudo[303708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:13 np0005532585.localdomain sudo[303708]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5ceeb530356ef132c2f1ea8a59e40c7004053bced38a643a03cd26874370f264-merged.mount: Deactivated successfully.
Nov 23 09:53:14 np0005532585.localdomain sudo[303726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:14 np0005532585.localdomain sudo[303726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 2025-11-23 09:53:14.46304007 +0000 UTC m=+0.077620366 container create 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:14 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:14 np0005532585.localdomain systemd[1]: Started libpod-conmon-6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947.scope.
Nov 23 09:53:14 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 2025-11-23 09:53:14.428309149 +0000 UTC m=+0.042889465 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 2025-11-23 09:53:14.532222265 +0000 UTC m=+0.146802581 container init 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64)
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 2025-11-23 09:53:14.542047318 +0000 UTC m=+0.156627614 container start 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=7, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 2025-11-23 09:53:14.542388598 +0000 UTC m=+0.156968914 container attach 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, version=7, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:14 np0005532585.localdomain inspiring_villani[303777]: 167 167
Nov 23 09:53:14 np0005532585.localdomain systemd[1]: libpod-6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947.scope: Deactivated successfully.
Nov 23 09:53:14 np0005532585.localdomain podman[303761]: 2025-11-23 09:53:14.546498835 +0000 UTC m=+0.161079161 container died 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, vcs-type=git, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.component=rhceph-container)
Nov 23 09:53:14 np0005532585.localdomain podman[303782]: 2025-11-23 09:53:14.637954536 +0000 UTC m=+0.082962590 container remove 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:53:14 np0005532585.localdomain systemd[1]: libpod-conmon-6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947.scope: Deactivated successfully.
Nov 23 09:53:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:14 np0005532585.localdomain sudo[303726]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:14 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:14 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:14 np0005532585.localdomain sudo[303806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:14 np0005532585.localdomain sudo[303806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:14 np0005532585.localdomain sudo[303806]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7f6c70242f52446dac1dcd3eb1e2c564747320354fb2a8253c3ac502d1630808-merged.mount: Deactivated successfully.
Nov 23 09:53:15 np0005532585.localdomain sudo[303824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:15 np0005532585.localdomain sudo[303824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:15.394 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 2025-11-23 09:53:15.479555131 +0000 UTC m=+0.068443453 container create 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:53:15 np0005532585.localdomain systemd[1]: Started libpod-conmon-43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe.scope.
Nov 23 09:53:15 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 2025-11-23 09:53:15.546562358 +0000 UTC m=+0.135450690 container init 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, vendor=Red Hat, Inc., release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64)
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 2025-11-23 09:53:15.455579871 +0000 UTC m=+0.044468173 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 2025-11-23 09:53:15.557736193 +0000 UTC m=+0.146624525 container start 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, release=553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True)
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 2025-11-23 09:53:15.55796845 +0000 UTC m=+0.146856772 container attach 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:53:15 np0005532585.localdomain jovial_volhard[303874]: 167 167
Nov 23 09:53:15 np0005532585.localdomain systemd[1]: libpod-43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe.scope: Deactivated successfully.
Nov 23 09:53:15 np0005532585.localdomain podman[303858]: 2025-11-23 09:53:15.560634442 +0000 UTC m=+0.149522764 container died 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.33.12, name=rhceph)
Nov 23 09:53:15 np0005532585.localdomain podman[303879]: 2025-11-23 09:53:15.65101952 +0000 UTC m=+0.079301207 container remove 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, release=553, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:53:15 np0005532585.localdomain systemd[1]: libpod-conmon-43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe.scope: Deactivated successfully.
Nov 23 09:53:15 np0005532585.localdomain sudo[303824]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:15 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:15 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:15 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:15 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:15 np0005532585.localdomain sudo[303895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:15 np0005532585.localdomain sudo[303895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:15 np0005532585.localdomain sudo[303895]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:15 np0005532585.localdomain sudo[303913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:15 np0005532585.localdomain sudo[303913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:15 np0005532585.localdomain systemd[1]: tmp-crun.qnQnne.mount: Deactivated successfully.
Nov 23 09:53:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-413c556e0e98758aae564a1a7fa9c1190a15dd9491b65f66503e920a26dd779a-merged.mount: Deactivated successfully.
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 2025-11-23 09:53:16.356774914 +0000 UTC m=+0.078932827 container create 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc.)
Nov 23 09:53:16 np0005532585.localdomain systemd[1]: Started libpod-conmon-7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94.scope.
Nov 23 09:53:16 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 2025-11-23 09:53:16.418525559 +0000 UTC m=+0.140683472 container init 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55)
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 2025-11-23 09:53:16.323369813 +0000 UTC m=+0.045527786 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 2025-11-23 09:53:16.428328331 +0000 UTC m=+0.150486244 container start 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, release=553, CEPH_POINT_RELEASE=, ceph=True)
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 2025-11-23 09:53:16.428554188 +0000 UTC m=+0.150712131 container attach 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 23 09:53:16 np0005532585.localdomain goofy_blackwell[303963]: 167 167
Nov 23 09:53:16 np0005532585.localdomain systemd[1]: libpod-7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94.scope: Deactivated successfully.
Nov 23 09:53:16 np0005532585.localdomain podman[303947]: 2025-11-23 09:53:16.431132588 +0000 UTC m=+0.153290531 container died 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, ceph=True, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Nov 23 09:53:16 np0005532585.localdomain podman[303968]: 2025-11-23 09:53:16.520416072 +0000 UTC m=+0.076532002 container remove 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:53:16 np0005532585.localdomain systemd[1]: libpod-conmon-7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94.scope: Deactivated successfully.
Nov 23 09:53:16 np0005532585.localdomain sudo[303913]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:53:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:53:16 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:53:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:53:16 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:16 np0005532585.localdomain sudo[303984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:16 np0005532585.localdomain sudo[303984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:16 np0005532585.localdomain sudo[303984]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:53:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:16 np0005532585.localdomain sudo[304002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:16 np0005532585.localdomain sudo[304002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8840d7c18e69bc4b3f17c85ae92b3622af07e263943383135ddeed63f20ffc20-merged.mount: Deactivated successfully.
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 2025-11-23 09:53:17.373262153 +0000 UTC m=+0.074475629 container create 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 09:53:17 np0005532585.localdomain systemd[1]: Started libpod-conmon-879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12.scope.
Nov 23 09:53:17 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 2025-11-23 09:53:17.342569626 +0000 UTC m=+0.043783112 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 2025-11-23 09:53:17.446476952 +0000 UTC m=+0.147690428 container init 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 2025-11-23 09:53:17.454623703 +0000 UTC m=+0.155837169 container start 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=553, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, name=rhceph)
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 2025-11-23 09:53:17.454916042 +0000 UTC m=+0.156129548 container attach 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:53:17 np0005532585.localdomain practical_golick[304052]: 167 167
Nov 23 09:53:17 np0005532585.localdomain systemd[1]: libpod-879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12.scope: Deactivated successfully.
Nov 23 09:53:17 np0005532585.localdomain podman[304037]: 2025-11-23 09:53:17.456549503 +0000 UTC m=+0.157762999 container died 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Nov 23 09:53:17 np0005532585.localdomain podman[304057]: 2025-11-23 09:53:17.551083589 +0000 UTC m=+0.081292989 container remove 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, release=553, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, distribution-scope=public, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:17 np0005532585.localdomain systemd[1]: libpod-conmon-879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12.scope: Deactivated successfully.
Nov 23 09:53:17 np0005532585.localdomain sudo[304002]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:17 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:17 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:17 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:17 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8de9a3bc21e15e1fba062036739d1c9a95aa30d6ec253814481023ba07294928-merged.mount: Deactivated successfully.
Nov 23 09:53:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 23 09:53:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 23 09:53:18 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:53:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:53:18 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:18 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:18 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:18 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:18 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:18 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:53:18 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:19 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 23 09:53:19 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 23 09:53:19 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:53:19 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:53:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:20.396 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 23 09:53:20 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 23 09:53:21 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:53:21 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:53:21 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:53:21 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:22 np0005532585.localdomain sudo[304073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:22 np0005532585.localdomain sudo[304073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:22 np0005532585.localdomain sudo[304073]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:22 np0005532585.localdomain sudo[304091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:53:22 np0005532585.localdomain sudo[304091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:22 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532586", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: from='client.54146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: Saving service mon spec with placement label:mon
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:22 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:22 np0005532585.localdomain sudo[304091]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev e59b1355-8ccd-4c28-832b-9038b911cfdf (Updating node-proxy deployment (+3 -> 3))
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev e59b1355-8ccd-4c28-832b-9038b911cfdf (Updating node-proxy deployment (+3 -> 3))
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event e59b1355-8ccd-4c28-832b-9038b911cfdf (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 23 09:53:23 np0005532585.localdomain sudo[304140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:53:23 np0005532585.localdomain sudo[304140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:53:23 np0005532585.localdomain sudo[304140]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:23 np0005532585.localdomain podman[304158]: 2025-11-23 09:53:23.447051493 +0000 UTC m=+0.093128813 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: from='client.54149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532586", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:23 np0005532585.localdomain podman[304158]: 2025-11-23 09:53:23.486442759 +0000 UTC m=+0.132520039 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:53:23 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54155 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532586"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Remove daemons mon.np0005532586
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005532586
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005532586: new quorum should be ['np0005532584', 'np0005532585'] (from ['np0005532584', 'np0005532585'])
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005532586: new quorum should be ['np0005532584', 'np0005532585'] (from ['np0005532584', 'np0005532585'])
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005532586 from monmap...
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing monitor np0005532586 from monmap...
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005532586 from np0005532586.localdomain -- ports []
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005532586 from np0005532586.localdomain -- ports []
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@2(peon) e16  my rank is now 1 (was 2)
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: client.27018 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 09:53:23 np0005532585.localdomain ceph-mgr[288287]: client.27015 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532584"} v 0)
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532585"} v 0)
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(probing) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:23 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/1151827140 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55eed51f6c00 0x55eed51fcb00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: --2- 172.18.0.107:0/1518513680 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55eed629b400 0x55eed4fc1700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: paxos.1).electionLogic(60) init, last seen epoch 60
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e16 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e16 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: Remove daemons mon.np0005532586
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: Safe to remove mon.np0005532586: new quorum should be ['np0005532584', 'np0005532585'] (from ['np0005532584', 'np0005532585'])
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: Removing monitor np0005532586 from monmap...
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: Removing daemon mon.np0005532586 from np0005532586.localdomain -- ports []
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 is new leader, mons np0005532584,np0005532585 in quorum (ranks 0,1)
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: monmap epoch 16
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:53:23.789795+0000
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 71s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:53:24 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 09:53:24 np0005532585.localdomain sudo[304177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:53:24 np0005532585.localdomain sudo[304177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304177]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:53:24 np0005532585.localdomain sudo[304195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304195]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:24 np0005532585.localdomain sudo[304213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304213]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:24 np0005532585.localdomain sudo[304231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304231]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:24 np0005532585.localdomain sudo[304249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304249]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:24 np0005532585.localdomain sudo[304283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304283]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:24 np0005532585.localdomain sudo[304301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304301]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain sudo[304319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain sudo[304319]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:24 np0005532585.localdomain sudo[304337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:53:24 np0005532585.localdomain sudo[304337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304337]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:53:24 np0005532585.localdomain sudo[304355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304355]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:24 np0005532585.localdomain sudo[304373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:24 np0005532585.localdomain sudo[304373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:24 np0005532585.localdomain sudo[304373]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:25 np0005532585.localdomain sudo[304391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:25 np0005532585.localdomain sudo[304391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:25 np0005532585.localdomain sudo[304391]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:25 np0005532585.localdomain sudo[304409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:25 np0005532585.localdomain sudo[304409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:25 np0005532585.localdomain sudo[304409]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:25 np0005532585.localdomain sudo[304443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:25 np0005532585.localdomain sudo[304443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:25 np0005532585.localdomain sudo[304443]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:25 np0005532585.localdomain sudo[304461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:25 np0005532585.localdomain sudo[304461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:25 np0005532585.localdomain sudo[304461]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:25.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:25 np0005532585.localdomain sudo[304479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:25 np0005532585.localdomain sudo[304479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:25 np0005532585.localdomain sudo[304479]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev a56fc888-20b8-44aa-a4ef-7fb77787df56 (Updating node-proxy deployment (+3 -> 3))
Nov 23 09:53:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev a56fc888-20b8-44aa-a4ef-7fb77787df56 (Updating node-proxy deployment (+3 -> 3))
Nov 23 09:53:25 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event a56fc888-20b8-44aa-a4ef-7fb77787df56 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 23 09:53:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:25 np0005532585.localdomain sudo[304497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:53:25 np0005532585.localdomain sudo[304497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:53:25 np0005532585.localdomain sudo[304497]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:26 np0005532585.localdomain podman[304515]: 2025-11-23 09:53:26.08452531 +0000 UTC m=+0.087841390 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:53:26 np0005532585.localdomain podman[304515]: 2025-11-23 09:53:26.100334258 +0000 UTC m=+0.103650338 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:53:26 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:26 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:28 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:29 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:29 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:29 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:29 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:29.934 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:53:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:29.965 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 23 09:53:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:29.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:53:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:29.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:53:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:53:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:30.012 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:53:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:30.401 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:30.404 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:30 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:30 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:30 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:30 np0005532585.localdomain sudo[304537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:30 np0005532585.localdomain sudo[304537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:30 np0005532585.localdomain sudo[304537]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:30 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:30 np0005532585.localdomain sudo[304555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:30 np0005532585.localdomain sudo[304555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:30 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 2025-11-23 09:53:31.214935727 +0000 UTC m=+0.074529780 container create b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main)
Nov 23 09:53:31 np0005532585.localdomain systemd[1]: Started libpod-conmon-b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198.scope.
Nov 23 09:53:31 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 2025-11-23 09:53:31.183082194 +0000 UTC m=+0.042676317 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 2025-11-23 09:53:31.285603037 +0000 UTC m=+0.145197100 container init b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, name=rhceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 2025-11-23 09:53:31.293877402 +0000 UTC m=+0.153471475 container start b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, release=553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git)
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 2025-11-23 09:53:31.294229533 +0000 UTC m=+0.153823596 container attach b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:53:31 np0005532585.localdomain pensive_euclid[304604]: 167 167
Nov 23 09:53:31 np0005532585.localdomain systemd[1]: libpod-b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198.scope: Deactivated successfully.
Nov 23 09:53:31 np0005532585.localdomain podman[304589]: 2025-11-23 09:53:31.296159663 +0000 UTC m=+0.155753716 container died b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:53:31 np0005532585.localdomain podman[304609]: 2025-11-23 09:53:31.380226277 +0000 UTC m=+0.071827278 container remove b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, release=553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:53:31 np0005532585.localdomain systemd[1]: libpod-conmon-b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198.scope: Deactivated successfully.
Nov 23 09:53:31 np0005532585.localdomain sudo[304555]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:31 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:31 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:31 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:31 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:31 np0005532585.localdomain sudo[304626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:31 np0005532585.localdomain sudo[304626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:31 np0005532585.localdomain sudo[304626]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:31 np0005532585.localdomain sudo[304644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:31 np0005532585.localdomain sudo[304644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:53:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 2025-11-23 09:53:32.078412056 +0000 UTC m=+0.081747963 container create 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553)
Nov 23 09:53:32 np0005532585.localdomain systemd[1]: Started libpod-conmon-9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b.scope.
Nov 23 09:53:32 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 2025-11-23 09:53:32.140057768 +0000 UTC m=+0.143393675 container init 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main)
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 2025-11-23 09:53:32.042963352 +0000 UTC m=+0.046299259 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 2025-11-23 09:53:32.150452829 +0000 UTC m=+0.153788736 container start 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, name=rhceph, vcs-type=git, version=7, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main)
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 2025-11-23 09:53:32.15084068 +0000 UTC m=+0.154176587 container attach 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, RELEASE=main, ceph=True)
Nov 23 09:53:32 np0005532585.localdomain competent_noyce[304695]: 167 167
Nov 23 09:53:32 np0005532585.localdomain systemd[1]: libpod-9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b.scope: Deactivated successfully.
Nov 23 09:53:32 np0005532585.localdomain podman[304680]: 2025-11-23 09:53:32.153611326 +0000 UTC m=+0.156947263 container died 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, description=Red Hat Ceph Storage 7, release=553, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:53:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-2bac5ef8c68c2242ae1212afde9f25aede8b2dfc4d4bf412b25b4a9e1b82837f-merged.mount: Deactivated successfully.
Nov 23 09:53:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-552b7356ded64562fa112310d8ffa65bce3d7c0257ead807842473869640e976-merged.mount: Deactivated successfully.
Nov 23 09:53:32 np0005532585.localdomain podman[304700]: 2025-11-23 09:53:32.250330049 +0000 UTC m=+0.087190380 container remove 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 09:53:32 np0005532585.localdomain systemd[1]: libpod-conmon-9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b.scope: Deactivated successfully.
Nov 23 09:53:32 np0005532585.localdomain sudo[304644]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:32 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:32 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:32 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:32 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:32 np0005532585.localdomain sudo[304725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:32 np0005532585.localdomain sudo[304725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:32 np0005532585.localdomain sudo[304725]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:32 np0005532585.localdomain sudo[304743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:32 np0005532585.localdomain sudo[304743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:32 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 2025-11-23 09:53:33.048201304 +0000 UTC m=+0.080004988 container create 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7)
Nov 23 09:53:33 np0005532585.localdomain systemd[1]: Started libpod-conmon-30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660.scope.
Nov 23 09:53:33 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 2025-11-23 09:53:33.114293094 +0000 UTC m=+0.146096778 container init 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, release=553, architecture=x86_64, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc.)
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 2025-11-23 09:53:33.017372613 +0000 UTC m=+0.049176337 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 2025-11-23 09:53:33.123500018 +0000 UTC m=+0.155303702 container start 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git)
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 2025-11-23 09:53:33.123801677 +0000 UTC m=+0.155605361 container attach 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., release=553, name=rhceph, distribution-scope=public, vcs-type=git, ceph=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=)
Nov 23 09:53:33 np0005532585.localdomain stoic_villani[304793]: 167 167
Nov 23 09:53:33 np0005532585.localdomain systemd[1]: libpod-30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660.scope: Deactivated successfully.
Nov 23 09:53:33 np0005532585.localdomain podman[304777]: 2025-11-23 09:53:33.126247692 +0000 UTC m=+0.158051406 container died 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, version=7, io.openshift.expose-services=, GIT_CLEAN=True, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main)
Nov 23 09:53:33 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ae27c3a1e2a4b872da410046d936a4c60bed3350896f3d7ac5e09e7da49b3cb5-merged.mount: Deactivated successfully.
Nov 23 09:53:33 np0005532585.localdomain podman[304798]: 2025-11-23 09:53:33.238009 +0000 UTC m=+0.099953505 container remove 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Nov 23 09:53:33 np0005532585.localdomain systemd[1]: libpod-conmon-30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660.scope: Deactivated successfully.
Nov 23 09:53:33 np0005532585.localdomain sudo[304743]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:33 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:33 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:33 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:33 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:33 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:33 np0005532585.localdomain sudo[304822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:33 np0005532585.localdomain sudo[304822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:33 np0005532585.localdomain sudo[304822]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:33 np0005532585.localdomain sudo[304840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:33 np0005532585.localdomain sudo[304840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 2025-11-23 09:53:34.067777969 +0000 UTC m=+0.078562574 container create cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: Started libpod-conmon-cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305.scope.
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 2025-11-23 09:53:34.12905646 +0000 UTC m=+0.139841045 container init cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7)
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 2025-11-23 09:53:34.037452083 +0000 UTC m=+0.048236718 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 2025-11-23 09:53:34.139733559 +0000 UTC m=+0.150518144 container start cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 2025-11-23 09:53:34.139912565 +0000 UTC m=+0.150697150 container attach cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git)
Nov 23 09:53:34 np0005532585.localdomain naughty_pascal[304889]: 167 167
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: libpod-cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305.scope: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain podman[304874]: 2025-11-23 09:53:34.144305131 +0000 UTC m=+0.155089716 container died cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, name=rhceph, RELEASE=main, GIT_CLEAN=True)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: tmp-crun.8BhCJP.mount: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c37fb1671fd0adf14e59310f87a9848b621d966f12b8c33e0aa6ce15e89f110b-merged.mount: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain podman[304894]: 2025-11-23 09:53:34.240497658 +0000 UTC m=+0.082255258 container remove cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, release=553, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, GIT_CLEAN=True, io.openshift.expose-services=)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: libpod-conmon-cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305.scope: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain sudo[304840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:34 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:34 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:34 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:34 np0005532585.localdomain sudo[304910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:34 np0005532585.localdomain sudo[304910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:53:34 np0005532585.localdomain sudo[304910]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:34 np0005532585.localdomain sudo[304931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:34 np0005532585.localdomain sudo[304931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:34 np0005532585.localdomain podman[304929]: 2025-11-23 09:53:34.529634508 +0000 UTC m=+0.082215737 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 09:53:34 np0005532585.localdomain podman[304929]: 2025-11-23 09:53:34.563284536 +0000 UTC m=+0.115865795 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:53:34 np0005532585.localdomain podman[304927]: 2025-11-23 09:53:34.592842688 +0000 UTC m=+0.143666174 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:53:34 np0005532585.localdomain podman[304927]: 2025-11-23 09:53:34.654415708 +0000 UTC m=+0.205239184 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain podman[304930]: 2025-11-23 09:53:34.668663857 +0000 UTC m=+0.218396998 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain podman[304930]: 2025-11-23 09:53:34.709172007 +0000 UTC m=+0.258905148 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, version=9.6, config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:53:34 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:53:34 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:34 np0005532585.localdomain podman[305027]: 
Nov 23 09:53:34 np0005532585.localdomain podman[305027]: 2025-11-23 09:53:34.961553543 +0000 UTC m=+0.082824676 container create 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, RELEASE=main, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:53:35 np0005532585.localdomain systemd[1]: Started libpod-conmon-732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057.scope.
Nov 23 09:53:35 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:35 np0005532585.localdomain podman[305027]: 2025-11-23 09:53:34.925306624 +0000 UTC m=+0.046577787 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:35 np0005532585.localdomain podman[305027]: 2025-11-23 09:53:35.025836866 +0000 UTC m=+0.147107989 container init 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:53:35 np0005532585.localdomain podman[305027]: 2025-11-23 09:53:35.036228977 +0000 UTC m=+0.157500110 container start 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=553, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public)
Nov 23 09:53:35 np0005532585.localdomain podman[305027]: 2025-11-23 09:53:35.036533116 +0000 UTC m=+0.157804279 container attach 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., release=553, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:53:35 np0005532585.localdomain gallant_euler[305042]: 167 167
Nov 23 09:53:35 np0005532585.localdomain systemd[1]: libpod-732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057.scope: Deactivated successfully.
Nov 23 09:53:35 np0005532585.localdomain podman[305027]: 2025-11-23 09:53:35.039858969 +0000 UTC m=+0.161130132 container died 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.buildah.version=1.33.12, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main)
Nov 23 09:53:35 np0005532585.localdomain podman[305047]: 2025-11-23 09:53:35.137705107 +0000 UTC m=+0.085605791 container remove 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:53:35 np0005532585.localdomain systemd[1]: libpod-conmon-732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057.scope: Deactivated successfully.
Nov 23 09:53:35 np0005532585.localdomain sudo[304931]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:35 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:35 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:35 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:35 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:35.403 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:35.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:35 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:36 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 23 09:53:36 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 23 09:53:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Nov 23 09:53:36 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:53:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:36 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:36 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:53:36 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:53:36 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005532586.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:53:37 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:38 np0005532585.localdomain ceph-mon[300199]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:38 np0005532585.localdomain ceph-mon[300199]: from='client.54157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005532586.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:38 np0005532585.localdomain ceph-mon[300199]: Deploying daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:53:38 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:53:38 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:53:38 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:53:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:40.406 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (2) No such file or directory
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532584"} v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532585"} v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: paxos.1).electionLogic(62) init, last seen epoch 62
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:40 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 09:53:40 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:41 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:41 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:41 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 09:53:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:53:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:53:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:53:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:53:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:53:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18709 "" "Go-http-client/1.1"
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:42 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: []
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f5ea71ca220>)]
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections..
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f5ea71ca400>)]
Nov 23 09:53:42 np0005532585.localdomain ceph-mgr[288287]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 23 09:53:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:53:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:53:43 np0005532585.localdomain podman[305065]: 2025-11-23 09:53:43.069296943 +0000 UTC m=+0.082482935 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:53:43 np0005532585.localdomain systemd[1]: tmp-crun.vZp3B3.mount: Deactivated successfully.
Nov 23 09:53:43 np0005532585.localdomain podman[305064]: 2025-11-23 09:53:43.127709255 +0000 UTC m=+0.142588850 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:53:43 np0005532585.localdomain podman[305065]: 2025-11-23 09:53:43.136943691 +0000 UTC m=+0.150129673 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:53:43 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:53:43 np0005532585.localdomain podman[305064]: 2025-11-23 09:53:43.164818791 +0000 UTC m=+0.179698326 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:53:43 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:53:43 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:43 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:43 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 09:53:44 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:44 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:44 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 09:53:44 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:45.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:53:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:53:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:53:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:53:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:45.410 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:53:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:45.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:45 np0005532585.localdomain ceph-mds[287052]: mds.beacon.mds.np0005532585.jcltnl missed beacon ack from the monitors
Nov 23 09:53:45 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: paxos.1).electionLogic(63) init, last seen epoch 63, mid-election, bumping
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:45 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:53:45 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532584 is new leader, mons np0005532584,np0005532585,np0005532586 in quorum (ranks 0,1,2)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: monmap epoch 17
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: last_changed 2025-11-23T09:53:40.507961+0000
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: created 2025-11-23T07:39:05.590972+0000
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: min_mon_release 18 (reef)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: election_strategy: 1
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532586
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: fsmap cephfs:1 {0=mds.np0005532586.mfohsb=up:active} 2 up:standby
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: osdmap e85: 6 total, 6 up, 6 in
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mgrmap e33: np0005532585.gzafiw(active, since 92s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:45 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:45 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:53:45 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:46 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: mgrmap e34: np0005532585.gzafiw(active, since 93s), standbys: np0005532586.thmvqb, np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:53:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:46 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v50: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:47 np0005532585.localdomain ceph-mgr[288287]: mgr.server handle_report got status from non-daemon mon.np0005532586
Nov 23 09:53:47 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:53:47.518+0000 7f5ec5b52640 -1 mgr.server handle_report got status from non-daemon mon.np0005532586
Nov 23 09:53:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: pgmap v50: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 23 09:53:48 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:48 np0005532585.localdomain sudo[305106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:53:48 np0005532585.localdomain sudo[305106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:48 np0005532585.localdomain sudo[305106]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:48 np0005532585.localdomain sudo[305124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:53:48 np0005532585.localdomain sudo[305124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:48 np0005532585.localdomain sudo[305124]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:48 np0005532585.localdomain sudo[305142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:48 np0005532585.localdomain sudo[305142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:48 np0005532585.localdomain sudo[305142]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:48 np0005532585.localdomain sudo[305160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:48 np0005532585.localdomain sudo[305160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:48 np0005532585.localdomain sudo[305160]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:48 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v51: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:48 np0005532585.localdomain sudo[305178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:48 np0005532585.localdomain sudo[305178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:48 np0005532585.localdomain sudo[305178]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:48 np0005532585.localdomain sudo[305212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:48 np0005532585.localdomain sudo[305212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:48 np0005532585.localdomain sudo[305212]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain sudo[305230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:53:49 np0005532585.localdomain sudo[305230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305230]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain sudo[305248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain sudo[305248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305248]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain sudo[305266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:53:49 np0005532585.localdomain sudo[305266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305266]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain sudo[305284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:53:49 np0005532585.localdomain sudo[305284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305284]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain sudo[305302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:49 np0005532585.localdomain sudo[305302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305302]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain sudo[305320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:49 np0005532585.localdomain sudo[305320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305320]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain sudo[305338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:49 np0005532585.localdomain sudo[305338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305338]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain sudo[305372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:49 np0005532585.localdomain sudo[305372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305372]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain sudo[305390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:53:49 np0005532585.localdomain sudo[305390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305390]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:49 np0005532585.localdomain sudo[305408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:49 np0005532585.localdomain sudo[305408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:49 np0005532585.localdomain sudo[305408]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] update: starting ev b7355ba6-1464-472e-87d9-7de50c82b6b7 (Updating node-proxy deployment (+3 -> 3))
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] complete: finished ev b7355ba6-1464-472e-87d9-7de50c82b6b7 (Updating node-proxy deployment (+3 -> 3))
Nov 23 09:53:49 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Completed event b7355ba6-1464-472e-87d9-7de50c82b6b7 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 23 09:53:49 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:50 np0005532585.localdomain sudo[305426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:53:50 np0005532585.localdomain sudo[305426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:50 np0005532585.localdomain sudo[305426]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:50 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:50 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:50 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:50 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:50.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: pgmap v51: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:50 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v52: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:50 np0005532585.localdomain ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:51 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:51 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:51 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:51 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2073028298' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/2073028298' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:52 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:52 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:52 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:52 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: pgmap v52: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.2 (monmap changed)...
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.471623) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632471708, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2808, "num_deletes": 259, "total_data_size": 5572811, "memory_usage": 5643928, "flush_reason": "Manual Compaction"}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632491071, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 3233862, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12506, "largest_seqno": 15309, "table_properties": {"data_size": 3222584, "index_size": 6759, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3461, "raw_key_size": 31164, "raw_average_key_size": 22, "raw_value_size": 3196957, "raw_average_value_size": 2348, "num_data_blocks": 296, "num_entries": 1361, "num_filter_entries": 1361, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891560, "oldest_key_time": 1763891560, "file_creation_time": 1763891632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 19511 microseconds, and 8603 cpu microseconds.
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.491140) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 3233862 bytes OK
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.491167) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.493538) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.493564) EVENT_LOG_v1 {"time_micros": 1763891632493556, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.493587) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 5558860, prev total WAL file size 5558860, number of live WAL files 2.
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.495147) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(3158KB)], [18(16MB)]
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632495246, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 20139804, "oldest_snapshot_seqno": -1}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 11902 keys, 17224579 bytes, temperature: kUnknown
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632580427, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17224579, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17154272, "index_size": 39486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 318266, "raw_average_key_size": 26, "raw_value_size": 16948901, "raw_average_value_size": 1424, "num_data_blocks": 1513, "num_entries": 11902, "num_filter_entries": 11902, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.580812) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17224579 bytes
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.582764) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 236.1 rd, 201.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 16.1 +0.0 blob) out(16.4 +0.0 blob), read-write-amplify(11.6) write-amplify(5.3) OK, records in: 12447, records dropped: 545 output_compression: NoCompression
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.582796) EVENT_LOG_v1 {"time_micros": 1763891632582782, "job": 8, "event": "compaction_finished", "compaction_time_micros": 85306, "compaction_time_cpu_micros": 49640, "output_level": 6, "num_output_files": 1, "total_output_size": 17224579, "num_input_records": 12447, "num_output_records": 11902, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632583366, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632585744, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.494941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585779) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:53:52 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v53: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.5 (monmap changed)...
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: pgmap v53: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:53 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:54 np0005532585.localdomain podman[305444]: 2025-11-23 09:53:54.027923154 +0000 UTC m=+0.086937813 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118)
Nov 23 09:53:54 np0005532585.localdomain podman[305444]: 2025-11-23 09:53:54.042374979 +0000 UTC m=+0.101389598 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 09:53:54 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:53:54 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v54: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:54 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:54 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:54 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:54 np0005532585.localdomain sudo[305463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:54 np0005532585.localdomain sudo[305463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:54 np0005532585.localdomain sudo[305463]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:55 np0005532585.localdomain sudo[305481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:55 np0005532585.localdomain sudo[305481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:53:55.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 2025-11-23 09:53:55.483313384 +0000 UTC m=+0.084760266 container create 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:53:55 np0005532585.localdomain systemd[1]: Started libpod-conmon-7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70.scope.
Nov 23 09:53:55 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 2025-11-23 09:53:55.446867339 +0000 UTC m=+0.048314261 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 2025-11-23 09:53:55.559828954 +0000 UTC m=+0.161275846 container init 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, release=553, io.buildah.version=1.33.12, io.openshift.expose-services=)
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 2025-11-23 09:53:55.569882665 +0000 UTC m=+0.171329527 container start 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 2025-11-23 09:53:55.570079191 +0000 UTC m=+0.171526123 container attach 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Nov 23 09:53:55 np0005532585.localdomain silly_johnson[305532]: 167 167
Nov 23 09:53:55 np0005532585.localdomain systemd[1]: libpod-7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70.scope: Deactivated successfully.
Nov 23 09:53:55 np0005532585.localdomain podman[305517]: 2025-11-23 09:53:55.574185897 +0000 UTC m=+0.175632799 container died 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:53:55 np0005532585.localdomain podman[305537]: 2025-11-23 09:53:55.658774746 +0000 UTC m=+0.076318895 container remove 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, release=553, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:55 np0005532585.localdomain systemd[1]: libpod-conmon-7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70.scope: Deactivated successfully.
Nov 23 09:53:55 np0005532585.localdomain sudo[305481]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54166 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO root] Reconfig service osd.default_drive_group
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:55 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain sudo[305553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:55 np0005532585.localdomain sudo[305553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:55 np0005532585.localdomain sudo[305553]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain sudo[305571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:53:55 np0005532585.localdomain sudo[305571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: pgmap v54: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:55 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:53:56 np0005532585.localdomain podman[305605]: 2025-11-23 09:53:56.40050747 +0000 UTC m=+0.094874469 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:53:56 np0005532585.localdomain podman[305605]: 2025-11-23 09:53:56.433493017 +0000 UTC m=+0.127859976 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 2025-11-23 09:53:56.459089877 +0000 UTC m=+0.132053165 container create cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph)
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa.scope.
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-2e23d4e8adf6283b26f1de8b6f4e7188602d3d0a3991af1e5a58cd866a31fd1f-merged.mount: Deactivated successfully.
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 2025-11-23 09:53:56.515346872 +0000 UTC m=+0.188310160 container init cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55)
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 2025-11-23 09:53:56.525915119 +0000 UTC m=+0.198878407 container start cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 2025-11-23 09:53:56.526080844 +0000 UTC m=+0.199044132 container attach cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.33.12, release=553, version=7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Nov 23 09:53:56 np0005532585.localdomain cranky_merkle[305646]: 167 167
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: libpod-cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa.scope: Deactivated successfully.
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 2025-11-23 09:53:56.530465119 +0000 UTC m=+0.203428427 container died cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, name=rhceph)
Nov 23 09:53:56 np0005532585.localdomain podman[305612]: 2025-11-23 09:53:56.432050062 +0000 UTC m=+0.105013370 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:56 np0005532585.localdomain podman[305651]: 2025-11-23 09:53:56.636530591 +0000 UTC m=+0.092007079 container remove cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 09:53:56 np0005532585.localdomain systemd[1]: libpod-conmon-cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa.scope: Deactivated successfully.
Nov 23 09:53:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v55: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:56 np0005532585.localdomain sudo[305571]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:56 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:56 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='client.54166 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: Reconfig service osd.default_drive_group
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.0 (monmap changed)...
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 09:53:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:56 np0005532585.localdomain sudo[305675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:56 np0005532585.localdomain sudo[305675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:56 np0005532585.localdomain sudo[305675]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:57 np0005532585.localdomain sudo[305693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:57 np0005532585.localdomain sudo[305693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 
Nov 23 09:53:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-627a3ecf9cef10f3d1f6014149f0a88b728aaff6dcf7bdbd90337aed590af978-merged.mount: Deactivated successfully.
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 2025-11-23 09:53:57.500217787 +0000 UTC m=+0.079272367 container create 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=)
Nov 23 09:53:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534.scope.
Nov 23 09:53:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 2025-11-23 09:53:57.465172215 +0000 UTC m=+0.044226815 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 2025-11-23 09:53:57.56675973 +0000 UTC m=+0.145814310 container init 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7)
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 2025-11-23 09:53:57.57584111 +0000 UTC m=+0.154895690 container start 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, GIT_BRANCH=main, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, io.openshift.tags=rhceph ceph)
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 2025-11-23 09:53:57.576139589 +0000 UTC m=+0.155194179 container attach 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, RELEASE=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Nov 23 09:53:57 np0005532585.localdomain heuristic_mclaren[305743]: 167 167
Nov 23 09:53:57 np0005532585.localdomain systemd[1]: libpod-40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534.scope: Deactivated successfully.
Nov 23 09:53:57 np0005532585.localdomain podman[305728]: 2025-11-23 09:53:57.580852945 +0000 UTC m=+0.159907555 container died 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, RELEASE=main, io.openshift.expose-services=, vcs-type=git)
Nov 23 09:53:57 np0005532585.localdomain podman[305748]: 2025-11-23 09:53:57.682927323 +0000 UTC m=+0.089701708 container remove 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, RELEASE=main, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55)
Nov 23 09:53:57 np0005532585.localdomain systemd[1]: libpod-conmon-40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534.scope: Deactivated successfully.
Nov 23 09:53:57 np0005532585.localdomain sshd[305767]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:53:57 np0005532585.localdomain sudo[305693]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:57 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:57 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:57 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:57 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:58 np0005532585.localdomain sudo[305774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:58 np0005532585.localdomain sudo[305774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:58 np0005532585.localdomain sudo[305774]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:58 np0005532585.localdomain sudo[305792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:58 np0005532585.localdomain sudo[305792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: pgmap v55: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.3 (monmap changed)...
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:58 np0005532585.localdomain sshd[305767]: Received disconnect from 207.154.194.2 port 56406:11: Bye Bye [preauth]
Nov 23 09:53:58 np0005532585.localdomain sshd[305767]: Disconnected from authenticating user root 207.154.194.2 port 56406 [preauth]
Nov 23 09:53:58 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b44c73eb5ff0b2833430782bf819cd408944db4dd8aad1159616aa45f110f56e-merged.mount: Deactivated successfully.
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 2025-11-23 09:53:58.565409539 +0000 UTC m=+0.078587946 container create d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:53:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e.scope.
Nov 23 09:53:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 2025-11-23 09:53:58.530717609 +0000 UTC m=+0.043896016 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 2025-11-23 09:53:58.63580136 +0000 UTC m=+0.148979737 container init d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7)
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 2025-11-23 09:53:58.646715967 +0000 UTC m=+0.159894344 container start d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 2025-11-23 09:53:58.647948005 +0000 UTC m=+0.161126442 container attach d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph)
Nov 23 09:53:58 np0005532585.localdomain musing_moser[305841]: 167 167
Nov 23 09:53:58 np0005532585.localdomain systemd[1]: libpod-d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e.scope: Deactivated successfully.
Nov 23 09:53:58 np0005532585.localdomain podman[305826]: 2025-11-23 09:53:58.650595566 +0000 UTC m=+0.163773963 container died d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 09:53:58 np0005532585.localdomain podman[305846]: 2025-11-23 09:53:58.743146611 +0000 UTC m=+0.079733430 container remove d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Nov 23 09:53:58 np0005532585.localdomain systemd[1]: libpod-conmon-d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e.scope: Deactivated successfully.
Nov 23 09:53:58 np0005532585.localdomain ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v56: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:53:58 np0005532585.localdomain sudo[305792]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:58 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:58 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:58 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:58 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:53:58 np0005532585.localdomain sudo[305863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:53:58 np0005532585.localdomain sudo[305863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:58 np0005532585.localdomain sudo[305863]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:59 np0005532585.localdomain sudo[305881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:53:59 np0005532585.localdomain sudo[305881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 2025-11-23 09:53:59.408974413 +0000 UTC m=+0.068596117 container create fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, io.openshift.expose-services=, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:53:59 np0005532585.localdomain systemd[1]: Started libpod-conmon-fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a.scope.
Nov 23 09:53:59 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 2025-11-23 09:53:59.458344737 +0000 UTC m=+0.117966441 container init fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, io.openshift.expose-services=, release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 2025-11-23 09:53:59.465552928 +0000 UTC m=+0.125174632 container start fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, version=7)
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 2025-11-23 09:53:59.465685022 +0000 UTC m=+0.125306736 container attach fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, release=553, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:53:59 np0005532585.localdomain peaceful_blackwell[305930]: 167 167
Nov 23 09:53:59 np0005532585.localdomain systemd[1]: libpod-fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a.scope: Deactivated successfully.
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 2025-11-23 09:53:59.469388007 +0000 UTC m=+0.129009791 container died fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 09:53:59 np0005532585.localdomain podman[305915]: 2025-11-23 09:53:59.387873123 +0000 UTC m=+0.047494917 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:53:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-96e19531e99dc2ffad06b89eb3ebb6f5fc4b36721e44dc3a08d52edaf26b42bd-merged.mount: Deactivated successfully.
Nov 23 09:53:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ca561776aa0e1f2ec164ea75e23726104b971a45983f6b547bbd3f80ea60c050-merged.mount: Deactivated successfully.
Nov 23 09:53:59 np0005532585.localdomain podman[305935]: 2025-11-23 09:53:59.55149126 +0000 UTC m=+0.071987002 container remove fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 23 09:53:59 np0005532585.localdomain systemd[1]: libpod-conmon-fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a.scope: Deactivated successfully.
Nov 23 09:53:59 np0005532585.localdomain sudo[305881]: pam_unix(sudo:session): session closed for user root
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 09:53:59 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:59 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:53:59 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:53:59 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:59 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:53:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:53:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: pgmap v56: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:00.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3957521171' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:54:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 e86: 6 total, 6 up, 6 in
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr handle_mgr_map I was active but no longer am
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  e: '/usr/bin/ceph-mgr'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  0: '/usr/bin/ceph-mgr'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  1: '-n'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  2: 'mgr.np0005532585.gzafiw'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  3: '-f'
Nov 23 09:54:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:00.607+0000 7f5f21ba1640 -1 mgr handle_mgr_map I was active but no longer am
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  4: '--setuser'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  5: 'ceph'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  6: '--setgroup'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  7: 'ceph'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  8: '--default-log-to-file=false'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  9: '--default-log-to-journald=true'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  10: '--default-log-to-stderr=false'
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr respawn  exe_path /proc/self/exe
Nov 23 09:54:00 np0005532585.localdomain sshd[300894]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:54:00 np0005532585.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Nov 23 09:54:00 np0005532585.localdomain systemd[1]: session-69.scope: Consumed 27.610s CPU time.
Nov 23 09:54:00 np0005532585.localdomain systemd-logind[761]: Session 69 logged out. Waiting for processes to exit.
Nov 23 09:54:00 np0005532585.localdomain systemd-logind[761]: Removed session 69.
Nov 23 09:54:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: ignoring --setuser ceph since I am not root
Nov 23 09:54:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: ignoring --setgroup ceph since I am not root
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: pidfile_write: ignore empty --pid-file
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'alerts'
Nov 23 09:54:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:00.807+0000 7f5982561140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'balancer'
Nov 23 09:54:00 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:00.873+0000 7f5982561140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 09:54:00 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'cephadm'
Nov 23 09:54:00 np0005532585.localdomain sshd[305976]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:54:01 np0005532585.localdomain sshd[305976]: Accepted publickey for ceph-admin from 192.168.122.108 port 41152 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 09:54:01 np0005532585.localdomain systemd-logind[761]: New session 70 of user ceph-admin.
Nov 23 09:54:01 np0005532585.localdomain systemd[1]: Started Session 70 of User ceph-admin.
Nov 23 09:54:01 np0005532585.localdomain sshd[305976]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 09:54:01 np0005532585.localdomain sudo[305980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:54:01 np0005532585.localdomain sudo[305980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:01 np0005532585.localdomain sudo[305980]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1389503190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1389503190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' 
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: Activating manager daemon np0005532586.thmvqb
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/3957521171' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: osdmap e86: 6 total, 6 up, 6 in
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: mgrmap e35: np0005532586.thmvqb(active, starting, since 0.0355882s), standbys: np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mds metadata", "who": "mds.np0005532585.jcltnl"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mds metadata", "who": "mds.np0005532584.aoxjmw"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mds metadata", "who": "mds.np0005532586.mfohsb"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mgr metadata", "who": "np0005532583.orhywt", "id": "np0005532583.orhywt"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mds metadata"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mon metadata"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: Manager daemon np0005532586.thmvqb is now available
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: removing stray HostCache host record np0005532583.localdomain.devices.0
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"}]': finished
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"}]': finished
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/mirror_snapshot_schedule"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/mirror_snapshot_schedule"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/trash_purge_schedule"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/trash_purge_schedule"} : dispatch
Nov 23 09:54:01 np0005532585.localdomain sudo[305998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:54:01 np0005532585.localdomain sudo[305998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:01 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'crash'
Nov 23 09:54:01 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:01.549+0000 7f5982561140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 09:54:01 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 09:54:01 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'dashboard'
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'devicehealth'
Nov 23 09:54:02 np0005532585.localdomain podman[306093]: 2025-11-23 09:54:02.069292176 +0000 UTC m=+0.079219966 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_BRANCH=main)
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.096+0000 7f5982561140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 09:54:02 np0005532585.localdomain podman[306093]: 2025-11-23 09:54:02.179376912 +0000 UTC m=+0.189304732 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, ceph=True, GIT_BRANCH=main, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, name=rhceph)
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]:   from numpy import show_config as show_numpy_config
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.245+0000 7f5982561140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'influx'
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.312+0000 7f5982561140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'insights'
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'iostat'
Nov 23 09:54:02 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.433+0000 7f5982561140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'k8sevents'
Nov 23 09:54:02 np0005532585.localdomain ceph-mon[300199]: mgrmap e36: np0005532586.thmvqb(active, since 1.05257s), standbys: np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:54:02 np0005532585.localdomain ceph-mon[300199]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'localpool'
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 09:54:02 np0005532585.localdomain sudo[305998]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:02 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'mirroring'
Nov 23 09:54:02 np0005532585.localdomain sudo[306214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:54:02 np0005532585.localdomain sudo[306214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:02 np0005532585.localdomain sudo[306214]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'nfs'
Nov 23 09:54:03 np0005532585.localdomain sudo[306232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:54:03 np0005532585.localdomain sudo[306232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.159+0000 7f5982561140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'orchestrator'
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.301+0000 7f5982561140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.367+0000 7f5982561140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'osd_support'
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.421+0000 7f5982561140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.487+0000 7f5982561140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'progress'
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.546+0000 7f5982561140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'prometheus'
Nov 23 09:54:03 np0005532585.localdomain sudo[306232]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:54:01] ENGINE Bus STARTING
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Serving on http://172.18.0.108:8765
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Serving on https://172.18.0.108:7150
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Bus STARTED
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Client ('172.18.0.108', 35224) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.846+0000 7f5982561140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'rbd_support'
Nov 23 09:54:03 np0005532585.localdomain sudo[306279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:54:03 np0005532585.localdomain sudo[306279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:03 np0005532585.localdomain sudo[306279]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:03 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.927+0000 7f5982561140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 09:54:03 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'restful'
Nov 23 09:54:03 np0005532585.localdomain sudo[306297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:54:03 np0005532585.localdomain sudo[306297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'rgw'
Nov 23 09:54:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.251+0000 7f5982561140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'rook'
Nov 23 09:54:04 np0005532585.localdomain sudo[306297]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:04 np0005532585.localdomain sudo[306334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:54:04 np0005532585.localdomain sudo[306334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:04 np0005532585.localdomain sudo[306334]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:04 np0005532585.localdomain sudo[306352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:54:04 np0005532585.localdomain sudo[306352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:04 np0005532585.localdomain sudo[306352]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.671+0000 7f5982561140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'selftest'
Nov 23 09:54:04 np0005532585.localdomain sudo[306370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:54:04 np0005532585.localdomain sudo[306370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:04 np0005532585.localdomain sudo[306370]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.731+0000 7f5982561140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'snap_schedule'
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'stats'
Nov 23 09:54:04 np0005532585.localdomain sudo[306406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:54:04 np0005532585.localdomain sudo[306406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:04 np0005532585.localdomain sudo[306406]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:04 np0005532585.localdomain podman[306387]: 2025-11-23 09:54:04.828455548 +0000 UTC m=+0.097071646 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'status'
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: tmp-crun.iPaacG.mount: Deactivated successfully.
Nov 23 09:54:04 np0005532585.localdomain sudo[306443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:54:04 np0005532585.localdomain podman[306389]: 2025-11-23 09:54:04.905182885 +0000 UTC m=+0.166996363 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 23 09:54:04 np0005532585.localdomain sudo[306443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:04 np0005532585.localdomain sudo[306443]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:04 np0005532585.localdomain podman[306389]: 2025-11-23 09:54:04.911991136 +0000 UTC m=+0.173804604 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 23 09:54:04 np0005532585.localdomain podman[306388]: 2025-11-23 09:54:04.920427006 +0000 UTC m=+0.189076955 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: mgrmap e37: np0005532586.thmvqb(active, since 3s), standbys: np0005532583.orhywt, np0005532584.naxwxy
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:54:04 np0005532585.localdomain podman[306388]: 2025-11-23 09:54:04.92737886 +0000 UTC m=+0.196028769 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:54:04 np0005532585.localdomain podman[306387]: 2025-11-23 09:54:04.962326078 +0000 UTC m=+0.230942156 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:54:04 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.967+0000 7f5982561140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 09:54:04 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'telegraf'
Nov 23 09:54:04 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:54:05 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.027+0000 7f5982561140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'telemetry'
Nov 23 09:54:05 np0005532585.localdomain sudo[306502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:54:05 np0005532585.localdomain sudo[306502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306502]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain sudo[306520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:54:05 np0005532585.localdomain sudo[306520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306520]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.166+0000 7f5982561140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 09:54:05 np0005532585.localdomain sudo[306538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:54:05 np0005532585.localdomain sudo[306538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306538]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain sudo[306556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:54:05 np0005532585.localdomain sudo[306556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306556]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.314+0000 7f5982561140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'volumes'
Nov 23 09:54:05 np0005532585.localdomain sudo[306574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:54:05 np0005532585.localdomain sudo[306574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306574]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:05.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:05 np0005532585.localdomain sudo[306592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:54:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:05.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:05 np0005532585.localdomain sudo[306592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306592]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.503+0000 7f5982561140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Loading python module 'zabbix'
Nov 23 09:54:05 np0005532585.localdomain sudo[306610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:54:05 np0005532585.localdomain sudo[306610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306610]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.561+0000 7f5982561140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x558ea9c4d600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Nov 23 09:54:05 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.108:6810/335107178
Nov 23 09:54:05 np0005532585.localdomain sudo[306628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:54:05 np0005532585.localdomain sudo[306628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306628]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain sudo[306662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:54:05 np0005532585.localdomain sudo[306662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306662]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain sudo[306680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:54:05 np0005532585.localdomain sudo[306680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306680]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain sudo[306698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:54:05 np0005532585.localdomain sudo[306698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306698]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:05 np0005532585.localdomain ceph-mon[300199]: Standby manager daemon np0005532585.gzafiw started
Nov 23 09:54:05 np0005532585.localdomain sudo[306716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:54:05 np0005532585.localdomain sudo[306716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:05 np0005532585.localdomain sudo[306716]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:54:06 np0005532585.localdomain sudo[306734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306734]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:54:06 np0005532585.localdomain sudo[306752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306752]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:54:06 np0005532585.localdomain sudo[306770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306770]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:06.247 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:06 np0005532585.localdomain sudo[306788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:54:06 np0005532585.localdomain sudo[306788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306788]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:54:06 np0005532585.localdomain sudo[306822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306822]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:54:06 np0005532585.localdomain sudo[306840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306840]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:54:06 np0005532585.localdomain sudo[306858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306858]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:54:06 np0005532585.localdomain sudo[306876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306876]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:54:06 np0005532585.localdomain sudo[306894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306894]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:54:06 np0005532585.localdomain sudo[306912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306912]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:54:06 np0005532585.localdomain sudo[306930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306930]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain sudo[306948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:54:06 np0005532585.localdomain sudo[306948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:06 np0005532585.localdomain sudo[306948]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: mgrmap e38: np0005532586.thmvqb(active, since 5s), standbys: np0005532583.orhywt, np0005532584.naxwxy, np0005532585.gzafiw
Nov 23 09:54:06 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:54:07 np0005532585.localdomain sudo[306982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:54:07 np0005532585.localdomain sudo[306982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:07 np0005532585.localdomain sudo[306982]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:07 np0005532585.localdomain sudo[307000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:54:07 np0005532585.localdomain sudo[307000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:07 np0005532585.localdomain sudo[307000]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:07.208 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:07 np0005532585.localdomain sudo[307018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:54:07 np0005532585.localdomain sudo[307018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:07 np0005532585.localdomain sudo[307018]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:07 np0005532585.localdomain sudo[307036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:54:07 np0005532585.localdomain sudo[307036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:07 np0005532585.localdomain sudo[307036]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4225034591' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 09:54:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.521 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.522 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.522 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:54:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:08.523 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2776620448' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2315736127' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:09.020 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:54:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:09.045 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:54:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:09.046 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:54:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:09.046 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:09.047 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:09.047 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:54:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:54:09.292 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:54:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:54:09.292 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:54:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:54:09.293 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3629266017' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)...
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:09 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 09:54:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:10.421 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:10.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.229 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.230 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.230 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.231 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.231 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1079439071' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.687 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.756 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:54:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:11.756 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)...
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 09:54:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1079439071' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:54:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:54:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:54:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:54:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:54:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18716 "" "Go-http-client/1.1"
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.007 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.008 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11713MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.008 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.008 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.076 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.077 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.077 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.100 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.133 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.134 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.155 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.197 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.246 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/285259482' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.668 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.674 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.694 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.697 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:54:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:12.697 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/285259482' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:54:12 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:13 np0005532585.localdomain sudo[307098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:54:13 np0005532585.localdomain sudo[307098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:13 np0005532585.localdomain sudo[307098]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:13.699 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:54:13 np0005532585.localdomain ceph-mon[300199]: from='client.64151 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:54:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:54:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:54:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:54:14 np0005532585.localdomain podman[307117]: 2025-11-23 09:54:14.020505339 +0000 UTC m=+0.076373837 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:54:14 np0005532585.localdomain podman[307117]: 2025-11-23 09:54:14.034581103 +0000 UTC m=+0.090449601 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:54:14 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:54:14 np0005532585.localdomain podman[307116]: 2025-11-23 09:54:14.114049975 +0000 UTC m=+0.169916603 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 23 09:54:14 np0005532585.localdomain podman[307116]: 2025-11-23 09:54:14.12816019 +0000 UTC m=+0.184026828 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true)
Nov 23 09:54:14 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:54:14 np0005532585.localdomain ceph-mon[300199]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:54:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:15.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:15 np0005532585.localdomain sudo[307156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:54:15 np0005532585.localdomain sudo[307156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:15 np0005532585.localdomain sudo[307156]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='client.54220 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: Saving service mon spec with placement label:mon
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 09:54:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:16 np0005532585.localdomain sudo[307174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:54:16 np0005532585.localdomain sudo[307174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:16 np0005532585.localdomain sudo[307174]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:16 np0005532585.localdomain sudo[307192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:54:16 np0005532585.localdomain sudo[307192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:54:16 np0005532585.localdomain podman[307225]: 
Nov 23 09:54:16 np0005532585.localdomain podman[307225]: 2025-11-23 09:54:16.987959427 +0000 UTC m=+0.075652785 container create cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, name=rhceph, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container)
Nov 23 09:54:17 np0005532585.localdomain systemd[1]: Started libpod-conmon-cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860.scope.
Nov 23 09:54:17 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:54:17 np0005532585.localdomain podman[307225]: 2025-11-23 09:54:16.957551749 +0000 UTC m=+0.045245167 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 09:54:17 np0005532585.localdomain podman[307225]: 2025-11-23 09:54:17.056902794 +0000 UTC m=+0.144596142 container init cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, release=553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 09:54:17 np0005532585.localdomain podman[307225]: 2025-11-23 09:54:17.067766939 +0000 UTC m=+0.155460287 container start cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Nov 23 09:54:17 np0005532585.localdomain podman[307225]: 2025-11-23 09:54:17.068195893 +0000 UTC m=+0.155889281 container attach cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, name=rhceph, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=)
Nov 23 09:54:17 np0005532585.localdomain determined_williamson[307240]: 167 167
Nov 23 09:54:17 np0005532585.localdomain systemd[1]: libpod-cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860.scope: Deactivated successfully.
Nov 23 09:54:17 np0005532585.localdomain podman[307225]: 2025-11-23 09:54:17.071778603 +0000 UTC m=+0.159472051 container died cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public)
Nov 23 09:54:17 np0005532585.localdomain podman[307245]: 2025-11-23 09:54:17.169306332 +0000 UTC m=+0.086213401 container remove cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 09:54:17 np0005532585.localdomain systemd[1]: libpod-conmon-cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860.scope: Deactivated successfully.
Nov 23 09:54:17 np0005532585.localdomain sudo[307192]: pam_unix(sudo:session): session closed for user root
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='client.54223 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532586", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: mgrmap e39: np0005532586.thmvqb(active, since 16s), standbys: np0005532584.naxwxy, np0005532585.gzafiw
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:54:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6ef8a841a0ee9a462d743fb83008423335eef41a1bcc46f3912cc9f5eac26fdb-merged.mount: Deactivated successfully.
Nov 23 09:54:18 np0005532585.localdomain ceph-mon[300199]: Reconfiguring mon.np0005532586 (monmap changed)...
Nov 23 09:54:18 np0005532585.localdomain ceph-mon[300199]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain
Nov 23 09:54:18 np0005532585.localdomain ceph-mon[300199]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:54:18 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:18 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:54:19 np0005532585.localdomain sshd[307262]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:54:19 np0005532585.localdomain sshd[307262]: Invalid user p from 107.172.15.139 port 44886
Nov 23 09:54:19 np0005532585.localdomain sshd[307262]: Received disconnect from 107.172.15.139 port 44886:11: Bye Bye [preauth]
Nov 23 09:54:19 np0005532585.localdomain sshd[307262]: Disconnected from invalid user p 107.172.15.139 port 44886 [preauth]
Nov 23 09:54:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:20.428 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:20 np0005532585.localdomain ceph-mon[300199]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:22 np0005532585.localdomain ceph-mon[300199]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:54:24 np0005532585.localdomain ceph-mon[300199]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:25 np0005532585.localdomain podman[307264]: 2025-11-23 09:54:25.040572013 +0000 UTC m=+0.092792934 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:54:25 np0005532585.localdomain podman[307264]: 2025-11-23 09:54:25.08132797 +0000 UTC m=+0.133548951 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:54:25 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:54:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:25.430 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:54:26 np0005532585.localdomain ceph-mon[300199]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:54:27 np0005532585.localdomain systemd[1]: tmp-crun.UfsKap.mount: Deactivated successfully.
Nov 23 09:54:27 np0005532585.localdomain podman[307282]: 2025-11-23 09:54:27.03036707 +0000 UTC m=+0.086327354 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:54:27 np0005532585.localdomain podman[307282]: 2025-11-23 09:54:27.045343162 +0000 UTC m=+0.101303456 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:54:27 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.211359) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667211401, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1666, "num_deletes": 255, "total_data_size": 5704233, "memory_usage": 5979112, "flush_reason": "Manual Compaction"}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667228512, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3362442, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15314, "largest_seqno": 16975, "table_properties": {"data_size": 3355375, "index_size": 3892, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18742, "raw_average_key_size": 22, "raw_value_size": 3339894, "raw_average_value_size": 4004, "num_data_blocks": 170, "num_entries": 834, "num_filter_entries": 834, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891633, "oldest_key_time": 1763891633, "file_creation_time": 1763891667, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 17188 microseconds, and 7932 cpu microseconds.
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.228549) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3362442 bytes OK
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.228570) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.232642) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.232670) EVENT_LOG_v1 {"time_micros": 1763891667232663, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.232693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5695784, prev total WAL file size 5696533, number of live WAL files 2.
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.234020) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373536' seq:72057594037927935, type:22 .. '6D6772737461740034303038' seq:0, type:0; will stop at (end)
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(3283KB)], [21(16MB)]
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667234084, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 20587021, "oldest_snapshot_seqno": -1}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12208 keys, 18323804 bytes, temperature: kUnknown
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667319029, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 18323804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18254647, "index_size": 37568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30533, "raw_key_size": 325777, "raw_average_key_size": 26, "raw_value_size": 18047217, "raw_average_value_size": 1478, "num_data_blocks": 1437, "num_entries": 12208, "num_filter_entries": 12208, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891667, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.319402) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 18323804 bytes
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.321362) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 241.9 rd, 215.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 16.4 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(11.6) write-amplify(5.4) OK, records in: 12736, records dropped: 528 output_compression: NoCompression
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.321391) EVENT_LOG_v1 {"time_micros": 1763891667321378, "job": 10, "event": "compaction_finished", "compaction_time_micros": 85092, "compaction_time_cpu_micros": 52007, "output_level": 6, "num_output_files": 1, "total_output_size": 18323804, "num_input_records": 12736, "num_output_records": 12208, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667322103, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667324725, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.233955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324835) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:54:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:28 np0005532585.localdomain ceph-mon[300199]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:54:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:54:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:54:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:54:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:54:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:30.432 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:54:30 np0005532585.localdomain ceph-mon[300199]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:32 np0005532585.localdomain ceph-mon[300199]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:34 np0005532585.localdomain ceph-mon[300199]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:35.435 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:54:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:35.436 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:35.437 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:54:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:35.437 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:54:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:35.437 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:54:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:35.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:54:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:54:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:54:36 np0005532585.localdomain systemd[1]: tmp-crun.2CCjJg.mount: Deactivated successfully.
Nov 23 09:54:36 np0005532585.localdomain podman[307307]: 2025-11-23 09:54:36.039997892 +0000 UTC m=+0.092923047 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 09:54:36 np0005532585.localdomain podman[307307]: 2025-11-23 09:54:36.078193581 +0000 UTC m=+0.131118666 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:54:36 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:54:36 np0005532585.localdomain podman[307305]: 2025-11-23 09:54:36.1310051 +0000 UTC m=+0.187081832 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:54:36 np0005532585.localdomain podman[307306]: 2025-11-23 09:54:36.181715685 +0000 UTC m=+0.234505786 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:54:36 np0005532585.localdomain podman[307306]: 2025-11-23 09:54:36.19129472 +0000 UTC m=+0.244084821 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:54:36 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:54:36 np0005532585.localdomain podman[307305]: 2025-11-23 09:54:36.249396413 +0000 UTC m=+0.305473165 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:54:36 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:54:36 np0005532585.localdomain ceph-mon[300199]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:38 np0005532585.localdomain ceph-mon[300199]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:40.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:54:40 np0005532585.localdomain ceph-mon[300199]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:54:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:54:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:54:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:54:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:54:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1"
Nov 23 09:54:42 np0005532585.localdomain ceph-mon[300199]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:44 np0005532585.localdomain ceph-mon[300199]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:54:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:54:45 np0005532585.localdomain podman[307368]: 2025-11-23 09:54:45.034060815 +0000 UTC m=+0.077317357 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:54:45 np0005532585.localdomain podman[307367]: 2025-11-23 09:54:45.090358771 +0000 UTC m=+0.134427358 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:54:45 np0005532585.localdomain podman[307368]: 2025-11-23 09:54:45.099161273 +0000 UTC m=+0.142417765 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:54:45 np0005532585.localdomain podman[307367]: 2025-11-23 09:54:45.106536231 +0000 UTC m=+0.150604868 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd)
Nov 23 09:54:45 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:54:45 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:54:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:45.442 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:46 np0005532585.localdomain ceph-mon[300199]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:48 np0005532585.localdomain ceph-mon[300199]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:50.444 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:50 np0005532585.localdomain ceph-mon[300199]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:52 np0005532585.localdomain ceph-mon[300199]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:54 np0005532585.localdomain ceph-mon[300199]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 23 09:54:54 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1282514036' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:54:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:54:55.448 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:54:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/1282514036' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 09:54:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:54:56 np0005532585.localdomain podman[307409]: 2025-11-23 09:54:56.039157975 +0000 UTC m=+0.097111206 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 23 09:54:56 np0005532585.localdomain podman[307409]: 2025-11-23 09:54:56.050175242 +0000 UTC m=+0.108128493 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 23 09:54:56 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:54:56 np0005532585.localdomain ceph-mon[300199]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:54:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:54:58 np0005532585.localdomain podman[307428]: 2025-11-23 09:54:58.03345967 +0000 UTC m=+0.085166631 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:54:58 np0005532585.localdomain podman[307428]: 2025-11-23 09:54:58.047325004 +0000 UTC m=+0.099031955 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:54:58 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:54:58 np0005532585.localdomain ceph-mon[300199]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:54:58 np0005532585.localdomain sshd[307451]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:54:59 np0005532585.localdomain sshd[307451]: Received disconnect from 207.154.194.2 port 57256:11: Bye Bye [preauth]
Nov 23 09:54:59 np0005532585.localdomain sshd[307451]: Disconnected from authenticating user root 207.154.194.2 port 57256 [preauth]
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:54:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:54:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:55:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:00.449 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:55:00 np0005532585.localdomain ceph-mon[300199]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1481934905' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:55:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1481934905' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:55:01 np0005532585.localdomain ceph-mon[300199]: from='client.54247 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.248088) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702248128, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 637, "num_deletes": 256, "total_data_size": 552338, "memory_usage": 564184, "flush_reason": "Manual Compaction"}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702254044, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 353856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16980, "largest_seqno": 17612, "table_properties": {"data_size": 350917, "index_size": 922, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6845, "raw_average_key_size": 18, "raw_value_size": 344937, "raw_average_value_size": 927, "num_data_blocks": 41, "num_entries": 372, "num_filter_entries": 372, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891667, "oldest_key_time": 1763891667, "file_creation_time": 1763891702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 6009 microseconds, and 2283 cpu microseconds.
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.254096) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 353856 bytes OK
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.254117) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257195) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257218) EVENT_LOG_v1 {"time_micros": 1763891702257211, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257237) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 548787, prev total WAL file size 549111, number of live WAL files 2.
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.258040) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373733' seq:72057594037927935, type:22 .. '6C6F676D0034303235' seq:0, type:0; will stop at (end)
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(345KB)], [24(17MB)]
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702258141, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 18677660, "oldest_snapshot_seqno": -1}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12055 keys, 18578998 bytes, temperature: kUnknown
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702347251, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 18578998, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18509741, "index_size": 38052, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323539, "raw_average_key_size": 26, "raw_value_size": 18303804, "raw_average_value_size": 1518, "num_data_blocks": 1455, "num_entries": 12055, "num_filter_entries": 12055, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.347579) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 18578998 bytes
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.349749) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 209.4 rd, 208.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 17.5 +0.0 blob) out(17.7 +0.0 blob), read-write-amplify(105.3) write-amplify(52.5) OK, records in: 12580, records dropped: 525 output_compression: NoCompression
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.349779) EVENT_LOG_v1 {"time_micros": 1763891702349766, "job": 12, "event": "compaction_finished", "compaction_time_micros": 89184, "compaction_time_cpu_micros": 49321, "output_level": 6, "num_output_files": 1, "total_output_size": 18578998, "num_input_records": 12580, "num_output_records": 12055, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702349987, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702352404, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:04 np0005532585.localdomain ceph-mon[300199]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:05.452 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:06.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:06 np0005532585.localdomain ceph-mon[300199]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:55:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:55:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:55:07 np0005532585.localdomain podman[307453]: 2025-11-23 09:55:07.031214495 +0000 UTC m=+0.083379976 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 23 09:55:07 np0005532585.localdomain systemd[1]: tmp-crun.Wzehwb.mount: Deactivated successfully.
Nov 23 09:55:07 np0005532585.localdomain podman[307455]: 2025-11-23 09:55:07.098878661 +0000 UTC m=+0.144382658 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm)
Nov 23 09:55:07 np0005532585.localdomain podman[307455]: 2025-11-23 09:55:07.111132985 +0000 UTC m=+0.156636972 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, 
url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 23 09:55:07 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:55:07 np0005532585.localdomain podman[307453]: 2025-11-23 09:55:07.128078862 +0000 UTC m=+0.180244303 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 23 09:55:07 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:55:07 np0005532585.localdomain podman[307454]: 2025-11-23 09:55:07.181982348 +0000 UTC m=+0.230690744 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 09:55:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:07.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:07 np0005532585.localdomain podman[307454]: 2025-11-23 09:55:07.210779847 +0000 UTC m=+0.259488243 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 23 09:55:07 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:55:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Nov 23 09:55:07 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3373261952' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Nov 23 09:55:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:08.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:55:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/3373261952' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Nov 23 09:55:08 np0005532585.localdomain ceph-mon[300199]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1696104249' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:09.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1663917042' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2747986739' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:55:09.293 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:55:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:55:09.294 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:55:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:55:09.294 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:55:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2084422730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:10 np0005532585.localdomain ceph-mon[300199]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.454 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.551 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.552 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.552 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:55:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:10.553 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.810 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.811 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.826 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.827 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '461bb781-f9b6-434c-bdf2-2c827cf6f236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.812103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ff75bac-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': 'f84887a25236d38fb7de0f981be81343c8558a1303a98ef082786858d20eab62'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.812103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ff777ae-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '8e9611ef5810158105432a388a73212eda1bfa78508e457cb89afec1c05e5ad5'}]}, 'timestamp': '2025-11-23 09:55:10.828590', '_unique_id': '3cc11a1d5d4d49638e049331a9cdde1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.837 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34a252d6-355d-48d8-ac2f-ae68693362ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.832844', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '7ff8fe80-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '598b50ca2630bd815a021ae52fba1ebc367a08e76ec46ea5a30522ff0e2a4b40'}]}, 'timestamp': '2025-11-23 09:55:10.838631', '_unique_id': '8c2e1d1fa30f445ca8a61a23385b574f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3899128-c954-4aa3-99c5-ad1c37a495c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.841880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ffe4606-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '4d6f467d8c67d1bea9819ae4b9481a519b56cf41a716da1612bfb3f61d11f2ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.841880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ffe58ee-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'a16d72daa859c35470316b0f443f410614435cd2258b794c6100e7cc9d6638ba'}]}, 'timestamp': '2025-11-23 09:55:10.873664', '_unique_id': '40189cabbf1940daadee49af75842e58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c4038e0-79f8-4a29-b2c7-0693a1321e43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.876680', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '7ffee30e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '395f3356e5c33b155a5d5a5f1497582977d9d49fcfbe818d50ab0ede2ac132aa'}]}, 'timestamp': '2025-11-23 09:55:10.877200', '_unique_id': '9531847849064bdf9802a69deca004a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e04464d-9419-4246-80f4-fdfdeeeb7730', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.879538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7fff50f0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'b69935fd959733e0246db9c4914b8f73d0369b1651a722433558d3fa470a98ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.879538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7fff6234-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '899acac412ca443269aa5c213106b7008f58633f6e5d7103738ae73eea3ad532'}]}, 'timestamp': '2025-11-23 09:55:10.880413', '_unique_id': 'd4d596de2e7a45c6afff00b569719dbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c82e02d-b15b-41ec-a9b3-28fa7bbff000', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.882781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7fffd034-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'b92cb3120b1ca125b0418081139e938afe3b13b200711748a5849912a5be2c84'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.882781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7fffdfc0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '80cb80ea9675d074bfbb4277197af0fdc72823d3af87e8d96cab6b02812c08dc'}]}, 'timestamp': '2025-11-23 09:55:10.883623', '_unique_id': '34808c7f1c3a4554864de3b3ed581a66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1b18c5d-64f4-4f22-b10e-203839bcff08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.885800', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '80004622-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '7ff59900dda9a6de6ab19d3057cfebe612762f446e743efcb65fe1411f1e312e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.885800', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8000559a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '1eb11d59957b81bc6831bd0e276aa628fe656f5003cda6b2ae45a1ec04fa9917'}]}, 'timestamp': '2025-11-23 09:55:10.886635', '_unique_id': '8b02285fb2a641698168cb1ca8bc19e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '719860d8-13dd-4c1f-8e12-c7d91a1497ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.888958', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8000c08e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': 'eabfc646825daa9caad977557b7892cb49a7c8d296214cc37e17f07c38be8049'}]}, 'timestamp': '2025-11-23 09:55:10.889408', '_unique_id': 'c8170143589943a49e8a58c68b7b91de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e42eb5e-c21b-4638-92bf-4620c0b5d1c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.891485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '800122a4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '385c7a99f94c8e713637f12eb17a38048755d78444501dd38fd98e01827e012c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.891485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8001333e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '57e7b57841082ed6c842b33d31ba38df9af571e888ff14406a9c968372ac3dcb'}]}, 'timestamp': '2025-11-23 09:55:10.892311', '_unique_id': 'a8cbcc9cb8cc43478114a7d8486f7a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '184f8b34-67eb-4f73-8e99-eb18b96a8d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.894636', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8001a35a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '7abfc725a105fbdabb920837b700fccc3f9843f2257b06f85ac2d68b1322ae6e'}]}, 'timestamp': '2025-11-23 09:55:10.895288', '_unique_id': '311195af1dea41b68d4cb9e0f45d6c0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 14710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1455f2f3-8980-4a0f-8dba-d1defc9c4480', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14710000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:55:10.897729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8004a7ee-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.091759399, 'message_signature': '24273712293c572f6644f815065cb9cb2a1b981779f78cd277c8599baa56dd85'}]}, 'timestamp': '2025-11-23 09:55:10.915126', '_unique_id': '009b091f9a4d4e40ae9577fcaccb2da0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f040e79e-75b7-426b-8d56-88b1ee6d8ae8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.918112', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '80053af6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'e8b11e2bd789c8285d2fc6dbd842111c3202814d1552a4e7cac20631447040ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.918112', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80054ef6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '18a7b913037ab2c1e36fdcace057519e6b3a2d0f28e0f7d8adf8c25cd9ff8c5b'}]}, 'timestamp': '2025-11-23 09:55:10.919245', '_unique_id': '1b9cab1985714b58b1cf15b25e4a19f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd3c381f-7e7c-4b2c-92dd-16cb3925a96e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.921593', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8005bbd4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': 'a38f5baa4b1ece17e7ea45a34a1c9db82113219c21b47b4d34732405c13e87b3'}]}, 'timestamp': '2025-11-23 09:55:10.922087', '_unique_id': 'e91e1bd4b54240f1a179a65c60e01e4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c40d189f-985a-40e0-83d1-c047cb28f92a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.924223', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '80062254-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '1c92ec8f6c11eeddbe13aef1198d1a926f44f026c2d9ec1023fa94927180a1a0'}]}, 'timestamp': '2025-11-23 09:55:10.924676', '_unique_id': 'ff63f47aabf24923a56a3124f4a338d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c80c572-4a41-4588-9846-1d155790a092', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.926770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8006873a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '79e492b1420ae37f46b311c98370e8b6ad7ffd63e9d3b637a19fa28a530633f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.926770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80069900-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '722e3627b67b49896fe35a925cdb0e1513d501f7439899f9c633ed349c88c6c6'}]}, 'timestamp': '2025-11-23 09:55:10.927694', '_unique_id': 'cba96551f4ce459a8c9f1441ffc1d5ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8f2e85a-323b-4cfd-aff6-a047d550abfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.930065', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '800709c6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': 'b7d42dac3d68fd4daaf247634466a145af0b18f6902767c09646ee0d2ccf993b'}]}, 'timestamp': '2025-11-23 09:55:10.930651', '_unique_id': 'c5ddd3d033d74ab58171cec8ad834260'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fe48602-7a45-4110-98c0-0102690e1c14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:55:10.932823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8007767c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.091759399, 'message_signature': '780232b365cb150951fc3b9c6f802b7aaa54a73dcea8ffb2f9567acc5dcf6f55'}]}, 'timestamp': '2025-11-23 09:55:10.933448', '_unique_id': '6f6a0fab7b154176b33935662afd3b69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b54372d9-3976-489c-8cfc-05a03dbea9fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.935876', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8007ee4a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '998687858efaf0a125b9f253706ec4313fc913817cb69424b6cfd5f2301233bd'}]}, 'timestamp': '2025-11-23 09:55:10.936459', '_unique_id': '5967c00af50143b194e9e39451a3f9e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.939 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aad9812-4cf8-4a4d-9a84-a9320043aa53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.938969', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '800865b4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '06ef85120bfadd16b1dc4f84f67a1326b0aed6cdeabff34e8abc99edbeae73f6'}]}, 'timestamp': '2025-11-23 09:55:10.939558', '_unique_id': 'c6745011888e4ed6adde6aa64ddc4289'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.941 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5890a598-fbaa-4333-8bdd-67bad04f4388', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.941821', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8008d67a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '30ecf30e965ad68ce593f9e458e14fe8859664fe8ba6e2e04e02834049b85fb8'}]}, 'timestamp': '2025-11-23 09:55:10.942479', '_unique_id': 'f3237ba01a2646c28903efefc6c656a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.944 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.945 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '891913eb-9aa2-4340-8bf3-8f669c72ab53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.944649', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '800943ee-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '11b15efb534a4aa162fb2b516b6d3414bd598984c9d93cec42123bc92285ef09'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.944649', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80095848-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'f00ec1cf9a88335fcc99692fc4f4d8129975a4a92f28148c81edcb3891985c5e'}]}, 'timestamp': '2025-11-23 09:55:10.945707', '_unique_id': '57523b6a4bef47d5bc34af1e5eb110eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:55:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:55:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:11.049 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:55:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:11.086 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:55:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:11.086 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:55:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:55:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:55:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:55:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:55:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:55:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18715 "" "Go-http-client/1.1"
Nov 23 09:55:12 np0005532585.localdomain ceph-mon[300199]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:12.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.231 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.231 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.232 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.232 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.232 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:55:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:55:13 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/466749514' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.688 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.764 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.764 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.966 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.968 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11701MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.969 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:55:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:13.969 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.069 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.070 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.070 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.122 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:55:14 np0005532585.localdomain ceph-mon[300199]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/466749514' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:14 np0005532585.localdomain ceph-mon[300199]: from='client.64199 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 09:55:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:55:14 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1799410222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.626 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.631 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.646 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.647 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:55:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:14.648 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:55:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:15.457 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1799410222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:55:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:55:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:55:16 np0005532585.localdomain systemd[1]: tmp-crun.55sCPn.mount: Deactivated successfully.
Nov 23 09:55:16 np0005532585.localdomain podman[307560]: 2025-11-23 09:55:16.034638103 +0000 UTC m=+0.085453671 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:55:16 np0005532585.localdomain podman[307559]: 2025-11-23 09:55:16.082578956 +0000 UTC m=+0.135780667 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:55:16 np0005532585.localdomain podman[307560]: 2025-11-23 09:55:16.099697169 +0000 UTC m=+0.150512787 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:55:16 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:55:16 np0005532585.localdomain podman[307559]: 2025-11-23 09:55:16.122470164 +0000 UTC m=+0.175671825 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 23 09:55:16 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:55:16 np0005532585.localdomain ceph-mon[300199]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:18 np0005532585.localdomain sudo[307601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:55:18 np0005532585.localdomain sudo[307601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:18 np0005532585.localdomain sudo[307601]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:18 np0005532585.localdomain ceph-mon[300199]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:18 np0005532585.localdomain sudo[307619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:55:18 np0005532585.localdomain sudo[307619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:18.644 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:55:18 np0005532585.localdomain sudo[307619]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:19 np0005532585.localdomain sudo[307670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:55:19 np0005532585.localdomain sudo[307670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:19 np0005532585.localdomain sudo[307670]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:55:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:55:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:55:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:55:20 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 09:55:20 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1186317306' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:55:20 np0005532585.localdomain ceph-mon[300199]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/1186317306' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:55:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:20.461 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' 
Nov 23 09:55:22 np0005532585.localdomain ceph-mon[300199]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1136241170' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 e87: 6 total, 6 up, 6 in
Nov 23 09:55:23 np0005532585.localdomain sshd[305976]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 23 09:55:23 np0005532585.localdomain systemd[1]: session-70.scope: Deactivated successfully.
Nov 23 09:55:23 np0005532585.localdomain systemd[1]: session-70.scope: Consumed 7.453s CPU time.
Nov 23 09:55:23 np0005532585.localdomain systemd-logind[761]: Session 70 logged out. Waiting for processes to exit.
Nov 23 09:55:23 np0005532585.localdomain systemd-logind[761]: Removed session 70.
Nov 23 09:55:23 np0005532585.localdomain sshd[307688]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:55:23 np0005532585.localdomain sshd[307688]: Accepted publickey for ceph-admin from 192.168.122.106 port 43386 ssh2: RSA SHA256:J0VOv54ngvc48ItLlXIT/aOAFSV4K46aP9YyQF5FIQo
Nov 23 09:55:23 np0005532585.localdomain systemd-logind[761]: New session 71 of user ceph-admin.
Nov 23 09:55:23 np0005532585.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Nov 23 09:55:23 np0005532585.localdomain sshd[307688]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 23 09:55:23 np0005532585.localdomain sudo[307692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:55:23 np0005532585.localdomain sudo[307692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:23 np0005532585.localdomain sudo[307692]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:23 np0005532585.localdomain sudo[307710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 09:55:23 np0005532585.localdomain sudo[307710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.200:0/1136241170' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: Activating manager daemon np0005532584.naxwxy
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: osdmap e87: 6 total, 6 up, 6 in
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: mgrmap e40: np0005532584.naxwxy(active, starting, since 0.0272642s), standbys: np0005532585.gzafiw
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata", "who": "mds.np0005532585.jcltnl"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata", "who": "mds.np0005532584.aoxjmw"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata", "who": "mds.np0005532586.mfohsb"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532584.naxwxy", "id": "np0005532584.naxwxy"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532585.gzafiw", "id": "np0005532585.gzafiw"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mds metadata"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mon metadata"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: Manager daemon np0005532584.naxwxy is now available
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/mirror_snapshot_schedule"} : dispatch
Nov 23 09:55:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/trash_purge_schedule"} : dispatch
Nov 23 09:55:24 np0005532585.localdomain podman[307800]: 2025-11-23 09:55:24.719879457 +0000 UTC m=+0.094282240 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vcs-type=git, release=553, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=)
Nov 23 09:55:24 np0005532585.localdomain podman[307800]: 2025-11-23 09:55:24.827583765 +0000 UTC m=+0.201986488 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 09:55:25 np0005532585.localdomain ceph-mon[300199]: mgrmap e41: np0005532584.naxwxy(active, since 1.05703s), standbys: np0005532585.gzafiw
Nov 23 09:55:25 np0005532585.localdomain ceph-mon[300199]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:25 np0005532585.localdomain sudo[307710]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:25.464 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:25.470 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:25 np0005532585.localdomain sudo[307915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:55:25 np0005532585.localdomain sudo[307915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:25 np0005532585.localdomain sudo[307915]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:25 np0005532585.localdomain sudo[307933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:55:25 np0005532585.localdomain sudo[307933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:26 np0005532585.localdomain sudo[307933]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Bus STARTING
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Serving on https://172.18.0.106:7150
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Client ('172.18.0.106', 60482) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: Cluster is now healthy
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Serving on http://172.18.0.106:8765
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Bus STARTED
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:26 np0005532585.localdomain sudo[307982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:55:26 np0005532585.localdomain sudo[307982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:26 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:55:26 np0005532585.localdomain sudo[307982]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:26 np0005532585.localdomain sudo[308001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 23 09:55:26 np0005532585.localdomain sudo[308001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:26 np0005532585.localdomain podman[308000]: 2025-11-23 09:55:26.507019837 +0000 UTC m=+0.102596314 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm)
Nov 23 09:55:26 np0005532585.localdomain podman[308000]: 2025-11-23 09:55:26.52416991 +0000 UTC m=+0.119746417 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 09:55:26 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:55:26 np0005532585.localdomain sudo[308001]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:26 np0005532585.localdomain sudo[308056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:55:26 np0005532585.localdomain sudo[308056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:26 np0005532585.localdomain sudo[308056]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:55:27 np0005532585.localdomain sudo[308074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308074]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:55:27 np0005532585.localdomain sudo[308092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308092]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:55:27 np0005532585.localdomain sudo[308110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308110]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:55:27 np0005532585.localdomain sudo[308128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308128]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.283829) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727283871, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 767, "num_deletes": 258, "total_data_size": 2031449, "memory_usage": 2157712, "flush_reason": "Manual Compaction"}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727292105, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1315966, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17617, "largest_seqno": 18379, "table_properties": {"data_size": 1312190, "index_size": 1503, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 8763, "raw_average_key_size": 19, "raw_value_size": 1304214, "raw_average_value_size": 2829, "num_data_blocks": 62, "num_entries": 461, "num_filter_entries": 461, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891702, "oldest_key_time": 1763891702, "file_creation_time": 1763891727, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 8332 microseconds, and 3930 cpu microseconds.
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.292157) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1315966 bytes OK
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.292180) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.293958) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.293983) EVENT_LOG_v1 {"time_micros": 1763891727293975, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.294005) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2027190, prev total WAL file size 2027190, number of live WAL files 2.
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.294660) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353237' seq:72057594037927935, type:22 .. '6B760031373835' seq:0, type:0; will stop at (end)
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1285KB)], [27(17MB)]
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727294700, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 19894964, "oldest_snapshot_seqno": -1}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11969 keys, 18720388 bytes, temperature: kUnknown
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727372765, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 18720388, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18652335, "index_size": 37040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29957, "raw_key_size": 323417, "raw_average_key_size": 27, "raw_value_size": 18448355, "raw_average_value_size": 1541, "num_data_blocks": 1396, "num_entries": 11969, "num_filter_entries": 11969, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891727, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.373428) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 18720388 bytes
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.375734) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 254.3 rd, 239.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 17.7 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(29.3) write-amplify(14.2) OK, records in: 12516, records dropped: 547 output_compression: NoCompression
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.375820) EVENT_LOG_v1 {"time_micros": 1763891727375800, "job": 14, "event": "compaction_finished", "compaction_time_micros": 78246, "compaction_time_cpu_micros": 46871, "output_level": 6, "num_output_files": 1, "total_output_size": 18720388, "num_input_records": 12516, "num_output_records": 11969, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727376434, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727379866, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.294609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:27 np0005532585.localdomain sudo[308162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:55:27 np0005532585.localdomain sudo[308162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308162]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new
Nov 23 09:55:27 np0005532585.localdomain sudo[308180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308180]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: mgrmap e42: np0005532584.naxwxy(active, since 3s), standbys: np0005532585.gzafiw
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:55:27 np0005532585.localdomain sudo[308198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 23 09:55:27 np0005532585.localdomain sudo[308198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308198]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:55:27 np0005532585.localdomain sudo[308216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308216]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:55:27 np0005532585.localdomain sudo[308234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308234]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:55:27 np0005532585.localdomain sudo[308252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308252]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain sudo[308270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:55:27 np0005532585.localdomain sudo[308270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308270]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:27 np0005532585.localdomain sudo[308288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:55:27 np0005532585.localdomain sudo[308288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:27 np0005532585.localdomain sudo[308288]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:55:28 np0005532585.localdomain sudo[308322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308322]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new
Nov 23 09:55:28 np0005532585.localdomain sudo[308340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:55:28 np0005532585.localdomain sudo[308340]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain sudo[308364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308364]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain podman[308357]: 2025-11-23 09:55:28.244296615 +0000 UTC m=+0.082891281 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:55:28 np0005532585.localdomain podman[308357]: 2025-11-23 09:55:28.283315496 +0000 UTC m=+0.121910192 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:55:28 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:55:28 np0005532585.localdomain sudo[308391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 23 09:55:28 np0005532585.localdomain sudo[308391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308391]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph
Nov 23 09:55:28 np0005532585.localdomain sudo[308416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308416]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:55:28 np0005532585.localdomain sudo[308434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308434]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 09:55:28 np0005532585.localdomain ceph-mon[300199]: Standby manager daemon np0005532586.thmvqb started
Nov 23 09:55:28 np0005532585.localdomain sudo[308452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:55:28 np0005532585.localdomain sudo[308452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308452]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:55:28 np0005532585.localdomain sudo[308470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308470]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:55:28 np0005532585.localdomain sudo[308504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308504]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new
Nov 23 09:55:28 np0005532585.localdomain sudo[308522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308522]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 23 09:55:28 np0005532585.localdomain sudo[308540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308540]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:28 np0005532585.localdomain sudo[308558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:55:28 np0005532585.localdomain sudo[308558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:28 np0005532585.localdomain sudo[308558]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain sudo[308576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config
Nov 23 09:55:29 np0005532585.localdomain sudo[308576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308576]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain sudo[308594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:55:29 np0005532585.localdomain sudo[308594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308594]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain sudo[308612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b
Nov 23 09:55:29 np0005532585.localdomain sudo[308612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308612]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain sudo[308630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:55:29 np0005532585.localdomain sudo[308630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308630]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain sudo[308664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:55:29 np0005532585.localdomain sudo[308664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308664]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain sudo[308682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new
Nov 23 09:55:29 np0005532585.localdomain sudo[308682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308682]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: mgrmap e43: np0005532584.naxwxy(active, since 5s), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "mgr metadata", "who": "np0005532586.thmvqb", "id": "np0005532586.thmvqb"} : dispatch
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:29 np0005532585.localdomain sudo[308700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-46550e70-79cb-5f55-bf6d-1204b97e083b/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring.new /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:55:29 np0005532585.localdomain sudo[308700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308700]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.687100) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729687132, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 374, "num_deletes": 251, "total_data_size": 643977, "memory_usage": 651848, "flush_reason": "Manual Compaction"}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729690995, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 427045, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18384, "largest_seqno": 18753, "table_properties": {"data_size": 424623, "index_size": 533, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6885, "raw_average_key_size": 21, "raw_value_size": 419482, "raw_average_value_size": 1282, "num_data_blocks": 20, "num_entries": 327, "num_filter_entries": 327, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891727, "oldest_key_time": 1763891727, "file_creation_time": 1763891729, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 3945 microseconds, and 1628 cpu microseconds.
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.691042) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 427045 bytes OK
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.691064) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.694009) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.694033) EVENT_LOG_v1 {"time_micros": 1763891729694026, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.694054) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 641384, prev total WAL file size 657562, number of live WAL files 2.
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.695610) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(417KB)], [30(17MB)]
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729695679, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 19147433, "oldest_snapshot_seqno": -1}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11777 keys, 16438704 bytes, temperature: kUnknown
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729771725, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 16438704, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16374064, "index_size": 34075, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29509, "raw_key_size": 320096, "raw_average_key_size": 27, "raw_value_size": 16175487, "raw_average_value_size": 1373, "num_data_blocks": 1268, "num_entries": 11777, "num_filter_entries": 11777, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891729, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.772094) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 16438704 bytes
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.773869) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 251.5 rd, 215.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 17.9 +0.0 blob) out(15.7 +0.0 blob), read-write-amplify(83.3) write-amplify(38.5) OK, records in: 12296, records dropped: 519 output_compression: NoCompression
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.773952) EVENT_LOG_v1 {"time_micros": 1763891729773938, "job": 16, "event": "compaction_finished", "compaction_time_micros": 76136, "compaction_time_cpu_micros": 47638, "output_level": 6, "num_output_files": 1, "total_output_size": 16438704, "num_input_records": 12296, "num_output_records": 11777, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729774155, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729776660, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.695464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776705) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:29 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:55:29 np0005532585.localdomain sudo[308718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:55:29 np0005532585.localdomain sudo[308718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:29 np0005532585.localdomain sudo[308718]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:55:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:55:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:55:30 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:30 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:55:30 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:55:30 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:30 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:55:30 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:55:30 np0005532585.localdomain sudo[308736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:55:30 np0005532585.localdomain sudo[308736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:55:30 np0005532585.localdomain sudo[308736]: pam_unix(sudo:session): session closed for user root
Nov 23 09:55:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:30.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:30.471 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:30 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:55:32 np0005532585.localdomain ceph-mon[300199]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 23 09:55:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:34 np0005532585.localdomain ceph-mon[300199]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 23 09:55:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:55:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:35.470 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:36 np0005532585.localdomain ceph-mon[300199]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 23 09:55:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:55:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:55:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:55:38 np0005532585.localdomain podman[308755]: 2025-11-23 09:55:38.037962479 +0000 UTC m=+0.088874635 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:55:38 np0005532585.localdomain podman[308755]: 2025-11-23 09:55:38.045230001 +0000 UTC m=+0.096142127 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:55:38 np0005532585.localdomain podman[308756]: 2025-11-23 09:55:38.014041639 +0000 UTC m=+0.065915594 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Nov 23 09:55:38 np0005532585.localdomain podman[308754]: 2025-11-23 09:55:38.081573491 +0000 UTC m=+0.133228639 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:55:38 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:55:38 np0005532585.localdomain podman[308756]: 2025-11-23 09:55:38.147706909 +0000 UTC m=+0.199580884 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 23 09:55:38 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:55:38 np0005532585.localdomain podman[308754]: 2025-11-23 09:55:38.187256027 +0000 UTC m=+0.238911195 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:55:38 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:55:38 np0005532585.localdomain ceph-mon[300199]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:55:40 np0005532585.localdomain ceph-mon[300199]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:55:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:40.473 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:55:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:55:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:55:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:55:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:55:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18725 "" "Go-http-client/1.1"
Nov 23 09:55:41 np0005532585.localdomain sshd[308817]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:55:42 np0005532585.localdomain ceph-mon[300199]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 23 09:55:42 np0005532585.localdomain sshd[308817]: Invalid user embedded from 107.172.15.139 port 49818
Nov 23 09:55:42 np0005532585.localdomain sshd[308817]: Received disconnect from 107.172.15.139 port 49818:11: Bye Bye [preauth]
Nov 23 09:55:42 np0005532585.localdomain sshd[308817]: Disconnected from invalid user embedded 107.172.15.139 port 49818 [preauth]
Nov 23 09:55:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:44 np0005532585.localdomain ceph-mon[300199]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:45.476 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:46 np0005532585.localdomain ceph-mon[300199]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:55:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:55:47 np0005532585.localdomain podman[308819]: 2025-11-23 09:55:47.023854812 +0000 UTC m=+0.077167317 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:55:47 np0005532585.localdomain podman[308819]: 2025-11-23 09:55:47.034061234 +0000 UTC m=+0.087373679 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 09:55:47 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:55:47 np0005532585.localdomain systemd[1]: tmp-crun.m9U4ra.mount: Deactivated successfully.
Nov 23 09:55:47 np0005532585.localdomain podman[308820]: 2025-11-23 09:55:47.083630407 +0000 UTC m=+0.132832016 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:55:47 np0005532585.localdomain podman[308820]: 2025-11-23 09:55:47.09194699 +0000 UTC m=+0.141148649 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:55:47 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:55:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:48 np0005532585.localdomain ceph-mon[300199]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:50.478 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:50 np0005532585.localdomain ceph-mon[300199]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:52 np0005532585.localdomain ceph-mon[300199]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:54 np0005532585.localdomain ceph-mon[300199]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:55:55.481 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:55:56 np0005532585.localdomain ceph-mon[300199]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:55:57 np0005532585.localdomain systemd[1]: tmp-crun.sxjkl3.mount: Deactivated successfully.
Nov 23 09:55:57 np0005532585.localdomain podman[308861]: 2025-11-23 09:55:57.015858521 +0000 UTC m=+0.077326642 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:55:57 np0005532585.localdomain podman[308861]: 2025-11-23 09:55:57.027507346 +0000 UTC m=+0.088975407 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:55:57 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:55:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:55:58 np0005532585.localdomain ceph-mon[300199]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:55:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:55:59 np0005532585.localdomain podman[308880]: 2025-11-23 09:55:59.003865013 +0000 UTC m=+0.060914830 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:55:59 np0005532585.localdomain podman[308880]: 2025-11-23 09:55:59.03618314 +0000 UTC m=+0.093232897 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:55:59 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:55:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:55:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:55:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:55:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:55:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:55:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1817100876' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1817100876' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:56:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:00.484 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1817100876' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:56:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1817100876' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:56:02 np0005532585.localdomain ceph-mon[300199]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:04 np0005532585.localdomain sshd[308904]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:56:04 np0005532585.localdomain ceph-mon[300199]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:05 np0005532585.localdomain sshd[308904]: Received disconnect from 207.154.194.2 port 37176:11: Bye Bye [preauth]
Nov 23 09:56:05 np0005532585.localdomain sshd[308904]: Disconnected from authenticating user root 207.154.194.2 port 37176 [preauth]
Nov 23 09:56:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:05.486 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:06 np0005532585.localdomain ceph-mon[300199]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:07.229 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:08 np0005532585.localdomain ceph-mon[300199]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:56:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:56:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:56:09 np0005532585.localdomain systemd[1]: tmp-crun.3wPmKs.mount: Deactivated successfully.
Nov 23 09:56:09 np0005532585.localdomain podman[308906]: 2025-11-23 09:56:09.02106616 +0000 UTC m=+0.078843818 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 09:56:09 np0005532585.localdomain podman[308907]: 2025-11-23 09:56:09.075827752 +0000 UTC m=+0.129944759 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 09:56:09 np0005532585.localdomain podman[308907]: 2025-11-23 09:56:09.10916601 +0000 UTC m=+0.163283017 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:56:09 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:56:09 np0005532585.localdomain podman[308906]: 2025-11-23 09:56:09.159195486 +0000 UTC m=+0.216973224 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:56:09 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:56:09 np0005532585.localdomain podman[308908]: 2025-11-23 09:56:09.235705483 +0000 UTC m=+0.286652763 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 09:56:09 np0005532585.localdomain podman[308908]: 2025-11-23 09:56:09.251454083 +0000 UTC m=+0.302401413 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 23 09:56:09 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:56:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:56:09.295 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:56:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:56:09.295 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:56:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:56:09.296 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:56:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3355387136' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1932412365' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:10 np0005532585.localdomain systemd[1]: tmp-crun.tEtryB.mount: Deactivated successfully.
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.487 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.604 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.604 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.605 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:56:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:10.605 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:56:10 np0005532585.localdomain ceph-mon[300199]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/68488798' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4143276001' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:11.007 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:56:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:11.021 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:56:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:11.022 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:56:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:11.023 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:11.023 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:56:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:11.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:56:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:56:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:56:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:56:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:56:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18723 "" "Go-http-client/1.1"
Nov 23 09:56:12 np0005532585.localdomain ceph-mon[300199]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.236 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.236 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:56:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:56:13 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1193741643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.709 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.757 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.759 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:56:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1193741643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.947 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.948 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11697MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.948 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.949 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.998 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.998 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:56:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:13.998 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:56:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:14.037 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:56:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:56:14 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2435005534' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:14.479 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:56:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:14.486 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:56:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:14.506 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:56:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:14.508 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:56:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:14.509 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:56:14 np0005532585.localdomain ceph-mon[300199]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2435005534' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:56:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:15.489 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:15.508 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:56:16 np0005532585.localdomain ceph-mon[300199]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:56:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:56:18 np0005532585.localdomain podman[309010]: 2025-11-23 09:56:18.021029182 +0000 UTC m=+0.073663240 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:56:18 np0005532585.localdomain podman[309010]: 2025-11-23 09:56:18.037069692 +0000 UTC m=+0.089703750 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, container_name=multipathd, managed_by=edpm_ansible)
Nov 23 09:56:18 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:56:18 np0005532585.localdomain podman[309011]: 2025-11-23 09:56:18.131832315 +0000 UTC m=+0.182196824 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:56:18 np0005532585.localdomain podman[309011]: 2025-11-23 09:56:18.144474171 +0000 UTC m=+0.194838750 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:56:18 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:56:18 np0005532585.localdomain ceph-mon[300199]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:20 np0005532585.localdomain ceph-mon[300199]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:20.491 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:22 np0005532585.localdomain ceph-mon[300199]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:24 np0005532585.localdomain ceph-mon[300199]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:25.493 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:26 np0005532585.localdomain ceph-mon[300199]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:56:28 np0005532585.localdomain systemd[1]: tmp-crun.zRNKzg.mount: Deactivated successfully.
Nov 23 09:56:28 np0005532585.localdomain podman[309050]: 2025-11-23 09:56:28.021917583 +0000 UTC m=+0.083453769 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:56:28 np0005532585.localdomain podman[309050]: 2025-11-23 09:56:28.030084962 +0000 UTC m=+0.091621158 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 09:56:28 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:56:28 np0005532585.localdomain ceph-mon[300199]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:56:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:56:30 np0005532585.localdomain systemd[1]: tmp-crun.kq3e1B.mount: Deactivated successfully.
Nov 23 09:56:30 np0005532585.localdomain podman[309069]: 2025-11-23 09:56:30.060695176 +0000 UTC m=+0.117707045 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:56:30 np0005532585.localdomain podman[309069]: 2025-11-23 09:56:30.097414177 +0000 UTC m=+0.154426026 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:56:30 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:56:30 np0005532585.localdomain sudo[309091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:56:30 np0005532585.localdomain sudo[309091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:56:30 np0005532585.localdomain sudo[309091]: pam_unix(sudo:session): session closed for user root
Nov 23 09:56:30 np0005532585.localdomain sudo[309109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 09:56:30 np0005532585.localdomain sudo[309109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:56:30 np0005532585.localdomain ceph-mon[300199]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:30.495 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:30 np0005532585.localdomain sudo[309109]: pam_unix(sudo:session): session closed for user root
Nov 23 09:56:30 np0005532585.localdomain sudo[309147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:56:30 np0005532585.localdomain sudo[309147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:56:30 np0005532585.localdomain sudo[309147]: pam_unix(sudo:session): session closed for user root
Nov 23 09:56:31 np0005532585.localdomain sudo[309165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:56:31 np0005532585.localdomain sudo[309165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:56:31 np0005532585.localdomain sudo[309165]: pam_unix(sudo:session): session closed for user root
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:56:31 np0005532585.localdomain sudo[309215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:56:31 np0005532585.localdomain sudo[309215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:56:31 np0005532585.localdomain sudo[309215]: pam_unix(sudo:session): session closed for user root
Nov 23 09:56:32 np0005532585.localdomain ceph-mon[300199]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:34 np0005532585.localdomain ceph-mon[300199]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:56:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:35.497 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:36 np0005532585.localdomain ceph-mon[300199]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:38 np0005532585.localdomain ceph-mon[300199]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:56:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:56:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:56:40 np0005532585.localdomain systemd[1]: tmp-crun.Bz0GOG.mount: Deactivated successfully.
Nov 23 09:56:40 np0005532585.localdomain podman[309233]: 2025-11-23 09:56:40.037307095 +0000 UTC m=+0.091459153 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:56:40 np0005532585.localdomain podman[309234]: 2025-11-23 09:56:40.080367829 +0000 UTC m=+0.133492425 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:56:40 np0005532585.localdomain podman[309235]: 2025-11-23 09:56:40.142494256 +0000 UTC m=+0.194004654 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9)
Nov 23 09:56:40 np0005532585.localdomain podman[309235]: 2025-11-23 09:56:40.159437003 +0000 UTC m=+0.210947401 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal)
Nov 23 09:56:40 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:56:40 np0005532585.localdomain podman[309234]: 2025-11-23 09:56:40.211100911 +0000 UTC m=+0.264225507 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 23 09:56:40 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:56:40 np0005532585.localdomain podman[309233]: 2025-11-23 09:56:40.261608353 +0000 UTC m=+0.315760331 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 09:56:40 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:56:40 np0005532585.localdomain ceph-mon[300199]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:40.499 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:56:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:56:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:56:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:56:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:56:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:56:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18724 "" "Go-http-client/1.1"
Nov 23 09:56:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:56:42.109 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:56:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:42.110 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:56:42.111 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:56:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:56:42.112 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:56:42 np0005532585.localdomain ceph-mon[300199]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:44 np0005532585.localdomain ceph-mon[300199]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:45.502 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:46 np0005532585.localdomain ceph-mon[300199]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e88 e88: 6 total, 6 up, 6 in
Nov 23 09:56:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:48 np0005532585.localdomain ceph-mon[300199]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:56:48 np0005532585.localdomain ceph-mon[300199]: osdmap e88: 6 total, 6 up, 6 in
Nov 23 09:56:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:56:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:56:49 np0005532585.localdomain systemd[1]: tmp-crun.wLgEMb.mount: Deactivated successfully.
Nov 23 09:56:49 np0005532585.localdomain podman[309298]: 2025-11-23 09:56:49.08188603 +0000 UTC m=+0.136024753 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:56:49 np0005532585.localdomain podman[309297]: 2025-11-23 09:56:49.041754725 +0000 UTC m=+0.099618772 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:56:49 np0005532585.localdomain podman[309297]: 2025-11-23 09:56:49.125290526 +0000 UTC m=+0.183154523 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:56:49 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:56:49 np0005532585.localdomain podman[309298]: 2025-11-23 09:56:49.144645697 +0000 UTC m=+0.198784430 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 09:56:49 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:56:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 e89: 6 total, 6 up, 6 in
Nov 23 09:56:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:50.504 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:50 np0005532585.localdomain ceph-mon[300199]: pgmap v47: 177 pgs: 177 active+clean; 129 MiB data, 627 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.5 MiB/s wr, 27 op/s
Nov 23 09:56:50 np0005532585.localdomain ceph-mon[300199]: osdmap e89: 6 total, 6 up, 6 in
Nov 23 09:56:52 np0005532585.localdomain sshd[309339]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:56:52 np0005532585.localdomain ceph-mon[300199]: pgmap v49: 177 pgs: 177 active+clean; 145 MiB data, 660 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 5.1 MiB/s wr, 41 op/s
Nov 23 09:56:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:54 np0005532585.localdomain sshd[309339]: Invalid user unknown from 122.170.99.195 port 36472
Nov 23 09:56:54 np0005532585.localdomain ceph-mon[300199]: pgmap v50: 177 pgs: 177 active+clean; 145 MiB data, 660 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 5.1 MiB/s wr, 41 op/s
Nov 23 09:56:54 np0005532585.localdomain sshd[309339]: Connection closed by invalid user unknown 122.170.99.195 port 36472 [preauth]
Nov 23 09:56:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:55.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:56:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:55.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:56:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:55.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:56:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:55.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:56:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:55.531 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:56:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:56:55.531 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:56:55 np0005532585.localdomain ceph-mon[300199]: mgrmap e44: np0005532584.naxwxy(active, since 91s), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 09:56:56 np0005532585.localdomain ceph-mon[300199]: pgmap v51: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Nov 23 09:56:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:56:58 np0005532585.localdomain ceph-mon[300199]: pgmap v52: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 4.2 MiB/s wr, 39 op/s
Nov 23 09:56:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:56:59 np0005532585.localdomain podman[309341]: 2025-11-23 09:56:59.020632853 +0000 UTC m=+0.073171905 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:56:59 np0005532585.localdomain podman[309341]: 2025-11-23 09:56:59.03428898 +0000 UTC m=+0.086828042 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 23 09:56:59 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:56:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:56:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:57:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:00.532 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:00.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:00.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:00.536 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:00.580 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:00.581 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:00 np0005532585.localdomain ceph-mon[300199]: pgmap v53: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 5.7 KiB/s rd, 1.6 MiB/s wr, 10 op/s
Nov 23 09:57:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2399139433' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:57:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2399139433' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:57:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:57:01 np0005532585.localdomain podman[309360]: 2025-11-23 09:57:01.019398024 +0000 UTC m=+0.077177798 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:57:01 np0005532585.localdomain podman[309360]: 2025-11-23 09:57:01.027447169 +0000 UTC m=+0.085227023 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:57:01 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:57:02 np0005532585.localdomain ceph-mon[300199]: pgmap v54: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 2.8 KiB/s rd, 698 B/s wr, 4 op/s
Nov 23 09:57:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:04 np0005532585.localdomain ceph-mon[300199]: pgmap v55: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s rd, 682 B/s wr, 4 op/s
Nov 23 09:57:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:05.582 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:05.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:05.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:05.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:05.613 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:05.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:06 np0005532585.localdomain ceph-mon[300199]: pgmap v56: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s rd, 682 B/s wr, 4 op/s
Nov 23 09:57:07 np0005532585.localdomain sshd[309383]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:57:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:08 np0005532585.localdomain sshd[309383]: Received disconnect from 107.172.15.139 port 48588:11: Bye Bye [preauth]
Nov 23 09:57:08 np0005532585.localdomain sshd[309383]: Disconnected from authenticating user root 107.172.15.139 port 48588 [preauth]
Nov 23 09:57:08 np0005532585.localdomain sshd[309385]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:57:08 np0005532585.localdomain ceph-mon[300199]: pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:09 np0005532585.localdomain sshd[309385]: Received disconnect from 207.154.194.2 port 45416:11: Bye Bye [preauth]
Nov 23 09:57:09 np0005532585.localdomain sshd[309385]: Disconnected from authenticating user root 207.154.194.2 port 45416 [preauth]
Nov 23 09:57:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:09.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:09.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:09.297 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:57:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:09.297 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:57:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.616 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.626 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.627 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.627 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.627 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.647 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:10.648 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.807 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.808 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.823 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.823 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78d63edc-1c58-4001-830c-f978fcd88d02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.808821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c77d46a8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '478447ddc229dde13972dc31cf545b502ca1742c4ef214562bc59ebab5209259'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.808821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c77d54f4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '8a796f0b8c2490823cc24cbf2a5f9cbd6eac1380c1be37d20f15b2dabcfe8579'}]}, 'timestamp': '2025-11-23 09:57:10.824029', '_unique_id': 'd5bf6fa087a1454a94b107d09bc7531b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bf04b37-8c67-4263-9471-dd0f8267658c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:57:10.825988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c781caa2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.030258566, 'message_signature': '5fc41ba593677bfa0a4f9300132e9c3ab566e460f240e4c67d6a7c637685d3de'}]}, 'timestamp': '2025-11-23 09:57:10.853411', '_unique_id': '629405b2cf724f2abb25f40888cae624'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.855 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.859 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06b2c575-8499-4fe8-a143-f93255350343', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.855981', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c782d9d8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '9df1f0832b638b2d0818a84ec3aa386a735f8f9b572a822b1525a4ff5d3e0742'}]}, 'timestamp': '2025-11-23 09:57:10.860386', '_unique_id': 'c963bdceddaf47769c677e8253926aca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.863 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7a22a3-bb15-4987-83dc-11f582233ef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.863341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7884a1c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '9fe84e68b0b3d056d64a5d9b366910cf3d1d17b6b03a7d830cc5dcfd20b26d9a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.863341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7885ff2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'e703e90dba7a22496910fbdfe2bdc94cd650ca548608e0dbdaca5f348879a57c'}]}, 'timestamp': '2025-11-23 09:57:10.896456', '_unique_id': 'e094cfeb5b014c069e44a806daf653ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50d9880f-85d6-46af-a329-141b0915c56e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.899311', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c788e1f2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '58a59fb6ea04ec031ae38657c2093f8667f46bf11d6302bce319a0534751093a'}]}, 'timestamp': '2025-11-23 09:57:10.899825', '_unique_id': 'fb3e77bb6f904d8694d10e1439c59437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '465116ef-d445-4ece-95b5-b7e71c3d695e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.902997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78971d0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '0b2f12f6ceda787aa02a51658d6118a282694d1dc8d1e2e5022218d2e2b5caf2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.902997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78982b0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '4af8e016df5967d42b70dbcb3bcb5ec25dea58f58e90baa1cade008828db1198'}]}, 'timestamp': '2025-11-23 09:57:10.903874', '_unique_id': 'b2b35dc62ba34a37874ba177128d3cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc46533b-b1c9-4390-842c-6601c0d1acc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.906190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c789ee12-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '7209c62dd00fee7384ed4294e4628a83b79cd150eda829e9c991d5ba3b237585'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.906190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c789fe0c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': 'd5e542e81b0add35d04967d45e0e010ecffa5a72d270bb8e6ea6d40196c3f01a'}]}, 'timestamp': '2025-11-23 09:57:10.907064', '_unique_id': 'e76408ef1e4646cf9b4bfc82674f3f7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aca6e18-13da-4893-b385-76e5276051d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.909294', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78a6af4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '71d68b1f00d9f78bdc31144ef957cdd5156df74a85311e19821a84d3fc8cd365'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.909294', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78a7c42-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'b91e091c3b0fe22aad96b34e4a56119079a21ae85f862bb21e76c35ef95bdac1'}]}, 'timestamp': '2025-11-23 09:57:10.910265', '_unique_id': '5faa8f88ea12415cbef0a14b966d8d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfa3095f-02b4-4eae-a2e0-ea1594942321', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.912427', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78ae164-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '716b0b29318cd6d3c155bd47a51d066a6738229b3b10627c83b1eb86c7d5c962'}]}, 'timestamp': '2025-11-23 09:57:10.912878', '_unique_id': '263bf88113544c61a6a5801e5abb5d6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2ef9b6a-bd7f-43e0-9b85-38f219580e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.915152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78b4bae-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '55c69475479b2b1f8157062a31bf12c96f06853e982b7f561c633ed915ec6dd2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.915152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78b5cc0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '5103aca7f11099788e3812a80e95826c71f9167fed48c101b05553c60b081fc0'}]}, 'timestamp': '2025-11-23 09:57:10.916038', '_unique_id': 'a7777385ea0a41f49d6bfc23d7c89dd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc26168d-4dc0-4ef1-8598-a2107d28f5a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.918188', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78bc2a0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '380ff1fe396935985b3e2d9c310fff282c35185e7e3429791711c39974c6c9b1'}]}, 'timestamp': '2025-11-23 09:57:10.918644', '_unique_id': '8d8a1dba7d344296b159c5e8bfa74df5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5b12f18-e3aa-4925-8020-7b32a1182a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.921561', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78c46da-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '352e0492c91fc0aa55ff15b50e55b51060445ec4fa11977d54a8f1ea55b1b873'}]}, 'timestamp': '2025-11-23 09:57:10.922070', '_unique_id': 'e1f2cd504dbd43a88f9f5e271e08d08e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd35e3634-b034-4603-a2e1-fd5bcb4e64fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.925061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78cd1c2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '49df87751615c99b36ec08b5b045345c06a37eed20ec7e1aafe77dfe521707b1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.925061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78ce946-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '0ddb2e5cdca43862ed99c41645e30c463dd73dcd4c360029f2cd2bda91e13767'}]}, 'timestamp': '2025-11-23 09:57:10.926248', '_unique_id': '11d4bdaa20af4124b1e167cbf2e5fd1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fde602e-894d-48d0-bc66-f14477135d97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.929404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78d7b22-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'febff36c543627473c9d9142647482e449005cb3fdfcfbfb4d39961e01b81094'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.929404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78d9472-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '7b8aa71da2b8e875865329d4f48f9145b0523500e7d7d3ede2674808a69ba7cd'}]}, 'timestamp': '2025-11-23 09:57:10.930621', '_unique_id': 'd852f3572b754bb7a78dded20966081e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.933 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.933 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 15320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afd9dc55-7579-48fa-97c5-5d668e0fa794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15320000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:57:10.933653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c78e227a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.030258566, 'message_signature': '2faeb5c5b1dbc380e718da489722546adfca69de7e4dca56247db747e2262a48'}]}, 'timestamp': '2025-11-23 09:57:10.934284', '_unique_id': 'c6c15e0edf0440b784591de26760661c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6e3f2a8-0a1c-4c96-9316-89f0840523a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.937287', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78eb140-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '83b7371bb87d4d35e538010f44e156244190a3591eddf40f9cd1ee9018dcb8d0'}]}, 'timestamp': '2025-11-23 09:57:10.938030', '_unique_id': '799ee58a731049c2a450220ceeb9ac0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.940 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.940 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f202f32e-74b8-4d8d-b4ae-b9aa32534262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.940724', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78f352a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': 'd08e78602ed722288b53f40b7cd23e74a71f3dcf12ae395d5489e5c65dec165c'}]}, 'timestamp': '2025-11-23 09:57:10.941257', '_unique_id': 'edcad9c6d26a4b1aaeaf9c378c878926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.943 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.943 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00cc8a95-27d4-4968-90e9-a77518574d99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.943512', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78fa3e8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '82feda838720645bf9419ffd7ac450cf6cbed16797b98d3c469fb80fa854cab8'}]}, 'timestamp': '2025-11-23 09:57:10.944150', '_unique_id': 'b1f06faf3f014047bdc10d445abef5ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.946 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5035b8bc-234b-442e-b005-7c67697f93e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.946881', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c79027c8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': 'fd396743e3f7156405109fb499216b793852b8b2ed94d1e6779d6c0efddcb9c1'}]}, 'timestamp': '2025-11-23 09:57:10.947475', '_unique_id': '38b9643ca2834623921487fdde980363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.949 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '572107a5-0cb5-4d8f-95ea-82c8d93d4e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.949673', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c790915e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '3e4212aa9f11cee3c63b3b537aff35a535547be38b3f12d08935fc15c3a354ab'}]}, 'timestamp': '2025-11-23 09:57:10.950155', '_unique_id': '3380ed6a4f1649f783a80847ef9b03a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ad56411-1f6b-4a34-bea8-354e24daa917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.952473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c790fd60-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'd7b4d62f11b0731c573fc8d33cabd8a8e2371bb2d812a33d58f1d4885b57e4f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.952473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7910e18-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '9ee4a3df95ede19cede8bdb042951ef4f76b7adebc0c3fdd9abe621a63da26cf'}]}, 'timestamp': '2025-11-23 09:57:10.953317', '_unique_id': '9cd8f808b56743faaf97eaaefb92ce23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:57:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:57:11 np0005532585.localdomain podman[309388]: 2025-11-23 09:57:11.037854789 +0000 UTC m=+0.090402821 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:57:11 np0005532585.localdomain podman[309387]: 2025-11-23 09:57:11.071263238 +0000 UTC m=+0.122916173 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 09:57:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:11.098 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:57:11 np0005532585.localdomain podman[309387]: 2025-11-23 09:57:11.107320839 +0000 UTC m=+0.158973844 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller)
Nov 23 09:57:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:11.113 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:57:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:11.114 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:57:11 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:57:11 np0005532585.localdomain podman[309388]: 2025-11-23 09:57:11.122294136 +0000 UTC m=+0.174842158 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:57:11 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:57:11 np0005532585.localdomain podman[309389]: 2025-11-23 09:57:11.194017766 +0000 UTC m=+0.236252734 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350)
Nov 23 09:57:11 np0005532585.localdomain podman[309389]: 2025-11-23 09:57:11.235316466 +0000 UTC m=+0.277551434 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 09:57:11 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:57:11 np0005532585.localdomain ceph-mon[300199]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4241493939' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:57:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:57:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:57:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 09:57:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:57:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18718 "" "Go-http-client/1.1"
Nov 23 09:57:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:12.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:12.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:12.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:57:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1983039144' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:12 np0005532585.localdomain ceph-mon[300199]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/48797289' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3901111112' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:13 np0005532585.localdomain ceph-mon[300199]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:14.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:14Z|00070|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.234 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.234 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.649 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.678 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.678 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.682 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:15 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:57:15 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2601443712' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.744 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.800 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:57:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:15.801 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.020 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.022 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11681MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.022 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.023 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.103 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.104 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.104 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.152 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:57:16 np0005532585.localdomain ceph-mon[300199]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2601443712' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:16 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:57:16 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/349663793' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.613 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.620 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.633 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.636 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:57:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:16.636 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:57:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/349663793' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:57:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:17.637 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:17.657 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:57:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:18 np0005532585.localdomain ceph-mon[300199]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:19 np0005532585.localdomain ceph-mon[300199]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:57:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:57:20 np0005532585.localdomain podman[309491]: 2025-11-23 09:57:20.030745915 +0000 UTC m=+0.083744188 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:57:20 np0005532585.localdomain podman[309491]: 2025-11-23 09:57:20.046532967 +0000 UTC m=+0.099531270 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:57:20 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:57:20 np0005532585.localdomain podman[309492]: 2025-11-23 09:57:20.134137832 +0000 UTC m=+0.185072171 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:57:20 np0005532585.localdomain podman[309492]: 2025-11-23 09:57:20.146655563 +0000 UTC m=+0.197589932 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:57:20 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:57:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:20.683 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:20.685 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:20.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:20.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:20.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:20.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:22 np0005532585.localdomain ceph-mon[300199]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:23 np0005532585.localdomain ceph-mon[300199]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:25.721 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:25.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:25.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:25.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:25.741 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:25.742 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:26 np0005532585.localdomain ceph-mon[300199]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:27 np0005532585.localdomain ceph-mon[300199]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:57:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:57:30 np0005532585.localdomain podman[309533]: 2025-11-23 09:57:30.03900065 +0000 UTC m=+0.091157354 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 09:57:30 np0005532585.localdomain podman[309533]: 2025-11-23 09:57:30.047777858 +0000 UTC m=+0.099934562 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:57:30 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:57:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.262 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmph68f1hsn/privsep.sock']
Nov 23 09:57:30 np0005532585.localdomain ceph-mon[300199]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:30.743 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:30.768 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:30.769 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:57:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:30.769 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:30.770 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:30.771 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:57:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.929 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 23 09:57:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.814 309556 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:57:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.819 309556 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:57:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.822 309556 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 23 09:57:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.823 309556 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309556
Nov 23 09:57:31 np0005532585.localdomain ceph-mon[300199]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:31.483 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpymplsf7c/privsep.sock']
Nov 23 09:57:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:57:32 np0005532585.localdomain sudo[309571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:57:32 np0005532585.localdomain sudo[309571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:57:32 np0005532585.localdomain sudo[309571]: pam_unix(sudo:session): session closed for user root
Nov 23 09:57:32 np0005532585.localdomain systemd[1]: tmp-crun.4Z4zoy.mount: Deactivated successfully.
Nov 23 09:57:32 np0005532585.localdomain podman[309565]: 2025-11-23 09:57:32.098065432 +0000 UTC m=+0.153515848 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:57:32 np0005532585.localdomain sudo[309598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:57:32 np0005532585.localdomain sudo[309598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:57:32 np0005532585.localdomain podman[309565]: 2025-11-23 09:57:32.13634492 +0000 UTC m=+0.191795346 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:57:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.145 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 23 09:57:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.034 309593 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:57:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.039 309593 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:57:32 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:57:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.044 309593 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 23 09:57:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.044 309593 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309593
Nov 23 09:57:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:32.455 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:32 np0005532585.localdomain sudo[309598]: pam_unix(sudo:session): session closed for user root
Nov 23 09:57:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:33 np0005532585.localdomain sudo[309660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:57:33 np0005532585.localdomain sudo[309660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:57:33 np0005532585.localdomain sudo[309660]: pam_unix(sudo:session): session closed for user root
Nov 23 09:57:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.103 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpafrwc5cs/privsep.sock']
Nov 23 09:57:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:57:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:57:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:57:33 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:57:33 np0005532585.localdomain ceph-mon[300199]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.725 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 23 09:57:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.624 309685 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 09:57:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.629 309685 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 09:57:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.632 309685 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 23 09:57:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.632 309685 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309685
Nov 23 09:57:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:57:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:34.687 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:35.044 263258 INFO neutron.agent.linux.ip_lib [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Device tap73b68089-2f cannot be used as it has no MAC address
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.116 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain kernel: device tap73b68089-2f entered promiscuous mode
Nov 23 09:57:35 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891855.1292] manager: (tap73b68089-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 23 09:57:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:35Z|00071|binding|INFO|Claiming lport 73b68089-2f56-49cd-9bd3-002979f43843 for this chassis.
Nov 23 09:57:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:35Z|00072|binding|INFO|73b68089-2f56-49cd-9bd3-002979f43843: Claiming unknown
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.130 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain systemd-udevd[309700]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:57:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:35.152 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab89783c9d39468096f7d3a0c6bf4d3e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13bd6782-9124-462d-b4a7-d10c24537f9d, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=73b68089-2f56-49cd-9bd3-002979f43843) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:57:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:35.155 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 73b68089-2f56-49cd-9bd3-002979f43843 in datapath 7c71c90d-a08f-42e1-bb2f-ef1175c4042b bound to our chassis
Nov 23 09:57:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:35.159 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port c84e6e46-efbc-4de0-9598-13a85365118b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 23 09:57:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:35.159 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: hostname: np0005532585.localdomain
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:35.162 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f07e157f-b07a-4e3f-981e-1c3c3f4c89ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:35Z|00073|binding|INFO|Setting lport 73b68089-2f56-49cd-9bd3-002979f43843 ovn-installed in OVS
Nov 23 09:57:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:35Z|00074|binding|INFO|Setting lport 73b68089-2f56-49cd-9bd3-002979f43843 up in Southbound
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.170 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap73b68089-2f: No such device
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.237 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain ceph-mon[300199]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.770 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:35.772 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:36 np0005532585.localdomain podman[309772]: 
Nov 23 09:57:36 np0005532585.localdomain podman[309772]: 2025-11-23 09:57:36.12725213 +0000 UTC m=+0.088870135 container create 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 09:57:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101.scope.
Nov 23 09:57:36 np0005532585.localdomain podman[309772]: 2025-11-23 09:57:36.084471454 +0000 UTC m=+0.046089479 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:57:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:57:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edacf18bbfc488c9c82ec3cf149606e2b511b3a1b82fab7a350e4145bfcc787/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:57:36 np0005532585.localdomain podman[309772]: 2025-11-23 09:57:36.203153577 +0000 UTC m=+0.164771572 container init 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 09:57:36 np0005532585.localdomain podman[309772]: 2025-11-23 09:57:36.214229235 +0000 UTC m=+0.175847230 container start 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: started, version 2.85 cachesize 150
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: DNS service limited to local subnets
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: warning: no upstream servers configured
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 0 addresses
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts
Nov 23 09:57:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.275 263258 INFO neutron.agent.dhcp.agent [None req-b79e4a4d-e53c-4c59-a4c3-1778eb6e2957 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:33Z, description=, device_id=af12546c-762e-460a-92cf-0a6f5f6b8733, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8d36f40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8d36ee0>], id=f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5, ip_allocation=immediate, mac_address=fa:16:3e:45:fe:c5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:28Z, description=, dns_domain=, id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-1806334605-network, port_security_enabled=True, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59428, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=222, status=ACTIVE, subnets=['12d58085-977d-4709-b642-b2646c597567'], tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:29Z, vlan_transparent=None, network_id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, port_security_enabled=False, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=269, status=DOWN, tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:33Z on network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b
Nov 23 09:57:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.380 263258 INFO neutron.agent.dhcp.agent [None req-dea51365-5277-467a-a0e4-8f2cff221a55 - - - - - -] DHCP configuration for ports {'351c24ed-9c6f-4f66-8dd6-3859a3d2d148'} is completed
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 1 addresses
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts
Nov 23 09:57:36 np0005532585.localdomain podman[309809]: 2025-11-23 09:57:36.55931361 +0000 UTC m=+0.060553169 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 09:57:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.686 263258 INFO neutron.agent.dhcp.agent [None req-0d7905f9-ef19-4367-b6f0-737b6233be16 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:33Z, description=, device_id=af12546c-762e-460a-92cf-0a6f5f6b8733, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8d36e50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8d36c10>], id=f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5, ip_allocation=immediate, mac_address=fa:16:3e:45:fe:c5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:28Z, description=, dns_domain=, id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-1806334605-network, port_security_enabled=True, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59428, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=222, status=ACTIVE, subnets=['12d58085-977d-4709-b642-b2646c597567'], tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:29Z, vlan_transparent=None, network_id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, port_security_enabled=False, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=269, status=DOWN, tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:33Z on network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b
Nov 23 09:57:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.831 263258 INFO neutron.agent.dhcp.agent [None req-181149f1-12c6-4e81-9849-e6d22dbcf9be - - - - - -] DHCP configuration for ports {'f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5'} is completed
Nov 23 09:57:36 np0005532585.localdomain dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 1 addresses
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host
Nov 23 09:57:36 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts
Nov 23 09:57:36 np0005532585.localdomain podman[309846]: 2025-11-23 09:57:36.899024181 +0000 UTC m=+0.058800726 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:57:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:37.354 263258 INFO neutron.agent.dhcp.agent [None req-5136d212-a09e-49a4-80b5-c64eee44d855 - - - - - -] DHCP configuration for ports {'f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5'} is completed
Nov 23 09:57:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:38 np0005532585.localdomain ceph-mon[300199]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:38.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:39 np0005532585.localdomain ceph-mon[300199]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:40 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:40.470 263258 INFO neutron.agent.linux.ip_lib [None req-30fdac74-ec64-4a2d-8e04-cdf0bde538d3 - - - - - -] Device tap55b3fe2f-21 cannot be used as it has no MAC address
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain kernel: device tap55b3fe2f-21 entered promiscuous mode
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.501 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891860.5022] manager: (tap55b3fe2f-21): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 23 09:57:40 np0005532585.localdomain systemd-udevd[309878]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:57:40 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:40Z|00075|binding|INFO|Claiming lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 for this chassis.
Nov 23 09:57:40 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:40Z|00076|binding|INFO|55b3fe2f-21ef-4379-99f5-112f0dfef914: Claiming unknown
Nov 23 09:57:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:40.514 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a6dad5f4ea934cb2b33cba44987053be', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9d6fde6-b084-4385-8650-ae0140b56791, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=55b3fe2f-21ef-4379-99f5-112f0dfef914) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:57:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:40.516 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 55b3fe2f-21ef-4379-99f5-112f0dfef914 in datapath 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d bound to our chassis
Nov 23 09:57:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:40.517 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:57:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:40.519 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e27a7ceb-2cfc-4704-bc41-916ab5c4d36e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.526 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:40Z|00077|binding|INFO|Setting lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 ovn-installed in OVS
Nov 23 09:57:40 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:40Z|00078|binding|INFO|Setting lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 up in Southbound
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.533 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.562 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.588 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.773 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:40.776 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:57:40 np0005532585.localdomain systemd[1]: tmp-crun.d70Jkr.mount: Deactivated successfully.
Nov 23 09:57:40 np0005532585.localdomain dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 0 addresses
Nov 23 09:57:40 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host
Nov 23 09:57:40 np0005532585.localdomain dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts
Nov 23 09:57:40 np0005532585.localdomain podman[309930]: 2025-11-23 09:57:40.818658196 +0000 UTC m=+0.068877664 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:57:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:41Z|00079|binding|INFO|Releasing lport 73b68089-2f56-49cd-9bd3-002979f43843 from this chassis (sb_readonly=0)
Nov 23 09:57:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:41Z|00080|binding|INFO|Setting lport 73b68089-2f56-49cd-9bd3-002979f43843 down in Southbound
Nov 23 09:57:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:41.031 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:41 np0005532585.localdomain kernel: device tap73b68089-2f left promiscuous mode
Nov 23 09:57:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:41.044 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab89783c9d39468096f7d3a0c6bf4d3e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13bd6782-9124-462d-b4a7-d10c24537f9d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=73b68089-2f56-49cd-9bd3-002979f43843) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:57:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:41.046 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 73b68089-2f56-49cd-9bd3-002979f43843 in datapath 7c71c90d-a08f-42e1-bb2f-ef1175c4042b unbound from our chassis
Nov 23 09:57:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:41.048 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:41.049 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:57:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:41.049 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba680c6-ad2a-42bb-acbf-82c6fba0c1c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:57:41 np0005532585.localdomain podman[309991]: 2025-11-23 09:57:41.456043935 +0000 UTC m=+0.091721562 container create 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: Started libpod-conmon-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296.scope.
Nov 23 09:57:41 np0005532585.localdomain podman[309991]: 2025-11-23 09:57:41.413365592 +0000 UTC m=+0.049043239 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:57:41 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a72aa34fa3703c0feb66b78aa70334c94e6bb60891737ec5625b5eed8d5bdbf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:57:41 np0005532585.localdomain podman[309991]: 2025-11-23 09:57:41.535547742 +0000 UTC m=+0.171225389 container init 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:57:41 np0005532585.localdomain podman[309991]: 2025-11-23 09:57:41.550127717 +0000 UTC m=+0.185805334 container start 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:57:41 np0005532585.localdomain dnsmasq[310035]: started, version 2.85 cachesize 150
Nov 23 09:57:41 np0005532585.localdomain dnsmasq[310035]: DNS service limited to local subnets
Nov 23 09:57:41 np0005532585.localdomain dnsmasq[310035]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:57:41 np0005532585.localdomain dnsmasq[310035]: warning: no upstream servers configured
Nov 23 09:57:41 np0005532585.localdomain dnsmasq-dhcp[310035]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:57:41 np0005532585.localdomain dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 0 addresses
Nov 23 09:57:41 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host
Nov 23 09:57:41 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts
Nov 23 09:57:41 np0005532585.localdomain podman[310005]: 2025-11-23 09:57:41.589875911 +0000 UTC m=+0.086098870 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller)
Nov 23 09:57:41 np0005532585.localdomain podman[310006]: 2025-11-23 09:57:41.603334232 +0000 UTC m=+0.096375834 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 23 09:57:41 np0005532585.localdomain podman[310006]: 2025-11-23 09:57:41.63573223 +0000 UTC m=+0.128773842 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:57:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:41.670 263258 INFO neutron.agent.dhcp.agent [None req-d8ff0c38-fe32-49e6-9103-fcca295fbcf4 - - - - - -] DHCP configuration for ports {'3b646975-045a-40d5-9ebc-93b350fd7125'} is completed
Nov 23 09:57:41 np0005532585.localdomain podman[310005]: 2025-11-23 09:57:41.671851983 +0000 UTC m=+0.168074962 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:57:41 np0005532585.localdomain podman[310007]: 2025-11-23 09:57:41.676356381 +0000 UTC m=+0.166976149 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 09:57:41 np0005532585.localdomain podman[310007]: 2025-11-23 09:57:41.759681435 +0000 UTC m=+0.250301243 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Nov 23 09:57:41 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:57:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:57:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:57:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:57:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157510 "" "Go-http-client/1.1"
Nov 23 09:57:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:57:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19691 "" "Go-http-client/1.1"
Nov 23 09:57:42 np0005532585.localdomain ceph-mon[300199]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:42 np0005532585.localdomain systemd[1]: tmp-crun.FHqc4H.mount: Deactivated successfully.
Nov 23 09:57:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:42.773 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:57:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:42.775 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:57:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:42.776 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:57:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:42.812 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:43 np0005532585.localdomain ceph-mon[300199]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:43 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:43Z|00081|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:57:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:43.673 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:44 np0005532585.localdomain dnsmasq[309791]: exiting on receipt of SIGTERM
Nov 23 09:57:44 np0005532585.localdomain podman[310088]: 2025-11-23 09:57:44.202520933 +0000 UTC m=+0.061353044 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 09:57:44 np0005532585.localdomain systemd[1]: libpod-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101.scope: Deactivated successfully.
Nov 23 09:57:44 np0005532585.localdomain podman[310102]: 2025-11-23 09:57:44.271927361 +0000 UTC m=+0.057692392 container died 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:57:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101-userdata-shm.mount: Deactivated successfully.
Nov 23 09:57:44 np0005532585.localdomain podman[310102]: 2025-11-23 09:57:44.307125846 +0000 UTC m=+0.092890807 container cleanup 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:57:44 np0005532585.localdomain systemd[1]: libpod-conmon-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101.scope: Deactivated successfully.
Nov 23 09:57:44 np0005532585.localdomain podman[310109]: 2025-11-23 09:57:44.363547358 +0000 UTC m=+0.134814466 container remove 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 09:57:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:44.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:44.442 263258 INFO neutron.agent.dhcp.agent [None req-12a9e5c0-6ff2-4ada-a33a-fd5958a68959 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:57:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:44.443 263258 INFO neutron.agent.dhcp.agent [None req-12a9e5c0-6ff2-4ada-a33a-fd5958a68959 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:57:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8edacf18bbfc488c9c82ec3cf149606e2b511b3a1b82fab7a350e4145bfcc787-merged.mount: Deactivated successfully.
Nov 23 09:57:45 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d7c71c90d\x2da08f\x2d42e1\x2dbb2f\x2def1175c4042b.mount: Deactivated successfully.
Nov 23 09:57:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:45.714 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:45Z, description=, device_id=0f8464a3-f81d-4757-a6d6-f5dc7dd25ed6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cd3460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8d144c0>], id=cc4bfc60-3e56-4b71-a76c-62f3aabf67a4, ip_allocation=immediate, mac_address=fa:16:3e:c7:62:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:37Z, description=, dns_domain=, id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-328332667-network, port_security_enabled=True, project_id=a6dad5f4ea934cb2b33cba44987053be, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33842, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=312, status=ACTIVE, subnets=['f8967804-12ce-482f-8265-4d6194cb2186'], tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:39Z, vlan_transparent=None, network_id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, port_security_enabled=False, project_id=a6dad5f4ea934cb2b33cba44987053be, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=352, status=DOWN, tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:45Z on network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d
Nov 23 09:57:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:45.815 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:45 np0005532585.localdomain podman[310149]: 2025-11-23 09:57:45.923998468 +0000 UTC m=+0.060038904 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 09:57:45 np0005532585.localdomain dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 1 addresses
Nov 23 09:57:45 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host
Nov 23 09:57:45 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts
Nov 23 09:57:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:46.146 263258 INFO neutron.agent.dhcp.agent [None req-c8a9f48c-e4cc-4380-8d0a-cc660448b4ce - - - - - -] DHCP configuration for ports {'cc4bfc60-3e56-4b71-a76c-62f3aabf67a4'} is completed
Nov 23 09:57:46 np0005532585.localdomain ceph-mon[300199]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:46.690 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:45Z, description=, device_id=0f8464a3-f81d-4757-a6d6-f5dc7dd25ed6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bf1d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c946afa0>], id=cc4bfc60-3e56-4b71-a76c-62f3aabf67a4, ip_allocation=immediate, mac_address=fa:16:3e:c7:62:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:37Z, description=, dns_domain=, id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-328332667-network, port_security_enabled=True, project_id=a6dad5f4ea934cb2b33cba44987053be, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33842, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=312, status=ACTIVE, subnets=['f8967804-12ce-482f-8265-4d6194cb2186'], tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:39Z, vlan_transparent=None, network_id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, port_security_enabled=False, project_id=a6dad5f4ea934cb2b33cba44987053be, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=352, status=DOWN, tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:45Z on network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d
Nov 23 09:57:46 np0005532585.localdomain dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 1 addresses
Nov 23 09:57:46 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host
Nov 23 09:57:46 np0005532585.localdomain podman[310186]: 2025-11-23 09:57:46.899333364 +0000 UTC m=+0.061355774 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:57:46 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts
Nov 23 09:57:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:46.990 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:47 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:47.268 263258 INFO neutron.agent.dhcp.agent [None req-59729e6a-680e-4ea1-b310-aa07945498c6 - - - - - -] DHCP configuration for ports {'cc4bfc60-3e56-4b71-a76c-62f3aabf67a4'} is completed
Nov 23 09:57:47 np0005532585.localdomain ceph-mon[300199]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:50 np0005532585.localdomain ceph-mon[300199]: pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:50.819 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:57:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:57:51 np0005532585.localdomain podman[310207]: 2025-11-23 09:57:51.034860488 +0000 UTC m=+0.087340247 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:57:51 np0005532585.localdomain podman[310207]: 2025-11-23 09:57:51.042537813 +0000 UTC m=+0.095017602 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:57:51 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:57:51 np0005532585.localdomain podman[310206]: 2025-11-23 09:57:51.128717104 +0000 UTC m=+0.184105442 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 09:57:51 np0005532585.localdomain podman[310206]: 2025-11-23 09:57:51.1443069 +0000 UTC m=+0.199695258 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:57:51 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:57:51 np0005532585.localdomain ceph-mon[300199]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:51.665 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:51 np0005532585.localdomain dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 0 addresses
Nov 23 09:57:51 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host
Nov 23 09:57:51 np0005532585.localdomain podman[310265]: 2025-11-23 09:57:51.976784505 +0000 UTC m=+0.049546534 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 09:57:51 np0005532585.localdomain dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts
Nov 23 09:57:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:52.123 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:52 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:52Z|00082|binding|INFO|Releasing lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 from this chassis (sb_readonly=0)
Nov 23 09:57:52 np0005532585.localdomain kernel: device tap55b3fe2f-21 left promiscuous mode
Nov 23 09:57:52 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:52Z|00083|binding|INFO|Setting lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 down in Southbound
Nov 23 09:57:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:52.133 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a6dad5f4ea934cb2b33cba44987053be', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9d6fde6-b084-4385-8650-ae0140b56791, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=55b3fe2f-21ef-4379-99f5-112f0dfef914) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:57:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:52.134 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 55b3fe2f-21ef-4379-99f5-112f0dfef914 in datapath 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d unbound from our chassis
Nov 23 09:57:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:52.137 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:57:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:52.139 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[13165d96-6e54-4d0b-b8ab-f5c270ed1f75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:57:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:52.141 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:52.143 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:53 np0005532585.localdomain ceph-mon[300199]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:54 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:57:54.044 2 INFO neutron.agent.securitygroups_rpc [None req-c244d218-e6b7-4260-9702-7bb508c7ef68 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']
Nov 23 09:57:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:54Z|00084|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:57:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:54.830 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e90 e90: 6 total, 6 up, 6 in
Nov 23 09:57:55 np0005532585.localdomain dnsmasq[310035]: exiting on receipt of SIGTERM
Nov 23 09:57:55 np0005532585.localdomain podman[310305]: 2025-11-23 09:57:55.704080336 +0000 UTC m=+0.059619061 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 09:57:55 np0005532585.localdomain systemd[1]: libpod-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296.scope: Deactivated successfully.
Nov 23 09:57:55 np0005532585.localdomain podman[310318]: 2025-11-23 09:57:55.764851852 +0000 UTC m=+0.046315676 container died 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:57:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296-userdata-shm.mount: Deactivated successfully.
Nov 23 09:57:55 np0005532585.localdomain podman[310318]: 2025-11-23 09:57:55.801724427 +0000 UTC m=+0.083188201 container cleanup 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 09:57:55 np0005532585.localdomain systemd[1]: libpod-conmon-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296.scope: Deactivated successfully.
Nov 23 09:57:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:55.861 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:55 np0005532585.localdomain podman[310319]: 2025-11-23 09:57:55.872550329 +0000 UTC m=+0.147677998 container remove 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:57:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:55.913 263258 INFO neutron.agent.dhcp.agent [None req-3dd77c87-1459-47cc-a2cf-f3d9dc34dd0b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:57:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:56.224 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:57:56 np0005532585.localdomain ceph-mon[300199]: osdmap e90: 6 total, 6 up, 6 in
Nov 23 09:57:56 np0005532585.localdomain ceph-mon[300199]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-1a72aa34fa3703c0feb66b78aa70334c94e6bb60891737ec5625b5eed8d5bdbf-merged.mount: Deactivated successfully.
Nov 23 09:57:56 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d81b87fa4\x2d03b0\x2d4cc9\x2d9d5d\x2d0d532d38f99d.mount: Deactivated successfully.
Nov 23 09:57:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:56.899 263258 INFO neutron.agent.linux.ip_lib [None req-97a4fc18-8858-4838-86d1-e88b9f78dbd9 - - - - - -] Device tap8c439e83-e9 cannot be used as it has no MAC address
Nov 23 09:57:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:56.957 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:56 np0005532585.localdomain kernel: device tap8c439e83-e9 entered promiscuous mode
Nov 23 09:57:56 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891876.9652] manager: (tap8c439e83-e9): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 23 09:57:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:56Z|00085|binding|INFO|Claiming lport 8c439e83-e972-4e99-8d01-ff5269427a3c for this chassis.
Nov 23 09:57:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:56Z|00086|binding|INFO|8c439e83-e972-4e99-8d01-ff5269427a3c: Claiming unknown
Nov 23 09:57:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:56.966 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:56 np0005532585.localdomain systemd-udevd[310357]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:57:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:56.981 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2148c18d8f24a6db12dc22c787e8b2e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e3b2035-d1e3-4dc9-824d-c8c5d8c83090, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8c439e83-e972-4e99-8d01-ff5269427a3c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:57:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:56.983 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8c439e83-e972-4e99-8d01-ff5269427a3c in datapath 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3 bound to our chassis
Nov 23 09:57:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:56.984 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:57:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:57:56.985 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4fb42e-106e-4b13-a2fb-3da42b3adf25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:57Z|00087|binding|INFO|Setting lport 8c439e83-e972-4e99-8d01-ff5269427a3c ovn-installed in OVS
Nov 23 09:57:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:57:57Z|00088|binding|INFO|Setting lport 8c439e83-e972-4e99-8d01-ff5269427a3c up in Southbound
Nov 23 09:57:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:57.006 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8c439e83-e9: No such device
Nov 23 09:57:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:57.042 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:57:57.068 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:57:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:57:57 np0005532585.localdomain podman[310428]: 2025-11-23 09:57:57.920496041 +0000 UTC m=+0.091383080 container create ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 09:57:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8.scope.
Nov 23 09:57:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:57:57 np0005532585.localdomain podman[310428]: 2025-11-23 09:57:57.875823328 +0000 UTC m=+0.046710387 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:57:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/067ea74ee71831860e0346c8368bf66a295b3b34ee14dc8629622033be014e0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:57:57 np0005532585.localdomain podman[310428]: 2025-11-23 09:57:57.986635911 +0000 UTC m=+0.157522950 container init ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:57:57 np0005532585.localdomain podman[310428]: 2025-11-23 09:57:57.997477742 +0000 UTC m=+0.168364771 container start ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:57:58 np0005532585.localdomain dnsmasq[310446]: started, version 2.85 cachesize 150
Nov 23 09:57:58 np0005532585.localdomain dnsmasq[310446]: DNS service limited to local subnets
Nov 23 09:57:58 np0005532585.localdomain dnsmasq[310446]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:57:58 np0005532585.localdomain dnsmasq[310446]: warning: no upstream servers configured
Nov 23 09:57:58 np0005532585.localdomain dnsmasq-dhcp[310446]: DHCP, static leases only on 19.80.0.0, lease time 1d
Nov 23 09:57:58 np0005532585.localdomain dnsmasq[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/addn_hosts - 0 addresses
Nov 23 09:57:58 np0005532585.localdomain dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/host
Nov 23 09:57:58 np0005532585.localdomain dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/opts
Nov 23 09:57:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:58.332 263258 INFO neutron.agent.dhcp.agent [None req-aa114eb8-c645-45ca-aee4-97d1e9730f4f - - - - - -] DHCP configuration for ports {'6df03061-a46e-4f2d-b42f-4f149f759e31'} is completed
Nov 23 09:57:58 np0005532585.localdomain ceph-mon[300199]: pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 23 09:57:58 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:57:58.480 2 INFO neutron.agent.securitygroups_rpc [None req-d68d9a06-c0e6-4bcf-9e27-a376d467ec2a 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']
Nov 23 09:57:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:58.509 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:57Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c4fdf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c94361f0>], id=b779be61-5809-44a6-8395-bfdf8254b4cc, ip_allocation=immediate, mac_address=fa:16:3e:e3:5d:7d, name=tempest-subport-711090127, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:54Z, description=, dns_domain=, id=8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-881047405, port_security_enabled=True, project_id=a2148c18d8f24a6db12dc22c787e8b2e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7641, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=410, status=ACTIVE, subnets=['62dffd83-97b3-49ae-a870-a9bc062f1cbb'], tags=[], tenant_id=a2148c18d8f24a6db12dc22c787e8b2e, updated_at=2025-11-23T09:57:55Z, vlan_transparent=None, network_id=8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, port_security_enabled=True, project_id=a2148c18d8f24a6db12dc22c787e8b2e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ff44a28d-1e1f-4163-b206-fdf77022bf0b'], standard_attr_id=430, status=DOWN, tags=[], tenant_id=a2148c18d8f24a6db12dc22c787e8b2e, updated_at=2025-11-23T09:57:58Z on network 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3
Nov 23 09:57:58 np0005532585.localdomain dnsmasq[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/addn_hosts - 1 addresses
Nov 23 09:57:58 np0005532585.localdomain dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/host
Nov 23 09:57:58 np0005532585.localdomain dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/opts
Nov 23 09:57:58 np0005532585.localdomain podman[310462]: 2025-11-23 09:57:58.727038535 +0000 UTC m=+0.057615000 container kill ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:57:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:57:59.239 263258 INFO neutron.agent.dhcp.agent [None req-f336ca04-ed79-4114-8df0-7a40d4e366db - - - - - -] DHCP configuration for ports {'b779be61-5809-44a6-8395-bfdf8254b4cc'} is completed
Nov 23 09:57:59 np0005532585.localdomain ceph-mon[300199]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 1.6 KiB/s wr, 14 op/s
Nov 23 09:57:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:57:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:57:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:57:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:57:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:57:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:58:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e91 e91: 6 total, 6 up, 6 in
Nov 23 09:58:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/650142256' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:58:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/650142256' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:58:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:00.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:58:01 np0005532585.localdomain podman[310483]: 2025-11-23 09:58:01.032989284 +0000 UTC m=+0.084250014 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:01 np0005532585.localdomain podman[310483]: 2025-11-23 09:58:01.070086657 +0000 UTC m=+0.121347417 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:58:01 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:58:01 np0005532585.localdomain ceph-mon[300199]: osdmap e91: 6 total, 6 up, 6 in
Nov 23 09:58:01 np0005532585.localdomain ceph-mon[300199]: pgmap v86: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Nov 23 09:58:01 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:01.902 263258 INFO neutron.agent.linux.ip_lib [None req-5c65c8ed-2305-4d00-9b34-36b1bd8ce7dd - - - - - -] Device tapb3d2d8f1-5b cannot be used as it has no MAC address
Nov 23 09:58:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:01.924 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:01 np0005532585.localdomain kernel: device tapb3d2d8f1-5b entered promiscuous mode
Nov 23 09:58:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:01.931 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:01Z|00089|binding|INFO|Claiming lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 for this chassis.
Nov 23 09:58:01 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891881.9320] manager: (tapb3d2d8f1-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Nov 23 09:58:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:01Z|00090|binding|INFO|b3d2d8f1-5bd4-4472-8663-88ab24ce0d37: Claiming unknown
Nov 23 09:58:01 np0005532585.localdomain systemd-udevd[310512]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:58:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:01.945 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '103398a293414a3081333eb24455a6bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f945a9c-d59a-418c-831c-76be9f4ae46a, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b3d2d8f1-5bd4-4472-8663-88ab24ce0d37) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:01.947 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 in datapath dbe92c68-f688-4288-aaf4-6edc728d68bf bound to our chassis
Nov 23 09:58:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:01.950 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6c6697f5-de38-4217-b0e1-a5abd4eb4fe0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:58:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:01.950 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbe92c68-f688-4288-aaf4-6edc728d68bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:01.954 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[37615a6f-3668-4a79-9e2b-72ba9e322bba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:01Z|00091|binding|INFO|Setting lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 ovn-installed in OVS
Nov 23 09:58:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:01Z|00092|binding|INFO|Setting lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 up in Southbound
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:01.970 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:01 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device
Nov 23 09:58:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:02.007 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:02.035 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:58:02 np0005532585.localdomain podman[310546]: 2025-11-23 09:58:02.295326652 +0000 UTC m=+0.088163852 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:58:02 np0005532585.localdomain podman[310546]: 2025-11-23 09:58:02.333392594 +0000 UTC m=+0.126229784 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:58:02 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:58:02 np0005532585.localdomain podman[310605]: 
Nov 23 09:58:02 np0005532585.localdomain podman[310605]: 2025-11-23 09:58:02.857326369 +0000 UTC m=+0.090473983 container create 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:58:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:02 np0005532585.localdomain systemd[1]: Started libpod-conmon-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d.scope.
Nov 23 09:58:02 np0005532585.localdomain podman[310605]: 2025-11-23 09:58:02.81083064 +0000 UTC m=+0.043978284 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:58:02 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:58:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95375b4c449ae49e7d86df09c61dc1d8f60e640130c105c535ef71f0e1c26dfc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:58:02 np0005532585.localdomain podman[310605]: 2025-11-23 09:58:02.935057352 +0000 UTC m=+0.168204956 container init 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:02 np0005532585.localdomain podman[310605]: 2025-11-23 09:58:02.944288894 +0000 UTC m=+0.177436518 container start 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 09:58:02 np0005532585.localdomain dnsmasq[310624]: started, version 2.85 cachesize 150
Nov 23 09:58:02 np0005532585.localdomain dnsmasq[310624]: DNS service limited to local subnets
Nov 23 09:58:02 np0005532585.localdomain dnsmasq[310624]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:58:02 np0005532585.localdomain dnsmasq[310624]: warning: no upstream servers configured
Nov 23 09:58:02 np0005532585.localdomain dnsmasq-dhcp[310624]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:58:02 np0005532585.localdomain dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 0 addresses
Nov 23 09:58:02 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host
Nov 23 09:58:02 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts
Nov 23 09:58:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:03.153 263258 INFO neutron.agent.dhcp.agent [None req-d6a567b3-94f9-4b2f-8762-e28c321912fc - - - - - -] DHCP configuration for ports {'aba6d273-b423-4b6b-b3f0-7ecf19405434'} is completed
Nov 23 09:58:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:03.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:03 np0005532585.localdomain ceph-mon[300199]: pgmap v87: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Nov 23 09:58:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:04.987 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:04Z, description=, device_id=caa365d3-aa93-4c7c-a692-c3fee4872fc2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c946adf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c946a580>], id=9d830597-f26f-4a70-b1b8-39d71caf458e, ip_allocation=immediate, mac_address=fa:16:3e:89:8e:29, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:59Z, description=, dns_domain=, id=dbe92c68-f688-4288-aaf4-6edc728d68bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1552785166-network, port_security_enabled=True, project_id=103398a293414a3081333eb24455a6bd, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20423, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=447, status=ACTIVE, subnets=['c372ea8d-cfdf-4713-bee5-0b10a9ac63ab'], tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:00Z, vlan_transparent=None, network_id=dbe92c68-f688-4288-aaf4-6edc728d68bf, port_security_enabled=False, project_id=103398a293414a3081333eb24455a6bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=492, status=DOWN, tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:04Z on network dbe92c68-f688-4288-aaf4-6edc728d68bf
Nov 23 09:58:05 np0005532585.localdomain dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 1 addresses
Nov 23 09:58:05 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host
Nov 23 09:58:05 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts
Nov 23 09:58:05 np0005532585.localdomain podman[310642]: 2025-11-23 09:58:05.185170336 +0000 UTC m=+0.062902782 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:58:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:05.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:05 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:05.499 263258 INFO neutron.agent.dhcp.agent [None req-74be8323-216f-4023-b121-5a43ec778f23 - - - - - -] DHCP configuration for ports {'9d830597-f26f-4a70-b1b8-39d71caf458e'} is completed
Nov 23 09:58:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:05.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:06 np0005532585.localdomain ceph-mon[300199]: pgmap v88: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.0 KiB/s wr, 39 op/s
Nov 23 09:58:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1362528200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:06.519 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:04Z, description=, device_id=caa365d3-aa93-4c7c-a692-c3fee4872fc2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bf11f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bf12b0>], id=9d830597-f26f-4a70-b1b8-39d71caf458e, ip_allocation=immediate, mac_address=fa:16:3e:89:8e:29, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:59Z, description=, dns_domain=, id=dbe92c68-f688-4288-aaf4-6edc728d68bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1552785166-network, port_security_enabled=True, project_id=103398a293414a3081333eb24455a6bd, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20423, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=447, status=ACTIVE, subnets=['c372ea8d-cfdf-4713-bee5-0b10a9ac63ab'], tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:00Z, vlan_transparent=None, network_id=dbe92c68-f688-4288-aaf4-6edc728d68bf, port_security_enabled=False, project_id=103398a293414a3081333eb24455a6bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=492, status=DOWN, tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:04Z on network dbe92c68-f688-4288-aaf4-6edc728d68bf
Nov 23 09:58:06 np0005532585.localdomain dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 1 addresses
Nov 23 09:58:06 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host
Nov 23 09:58:06 np0005532585.localdomain podman[310681]: 2025-11-23 09:58:06.726000097 +0000 UTC m=+0.062973804 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:58:06 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts
Nov 23 09:58:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:06.759 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:06.997 263258 INFO neutron.agent.dhcp.agent [None req-bfd1472c-9df8-4506-bb99-77d295303bc5 - - - - - -] DHCP configuration for ports {'9d830597-f26f-4a70-b1b8-39d71caf458e'} is completed
Nov 23 09:58:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 e92: 6 total, 6 up, 6 in
Nov 23 09:58:07 np0005532585.localdomain ceph-mon[300199]: pgmap v89: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.0 KiB/s wr, 39 op/s
Nov 23 09:58:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:08 np0005532585.localdomain ceph-mon[300199]: osdmap e92: 6 total, 6 up, 6 in
Nov 23 09:58:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:09.233 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:09.234 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:09 np0005532585.localdomain ceph-mon[300199]: pgmap v91: 177 pgs: 177 active+clean; 170 MiB data, 705 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 628 KiB/s wr, 57 op/s
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.319 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.319 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.319 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:58:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:10.320 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:58:10 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:10.587 2 INFO neutron.agent.securitygroups_rpc [req-3532c496-51d7-40c7-b3da-c0e7be1692a4 req-d87dead5-02e8-46e7-bd25-42d652af07f6 b79ba98acc3c4b3580a3847feb119c9b 103398a293414a3081333eb24455a6bd - - default default] Security group rule updated ['280efa91-c004-412c-b87a-91a6eef9493c']
Nov 23 09:58:10 np0005532585.localdomain sshd[310701]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:58:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:11.001 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:58:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:11.011 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:11.017 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:58:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:11.017 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:58:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3139089852' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:11 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:11.082 2 INFO neutron.agent.securitygroups_rpc [req-ce34b552-7369-4724-9e28-4e57bb3059bd req-8a77dff8-1df4-4326-b30f-4088438850bd b79ba98acc3c4b3580a3847feb119c9b 103398a293414a3081333eb24455a6bd - - default default] Security group rule updated ['280efa91-c004-412c-b87a-91a6eef9493c']
Nov 23 09:58:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:11.217 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:11 np0005532585.localdomain sshd[310701]: Received disconnect from 207.154.194.2 port 32976:11: Bye Bye [preauth]
Nov 23 09:58:11 np0005532585.localdomain sshd[310701]: Disconnected from authenticating user root 207.154.194.2 port 32976 [preauth]
Nov 23 09:58:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:58:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:58:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:58:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157508 "" "Go-http-client/1.1"
Nov 23 09:58:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:58:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:58:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:58:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:58:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19686 "" "Go-http-client/1.1"
Nov 23 09:58:12 np0005532585.localdomain ceph-mon[300199]: pgmap v92: 177 pgs: 177 active+clean; 170 MiB data, 705 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 558 KiB/s wr, 51 op/s
Nov 23 09:58:12 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1218640482' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:12 np0005532585.localdomain systemd[1]: tmp-crun.nJuNvQ.mount: Deactivated successfully.
Nov 23 09:58:12 np0005532585.localdomain podman[310703]: 2025-11-23 09:58:12.088725948 +0000 UTC m=+0.140606223 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:58:12 np0005532585.localdomain podman[310705]: 2025-11-23 09:58:12.099547759 +0000 UTC m=+0.151594059 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, vcs-type=git)
Nov 23 09:58:12 np0005532585.localdomain podman[310705]: 2025-11-23 09:58:12.143346315 +0000 UTC m=+0.195392575 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter)
Nov 23 09:58:12 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:58:12 np0005532585.localdomain podman[310703]: 2025-11-23 09:58:12.170457443 +0000 UTC m=+0.222337708 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:58:12 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:58:12 np0005532585.localdomain podman[310704]: 2025-11-23 09:58:12.149321418 +0000 UTC m=+0.202459432 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 23 09:58:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:12.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:12.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:58:12 np0005532585.localdomain podman[310704]: 2025-11-23 09:58:12.230240858 +0000 UTC m=+0.283378892 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 09:58:12 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:58:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2409305137' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:13 np0005532585.localdomain ceph-mon[300199]: pgmap v93: 177 pgs: 177 active+clean; 170 MiB data, 705 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 558 KiB/s wr, 51 op/s
Nov 23 09:58:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:13.421 263258 INFO neutron.agent.linux.ip_lib [None req-d54484ed-6ef1-4ccc-9b87-1b8f2a65c516 - - - - - -] Device tapd6f3b7ff-1b cannot be used as it has no MAC address
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:13 np0005532585.localdomain kernel: device tapd6f3b7ff-1b entered promiscuous mode
Nov 23 09:58:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:13Z|00093|binding|INFO|Claiming lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 for this chassis.
Nov 23 09:58:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:13Z|00094|binding|INFO|d6f3b7ff-1bfe-4568-bcbd-2732186dba70: Claiming unknown
Nov 23 09:58:13 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891893.5052] manager: (tapd6f3b7ff-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:13 np0005532585.localdomain systemd-udevd[310775]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.517 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:13.515 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0497de4959b2494e8036eb39226430d6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54e00d1b-ba48-40e5-8228-7e38f918fa79, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d6f3b7ff-1bfe-4568-bcbd-2732186dba70) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:13.516 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d6f3b7ff-1bfe-4568-bcbd-2732186dba70 in datapath c5d88dfa-0db8-489e-a45a-e843e31a3b26 bound to our chassis
Nov 23 09:58:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:13.518 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port ebd9d8fb-8dbf-4955-903c-af75d19c361c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:58:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:13.518 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5d88dfa-0db8-489e-a45a-e843e31a3b26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:13.519 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b450dd71-ba31-44a6-a8f3-014f03e6d96c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.533 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:13Z|00095|binding|INFO|Setting lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 ovn-installed in OVS
Nov 23 09:58:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:13Z|00096|binding|INFO|Setting lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 up in Southbound
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.540 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.589 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:13.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:14.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:14.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2677225766' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/227347488' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:14 np0005532585.localdomain podman[310846]: 
Nov 23 09:58:14 np0005532585.localdomain podman[310846]: 2025-11-23 09:58:14.523241812 +0000 UTC m=+0.097531609 container create 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:58:14 np0005532585.localdomain podman[310846]: 2025-11-23 09:58:14.476000989 +0000 UTC m=+0.050290876 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:58:14 np0005532585.localdomain systemd[1]: Started libpod-conmon-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf.scope.
Nov 23 09:58:14 np0005532585.localdomain systemd[1]: tmp-crun.vWVlD8.mount: Deactivated successfully.
Nov 23 09:58:14 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:58:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e36441351384cf4ce607d42d9d2a693f2618154ca741337c33c1071de55a0ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:58:14 np0005532585.localdomain podman[310846]: 2025-11-23 09:58:14.618987174 +0000 UTC m=+0.193276971 container init 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:58:14 np0005532585.localdomain podman[310846]: 2025-11-23 09:58:14.627909877 +0000 UTC m=+0.202199674 container start 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:58:14 np0005532585.localdomain dnsmasq[310864]: started, version 2.85 cachesize 150
Nov 23 09:58:14 np0005532585.localdomain dnsmasq[310864]: DNS service limited to local subnets
Nov 23 09:58:14 np0005532585.localdomain dnsmasq[310864]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:58:14 np0005532585.localdomain dnsmasq[310864]: warning: no upstream servers configured
Nov 23 09:58:14 np0005532585.localdomain dnsmasq-dhcp[310864]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:58:14 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 0 addresses
Nov 23 09:58:14 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:58:14 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:58:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:14.785 263258 INFO neutron.agent.dhcp.agent [None req-96f8d984-f7ca-4ade-ab26-f0f0f3ddceed - - - - - -] DHCP configuration for ports {'a8a61203-fe2e-4005-bcf2-6150709eadea'} is completed
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.234 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1097202376' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:15 np0005532585.localdomain ceph-mon[300199]: pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 52 op/s
Nov 23 09:58:15 np0005532585.localdomain systemd[1]: tmp-crun.2d6Teh.mount: Deactivated successfully.
Nov 23 09:58:15 np0005532585.localdomain dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 0 addresses
Nov 23 09:58:15 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host
Nov 23 09:58:15 np0005532585.localdomain dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts
Nov 23 09:58:15 np0005532585.localdomain podman[310899]: 2025-11-23 09:58:15.557472046 +0000 UTC m=+0.065540752 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:15 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:58:15 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1789582883' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.742 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:15Z|00097|binding|INFO|Releasing lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 from this chassis (sb_readonly=0)
Nov 23 09:58:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:15Z|00098|binding|INFO|Setting lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 down in Southbound
Nov 23 09:58:15 np0005532585.localdomain kernel: device tapb3d2d8f1-5b left promiscuous mode
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.784 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:15.792 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '103398a293414a3081333eb24455a6bd', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f945a9c-d59a-418c-831c-76be9f4ae46a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b3d2d8f1-5bd4-4472-8663-88ab24ce0d37) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:15.793 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 in datapath dbe92c68-f688-4288-aaf4-6edc728d68bf unbound from our chassis
Nov 23 09:58:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:15.795 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbe92c68-f688-4288-aaf4-6edc728d68bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:15.799 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ba65c446-ac5d-4822-803f-31fb7c072ee4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:15 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:15.858 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:15Z, description=, device_id=489975b8-b64c-4318-bb24-a798d93046de, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7ffa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7fe20>], id=a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e, ip_allocation=immediate, mac_address=fa:16:3e:fb:9f:ce, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:11Z, description=, dns_domain=, id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1246892017-network, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8214, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=532, status=ACTIVE, subnets=['c6867cf5-ed63-4cec-9755-38eaead8ab16'], tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:11Z, vlan_transparent=None, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=False, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=586, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:15Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.878 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:58:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:15.879 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.054 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.055 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11375MB free_disk=41.774322509765625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.056 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.056 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:16 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 1 addresses
Nov 23 09:58:16 np0005532585.localdomain systemd[1]: tmp-crun.OItiOf.mount: Deactivated successfully.
Nov 23 09:58:16 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:58:16 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:58:16 np0005532585.localdomain podman[310941]: 2025-11-23 09:58:16.087932551 +0000 UTC m=+0.071883956 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.277 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.278 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.279 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:58:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:16.350 263258 INFO neutron.agent.dhcp.agent [None req-955c50bf-858a-4c6e-af2e-94778d49fba6 - - - - - -] DHCP configuration for ports {'a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e'} is completed
Nov 23 09:58:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1789582883' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:16.609 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:58:17 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3516973189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.053 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.060 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.077 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.095 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.096 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 09:58:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:17.230 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 09:58:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:17.859 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:15Z, description=, device_id=489975b8-b64c-4318-bb24-a798d93046de, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6f70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd61c0>], id=a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e, ip_allocation=immediate, mac_address=fa:16:3e:fb:9f:ce, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:11Z, description=, dns_domain=, id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1246892017-network, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8214, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=532, status=ACTIVE, subnets=['c6867cf5-ed63-4cec-9755-38eaead8ab16'], tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:11Z, vlan_transparent=None, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=False, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=586, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:15Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26
Nov 23 09:58:18 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 1 addresses
Nov 23 09:58:18 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:58:18 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:58:18 np0005532585.localdomain podman[310999]: 2025-11-23 09:58:18.228940704 +0000 UTC m=+0.200285706 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:58:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:18 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3516973189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:18 np0005532585.localdomain ceph-mon[300199]: pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 52 op/s
Nov 23 09:58:18 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:18.397 263258 INFO neutron.agent.dhcp.agent [None req-8185762f-af93-4602-9a04-39972f389d15 - - - - - -] DHCP configuration for ports {'a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e'} is completed
Nov 23 09:58:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:18Z|00099|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:58:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:18.724 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:19.193 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:19 np0005532585.localdomain dnsmasq[310624]: exiting on receipt of SIGTERM
Nov 23 09:58:19 np0005532585.localdomain systemd[1]: tmp-crun.msWSXe.mount: Deactivated successfully.
Nov 23 09:58:19 np0005532585.localdomain systemd[1]: libpod-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d.scope: Deactivated successfully.
Nov 23 09:58:19 np0005532585.localdomain podman[311037]: 2025-11-23 09:58:19.45943014 +0000 UTC m=+0.059337112 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 09:58:19 np0005532585.localdomain podman[311051]: 2025-11-23 09:58:19.52692046 +0000 UTC m=+0.054380310 container died 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:58:19 np0005532585.localdomain podman[311051]: 2025-11-23 09:58:19.561525036 +0000 UTC m=+0.088984846 container cleanup 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 09:58:19 np0005532585.localdomain systemd[1]: libpod-conmon-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d.scope: Deactivated successfully.
Nov 23 09:58:19 np0005532585.localdomain podman[311056]: 2025-11-23 09:58:19.612105831 +0000 UTC m=+0.129762563 container remove 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:58:19 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:19.958 263258 INFO neutron.agent.dhcp.agent [None req-13acc26b-0b5a-47d5-91ff-55db475188c1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:58:19 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:19.959 263258 INFO neutron.agent.dhcp.agent [None req-13acc26b-0b5a-47d5-91ff-55db475188c1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:58:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:20.332 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:58:20 np0005532585.localdomain ceph-mon[300199]: pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 81 op/s
Nov 23 09:58:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-95375b4c449ae49e7d86df09c61dc1d8f60e640130c105c535ef71f0e1c26dfc-merged.mount: Deactivated successfully.
Nov 23 09:58:20 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d-userdata-shm.mount: Deactivated successfully.
Nov 23 09:58:20 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2ddbe92c68\x2df688\x2d4288\x2daaf4\x2d6edc728d68bf.mount: Deactivated successfully.
Nov 23 09:58:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:21.017 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:21 np0005532585.localdomain ceph-mon[300199]: pgmap v97: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.1 MiB/s rd, 1.3 MiB/s wr, 58 op/s
Nov 23 09:58:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:21.562 2 INFO neutron.agent.securitygroups_rpc [None req-513d9ac5-08dd-4555-997f-809230181da7 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group rule updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']
Nov 23 09:58:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:21.713 2 INFO neutron.agent.securitygroups_rpc [None req-7741ab62-6798-4d09-b205-555af43d015d 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group rule updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']
Nov 23 09:58:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:58:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:58:22 np0005532585.localdomain podman[311083]: 2025-11-23 09:58:22.041929333 +0000 UTC m=+0.094243009 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:58:22 np0005532585.localdomain podman[311083]: 2025-11-23 09:58:22.081251533 +0000 UTC m=+0.133565239 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:58:22 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:58:22 np0005532585.localdomain podman[311082]: 2025-11-23 09:58:22.136557222 +0000 UTC m=+0.191367924 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 23 09:58:22 np0005532585.localdomain podman[311082]: 2025-11-23 09:58:22.178476722 +0000 UTC m=+0.233287454 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:22 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:58:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:23 np0005532585.localdomain ceph-mon[300199]: pgmap v98: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.1 MiB/s rd, 1.3 MiB/s wr, 58 op/s
Nov 23 09:58:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:23.794 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:25.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:58:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:25.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 09:58:25 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:25.331 2 INFO neutron.agent.securitygroups_rpc [None req-ec9f2257-2897-484b-a0ca-c8a73a80ef4d 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']
Nov 23 09:58:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:26.019 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:26 np0005532585.localdomain ceph-mon[300199]: pgmap v99: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 1.3 MiB/s wr, 85 op/s
Nov 23 09:58:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1353440912' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:26 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:26.565 2 INFO neutron.agent.securitygroups_rpc [req-dc62ce45-8668-47e6-9d5e-2f0b1764537e req-34d4dcd5-73f6-46e0-ba5e-aabbd18e768e 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group member updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']
Nov 23 09:58:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:26.636 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:26Z, description=, device_id=1148b5a9-4da9-491f-8952-80c4a965fe6b, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bf1550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bf1580>], id=a1846659-6b91-4156-9939-085b30454143, ip_allocation=immediate, mac_address=fa:16:3e:da:90:40, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:11Z, description=, dns_domain=, id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1246892017-network, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8214, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=532, status=ACTIVE, subnets=['c6867cf5-ed63-4cec-9755-38eaead8ab16'], tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:11Z, vlan_transparent=None, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['2da1104f-77c5-475e-b21f-e52710edc8b5'], standard_attr_id=648, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:26Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26
Nov 23 09:58:26 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 2 addresses
Nov 23 09:58:26 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:58:26 np0005532585.localdomain podman[311139]: 2025-11-23 09:58:26.849632959 +0000 UTC m=+0.060220469 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:58:26 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:58:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:27.143 263258 INFO neutron.agent.dhcp.agent [None req-ac2ce650-6b75-4265-9722-02c2ce37a04c - - - - - -] DHCP configuration for ports {'a1846659-6b91-4156-9939-085b30454143'} is completed
Nov 23 09:58:27 np0005532585.localdomain ceph-mon[300199]: pgmap v100: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 63 op/s
Nov 23 09:58:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:27.625 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005532584.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:26Z, description=, device_id=1148b5a9-4da9-491f-8952-80c4a965fe6b, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c94107c0>], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9410070>], id=a1846659-6b91-4156-9939-085b30454143, ip_allocation=immediate, mac_address=fa:16:3e:da:90:40, name=, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['2da1104f-77c5-475e-b21f-e52710edc8b5'], standard_attr_id=648, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:27Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26
Nov 23 09:58:27 np0005532585.localdomain podman[311177]: 2025-11-23 09:58:27.761940332 +0000 UTC m=+0.029073140 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 09:58:27 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 2 addresses
Nov 23 09:58:27 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:58:27 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:58:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:27.973 263258 INFO neutron.agent.dhcp.agent [None req-68f7a82a-282a-4552-839c-b6eeea7989c5 - - - - - -] DHCP configuration for ports {'a1846659-6b91-4156-9939-085b30454143'} is completed
Nov 23 09:58:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:28.188 263258 INFO neutron.agent.linux.ip_lib [None req-69d1f747-8ab7-41e5-a631-855da9356272 - - - - - -] Device tapca98d0dd-23 cannot be used as it has no MAC address
Nov 23 09:58:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:28.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:28 np0005532585.localdomain kernel: device tapca98d0dd-23 entered promiscuous mode
Nov 23 09:58:28 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891908.2182] manager: (tapca98d0dd-23): new Generic device (/org/freedesktop/NetworkManager/Devices/23)
Nov 23 09:58:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:28.218 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:28Z|00100|binding|INFO|Claiming lport ca98d0dd-231a-46c7-80b8-a48c00a5696e for this chassis.
Nov 23 09:58:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:28Z|00101|binding|INFO|ca98d0dd-231a-46c7-80b8-a48c00a5696e: Claiming unknown
Nov 23 09:58:28 np0005532585.localdomain systemd-udevd[311209]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:58:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:28.235 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-903951dd-448c-4453-aa24-f24a53269074', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ca98d0dd-231a-46c7-80b8-a48c00a5696e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:28.238 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ca98d0dd-231a-46c7-80b8-a48c00a5696e in datapath 903951dd-448c-4453-aa24-f24a53269074 bound to our chassis
Nov 23 09:58:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:28.240 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 903951dd-448c-4453-aa24-f24a53269074 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:58:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:28.241 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdec47d-0554-49bc-a003-b46997fef3d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:28Z|00102|binding|INFO|Setting lport ca98d0dd-231a-46c7-80b8-a48c00a5696e ovn-installed in OVS
Nov 23 09:58:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:28Z|00103|binding|INFO|Setting lport ca98d0dd-231a-46c7-80b8-a48c00a5696e up in Southbound
Nov 23 09:58:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:28.292 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapca98d0dd-23: No such device
Nov 23 09:58:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:28.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:28.356 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2236451100' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:29 np0005532585.localdomain podman[311280]: 2025-11-23 09:58:29.246077082 +0000 UTC m=+0.088000969 container create d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:29 np0005532585.localdomain systemd[1]: Started libpod-conmon-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814.scope.
Nov 23 09:58:29 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:58:29 np0005532585.localdomain podman[311280]: 2025-11-23 09:58:29.204137101 +0000 UTC m=+0.046061028 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:58:29 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50b27a1dbab67616f24ed7868948ddd418eee50493d781bdc989237aff316f07/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:58:29 np0005532585.localdomain podman[311280]: 2025-11-23 09:58:29.318087109 +0000 UTC m=+0.160010996 container init d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:58:29 np0005532585.localdomain podman[311280]: 2025-11-23 09:58:29.328171568 +0000 UTC m=+0.170095465 container start d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:58:29 np0005532585.localdomain dnsmasq[311299]: started, version 2.85 cachesize 150
Nov 23 09:58:29 np0005532585.localdomain dnsmasq[311299]: DNS service limited to local subnets
Nov 23 09:58:29 np0005532585.localdomain dnsmasq[311299]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:58:29 np0005532585.localdomain dnsmasq[311299]: warning: no upstream servers configured
Nov 23 09:58:29 np0005532585.localdomain dnsmasq-dhcp[311299]: DHCP, static leases only on 19.80.0.0, lease time 1d
Nov 23 09:58:29 np0005532585.localdomain dnsmasq[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/addn_hosts - 0 addresses
Nov 23 09:58:29 np0005532585.localdomain dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/host
Nov 23 09:58:29 np0005532585.localdomain dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/opts
Nov 23 09:58:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:29.470 263258 INFO neutron.agent.dhcp.agent [None req-27a4d57d-c3ab-4387-a248-7e72d81e4b29 - - - - - -] DHCP configuration for ports {'b83bb60d-d579-4f8d-9e2c-3885d238bb26'} is completed
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.624 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.625 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.645 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.738 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.739 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.745 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.745 281956 INFO nova.compute.claims [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Claim successful on node np0005532585.localdomain
Nov 23 09:58:29 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:29.796 2 INFO neutron.agent.securitygroups_rpc [None req-2d14bee2-d335-42d8-9b8c-ccacbe55654b 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']
Nov 23 09:58:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:29.827 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8becb80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b8e280>], id=fd30dda9-c731-47dd-b319-ebcca717b708, ip_allocation=immediate, mac_address=fa:16:3e:4f:95:ad, name=tempest-subport-1587702031, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:25Z, description=, dns_domain=, id=903951dd-448c-4453-aa24-f24a53269074, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-497326985, port_security_enabled=True, project_id=253c88568a634476a6c1284eed6a9464, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35137, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=646, status=ACTIVE, subnets=['2620a714-202e-4ac3-ab2a-226dc050a669'], tags=[], tenant_id=253c88568a634476a6c1284eed6a9464, updated_at=2025-11-23T09:58:27Z, vlan_transparent=None, network_id=903951dd-448c-4453-aa24-f24a53269074, port_security_enabled=True, project_id=253c88568a634476a6c1284eed6a9464, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a3350144-9b09-432b-a32e-ef84bb8bf494'], standard_attr_id=652, status=DOWN, tags=[], tenant_id=253c88568a634476a6c1284eed6a9464, updated_at=2025-11-23T09:58:29Z on network 903951dd-448c-4453-aa24-f24a53269074
Nov 23 09:58:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:29.882 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:29 np0005532585.localdomain ceph-mon[300199]: pgmap v101: 177 pgs: 177 active+clean; 238 MiB data, 838 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 1.8 MiB/s wr, 106 op/s
Nov 23 09:58:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2868171599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:29 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 09:58:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:58:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:58:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:58:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:58:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:58:30 np0005532585.localdomain dnsmasq[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/addn_hosts - 1 addresses
Nov 23 09:58:30 np0005532585.localdomain dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/host
Nov 23 09:58:30 np0005532585.localdomain dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/opts
Nov 23 09:58:30 np0005532585.localdomain podman[311320]: 2025-11-23 09:58:30.097978769 +0000 UTC m=+0.057924419 container kill d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:30.331 263258 INFO neutron.agent.dhcp.agent [None req-62ce58fd-b234-459e-b227-fbd75f57ba4f - - - - - -] DHCP configuration for ports {'fd30dda9-c731-47dd-b319-ebcca717b708'} is completed
Nov 23 09:58:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:58:30 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3630223276' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.400 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.407 281956 DEBUG nova.compute.provider_tree [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.423 281956 DEBUG nova.scheduler.client.report [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.458 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.459 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.520 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.535 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.552 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.629 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.632 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.632 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating image(s)
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.673 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.717 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.757 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.761 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "ba971d9ef74673015953b46ad4dbea47e54dd66a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.762 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "ba971d9ef74673015953b46ad4dbea47e54dd66a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:30.800 281956 DEBUG nova.virt.libvirt.imagebackend [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Image locations are: [{'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/c5806483-57a8-4254-b41b-254b888c8606/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/c5806483-57a8-4254-b41b-254b888c8606/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Nov 23 09:58:30 np0005532585.localdomain sshd[311417]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:58:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3630223276' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:31.056 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:31 np0005532585.localdomain sshd[311417]: Received disconnect from 107.172.15.139 port 48954:11: Bye Bye [preauth]
Nov 23 09:58:31 np0005532585.localdomain sshd[311417]: Disconnected from authenticating user root 107.172.15.139 port 48954 [preauth]
Nov 23 09:58:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:58:31 np0005532585.localdomain podman[311419]: 2025-11-23 09:58:31.353001994 +0000 UTC m=+0.091881406 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:58:31 np0005532585.localdomain podman[311419]: 2025-11-23 09:58:31.372359926 +0000 UTC m=+0.111239318 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:58:31 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:58:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:31.777 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:31.851 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:31.852 281956 DEBUG nova.virt.images [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] c5806483-57a8-4254-b41b-254b888c8606 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 23 09:58:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:31.854 281956 DEBUG nova.privsep.utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 23 09:58:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:31.855 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:32 np0005532585.localdomain ceph-mon[300199]: pgmap v102: 177 pgs: 177 active+clean; 238 MiB data, 838 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 1.8 MiB/s wr, 69 op/s
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.097 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.102 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.176 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.177 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "ba971d9ef74673015953b46ad4dbea47e54dd66a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.209 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.215 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.829 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.614s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:58:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:32.942 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] resizing rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Nov 23 09:58:33 np0005532585.localdomain podman[311526]: 2025-11-23 09:58:33.025781213 +0000 UTC m=+0.084677316 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 09:58:33 np0005532585.localdomain podman[311526]: 2025-11-23 09:58:33.068427915 +0000 UTC m=+0.127324018 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:58:33 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.122 281956 DEBUG nova.objects.instance [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.138 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.139 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Ensure instance console log exists: /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.139 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.140 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.140 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.143 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T09:56:45Z,direct_url=<?>,disk_format='qcow2',id=c5806483-57a8-4254-b41b-254b888c8606,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1915d3e5d4254231a0517e2dcf35848f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T09:56:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'image_id': 'c5806483-57a8-4254-b41b-254b888c8606'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.148 281956 WARNING nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.151 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.152 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.153 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.154 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.154 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.155 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T09:56:44Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43b374b4-75d9-47f9-aa6b-ddb1a45f7c04',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T09:56:45Z,direct_url=<?>,disk_format='qcow2',id=c5806483-57a8-4254-b41b-254b888c8606,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1915d3e5d4254231a0517e2dcf35848f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T09:56:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.156 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.156 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.156 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.157 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.157 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.158 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.158 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.159 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.159 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.160 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.164 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:33 np0005532585.localdomain sudo[311585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:58:33 np0005532585.localdomain sudo[311585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:58:33 np0005532585.localdomain sudo[311585]: pam_unix(sudo:session): session closed for user root
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:33 np0005532585.localdomain sudo[311604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:58:33 np0005532585.localdomain sudo[311604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: pgmap v103: 177 pgs: 177 active+clean; 238 MiB data, 838 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 1.8 MiB/s wr, 69 op/s
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.463841) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913463879, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2462, "num_deletes": 251, "total_data_size": 3704725, "memory_usage": 3757000, "flush_reason": "Manual Compaction"}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913476317, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2394639, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18758, "largest_seqno": 21215, "table_properties": {"data_size": 2385727, "index_size": 5545, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19315, "raw_average_key_size": 21, "raw_value_size": 2367399, "raw_average_value_size": 2576, "num_data_blocks": 239, "num_entries": 919, "num_filter_entries": 919, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891729, "oldest_key_time": 1763891729, "file_creation_time": 1763891913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 12535 microseconds, and 6484 cpu microseconds.
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.476370) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2394639 bytes OK
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.476394) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.478756) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.478775) EVENT_LOG_v1 {"time_micros": 1763891913478769, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.478798) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3693768, prev total WAL file size 3693768, number of live WAL files 2.
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.479755) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2338KB)], [33(15MB)]
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913479799, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 18833343, "oldest_snapshot_seqno": -1}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12164 keys, 16882661 bytes, temperature: kUnknown
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913556083, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 16882661, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16814736, "index_size": 36400, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 328801, "raw_average_key_size": 27, "raw_value_size": 16608602, "raw_average_value_size": 1365, "num_data_blocks": 1365, "num_entries": 12164, "num_filter_entries": 12164, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.556376) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 16882661 bytes
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.558154) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 246.6 rd, 221.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 15.7 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(14.9) write-amplify(7.1) OK, records in: 12696, records dropped: 532 output_compression: NoCompression
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.558190) EVENT_LOG_v1 {"time_micros": 1763891913558174, "job": 18, "event": "compaction_finished", "compaction_time_micros": 76378, "compaction_time_cpu_micros": 44428, "output_level": 6, "num_output_files": 1, "total_output_size": 16882661, "num_input_records": 12696, "num_output_records": 12164, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913558693, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913560928, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.479709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 09:58:33 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3874030768' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.637 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.677 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:33.682 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:34 np0005532585.localdomain sudo[311604]: pam_unix(sudo:session): session closed for user root
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3441935928' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.159 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.160 281956 DEBUG nova.objects.instance [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.169 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] End _get_guest_xml xml=<domain type="kvm">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <uuid>8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7</uuid>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <name>instance-00000009</name>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <memory>131072</memory>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <vcpu>1</vcpu>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <metadata>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-2005076685</nova:name>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:creationTime>2025-11-23 09:58:33</nova:creationTime>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:flavor name="m1.nano">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:memory>128</nova:memory>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:disk>1</nova:disk>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:swap>0</nova:swap>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:ephemeral>0</nova:ephemeral>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:vcpus>1</nova:vcpus>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       </nova:flavor>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:owner>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:user uuid="55581f20ed8d4bd8a61a81c525ca8141">tempest-UnshelveToHostMultiNodesTest-612486733-project-member</nova:user>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <nova:project uuid="37a58b702f564a81ab5a59cf4201b4f0">tempest-UnshelveToHostMultiNodesTest-612486733</nova:project>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       </nova:owner>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:root type="image" uuid="c5806483-57a8-4254-b41b-254b888c8606"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <nova:ports/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </nova:instance>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </metadata>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <sysinfo type="smbios">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <system>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <entry name="manufacturer">RDO</entry>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <entry name="product">OpenStack Compute</entry>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <entry name="serial">8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7</entry>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <entry name="uuid">8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7</entry>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <entry name="family">Virtual Machine</entry>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </system>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </sysinfo>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <os>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <boot dev="hd"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <smbios mode="sysinfo"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <acpi/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <apic/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <clock offset="utc">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <timer name="pit" tickpolicy="delay"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <timer name="hpet" present="no"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </clock>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <cpu mode="host-model" match="exact">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <topology sockets="1" cores="1" threads="1"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <disk type="network" device="disk">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <driver type="raw" cache="none"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <source protocol="rbd" name="vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.103" port="6789"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.104" port="6789"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.105" port="6789"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       </source>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <auth username="openstack">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <secret type="ceph" uuid="46550e70-79cb-5f55-bf6d-1204b97e083b"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       </auth>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <target dev="vda" bus="virtio"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <disk type="network" device="cdrom">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <driver type="raw" cache="none"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <source protocol="rbd" name="vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.103" port="6789"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.104" port="6789"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.105" port="6789"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       </source>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <auth username="openstack">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:         <secret type="ceph" uuid="46550e70-79cb-5f55-bf6d-1204b97e083b"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       </auth>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <target dev="sda" bus="sata"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <serial type="pty">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <log file="/var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/console.log" append="off"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </serial>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <video>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <model type="virtio"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <input type="tablet" bus="usb"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <rng model="virtio">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <backend model="random">/dev/urandom</backend>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <controller type="usb" index="0"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     <memballoon model="virtio">
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:       <stats period="10"/>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:     </memballoon>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: </domain>
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.205 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.206 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.206 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Using config drive
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.236 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.279 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating config drive at /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.282 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp101umq3_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:34 np0005532585.localdomain sudo[311734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:58:34 np0005532585.localdomain sudo[311734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:58:34 np0005532585.localdomain sudo[311734]: pam_unix(sudo:session): session closed for user root
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.412 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp101umq3_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4212683681' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3874030768' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3441935928' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/200617908' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4042067926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.458 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.467 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.671 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:58:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:34.673 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting local config drive /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config because it was imported into RBD.
Nov 23 09:58:34 np0005532585.localdomain systemd[1]: Started libvirt secret daemon.
Nov 23 09:58:34 np0005532585.localdomain systemd-machined[84275]: New machine qemu-3-instance-00000009.
Nov 23 09:58:34 np0005532585.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.255 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891915.2544425, 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.255 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Resumed (Lifecycle Event)
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.257 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.258 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.261 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance spawned successfully.
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.261 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.275 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.282 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.287 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.288 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.288 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.289 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.290 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.291 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.365 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.366 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891915.2547896, 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.366 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Started (Lifecycle Event)
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.394 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.398 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.402 281956 INFO nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Took 4.77 seconds to spawn the instance on the hypervisor.
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.403 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.425 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.461 281956 INFO nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Took 5.75 seconds to build instance.
Nov 23 09:58:35 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1229603635' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:35 np0005532585.localdomain ceph-mon[300199]: pgmap v104: 177 pgs: 177 active+clean; 317 MiB data, 995 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 5.7 MiB/s wr, 169 op/s
Nov 23 09:58:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:35.477 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.090 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:58:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:36.099 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:37.357 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:37.358 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:37.358 281956 INFO nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelving
Nov 23 09:58:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:37.386 281956 DEBUG nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 23 09:58:37 np0005532585.localdomain ceph-mon[300199]: pgmap v105: 177 pgs: 177 active+clean; 317 MiB data, 995 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 5.7 MiB/s wr, 142 op/s
Nov 23 09:58:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:38 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3145656533' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:58:39 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2888214571' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:58:39 np0005532585.localdomain ceph-mon[300199]: pgmap v106: 177 pgs: 177 active+clean; 285 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 7.6 MiB/s rd, 7.5 MiB/s wr, 334 op/s
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.127 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 09:58:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:41.135 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:41.290 2 INFO neutron.agent.securitygroups_rpc [None req-b6d2f56d-2805-44c2-9e36-7ffa8fc09e14 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']
Nov 23 09:58:41 np0005532585.localdomain ceph-mon[300199]: pgmap v107: 177 pgs: 177 active+clean; 285 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.7 MiB/s wr, 292 op/s
Nov 23 09:58:41 np0005532585.localdomain dnsmasq[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/addn_hosts - 0 addresses
Nov 23 09:58:41 np0005532585.localdomain dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/host
Nov 23 09:58:41 np0005532585.localdomain podman[311885]: 2025-11-23 09:58:41.554658603 +0000 UTC m=+0.044953744 container kill ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:41 np0005532585.localdomain dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/opts
Nov 23 09:58:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:58:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:58:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:58:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159330 "" "Go-http-client/1.1"
Nov 23 09:58:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:58:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20180 "" "Go-http-client/1.1"
Nov 23 09:58:41 np0005532585.localdomain systemd[1]: tmp-crun.K7OmSL.mount: Deactivated successfully.
Nov 23 09:58:41 np0005532585.localdomain dnsmasq[310446]: exiting on receipt of SIGTERM
Nov 23 09:58:41 np0005532585.localdomain podman[311925]: 2025-11-23 09:58:41.994225903 +0000 UTC m=+0.081364575 container kill ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:58:41 np0005532585.localdomain systemd[1]: libpod-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8.scope: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain podman[311939]: 2025-11-23 09:58:42.049986725 +0000 UTC m=+0.041908641 container died ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:58:42 np0005532585.localdomain podman[311939]: 2025-11-23 09:58:42.080420584 +0000 UTC m=+0.072342420 container cleanup ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: libpod-conmon-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8.scope: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain podman[311940]: 2025-11-23 09:58:42.121692634 +0000 UTC m=+0.100900431 container remove ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:58:42 np0005532585.localdomain kernel: device tap8c439e83-e9 left promiscuous mode
Nov 23 09:58:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:42Z|00104|binding|INFO|Releasing lport 8c439e83-e972-4e99-8d01-ff5269427a3c from this chassis (sb_readonly=0)
Nov 23 09:58:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:42Z|00105|binding|INFO|Setting lport 8c439e83-e972-4e99-8d01-ff5269427a3c down in Southbound
Nov 23 09:58:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:42.133 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:42.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:42.156 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2148c18d8f24a6db12dc22c787e8b2e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e3b2035-d1e3-4dc9-824d-c8c5d8c83090, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8c439e83-e972-4e99-8d01-ff5269427a3c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:42.157 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8c439e83-e972-4e99-8d01-ff5269427a3c in datapath 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3 unbound from our chassis
Nov 23 09:58:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:42.160 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:42.161 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ba603f-c97d-4b7d-8d6d-57a09f6e73a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:42.184 263258 INFO neutron.agent.dhcp.agent [None req-4b70179b-86c0-4734-a9cb-8e744eb6604f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:58:42 np0005532585.localdomain podman[311969]: 2025-11-23 09:58:42.524097869 +0000 UTC m=+0.079064355 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 23 09:58:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:58:42.530 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:58:42 np0005532585.localdomain podman[311967]: 2025-11-23 09:58:42.536292512 +0000 UTC m=+0.089483183 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:42 np0005532585.localdomain podman[311969]: 2025-11-23 09:58:42.539144509 +0000 UTC m=+0.094110995 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, distribution-scope=public, io.openshift.expose-services=)
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-067ea74ee71831860e0346c8368bf66a295b3b34ee14dc8629622033be014e0a-merged.mount: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8-userdata-shm.mount: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d8cd987c4\x2d7e4e\x2d467f\x2d9ee2\x2dd70cb75b87c3.mount: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain podman[311967]: 2025-11-23 09:58:42.607277269 +0000 UTC m=+0.160467950 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: tmp-crun.JvAJB8.mount: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain podman[311968]: 2025-11-23 09:58:42.641126022 +0000 UTC m=+0.195291113 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 09:58:42 np0005532585.localdomain podman[311968]: 2025-11-23 09:58:42.649306512 +0000 UTC m=+0.203471553 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 09:58:42 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:58:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:42Z|00106|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:58:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:42.969 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:42.972 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:42.973 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:58:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:42.975 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:43 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:43.305 2 INFO neutron.agent.securitygroups_rpc [None req-50406108-6fe1-4d79-842a-8b928e46e646 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']
Nov 23 09:58:43 np0005532585.localdomain ceph-mon[300199]: pgmap v108: 177 pgs: 177 active+clean; 285 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.7 MiB/s wr, 292 op/s
Nov 23 09:58:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:43.783 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Creating tmpfile /var/lib/nova/instances/tmp2mwvq3bv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 23 09:58:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:43.812 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 23 09:58:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:43.839 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:58:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:43.840 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:58:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:43.849 281956 INFO nova.compute.rpcapi [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 23 09:58:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:43.850 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:58:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:44.485 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 23 09:58:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:44.504 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:58:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:44.504 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquired lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:58:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:44.505 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.366 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Updating instance_info_cache with network_info: [{"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.384 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Releasing lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.388 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.389 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Creating instance directory: /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.390 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Ensure instance console log exists: /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.390 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.393 281956 DEBUG nova.virt.libvirt.vif [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T09:58:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1576780525',display_name='tempest-LiveMigrationTest-server-1576780525',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005532584.localdomain',hostname='tempest-livemigrationtest-server-1576780525',id=10,image_ref='c5806483-57a8-4254-b41b-254b888c8606',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T09:58:41Z,launched_on='np0005532584.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005532584.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='253c88568a634476a6c1284eed6a9464',ramdisk_id='',reservation_id='r-dvg5v145',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c5806483-57a8-4254-b41b-254b888c8606',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1889456510',owner_user_name='tempest-LiveMigrationTest-1889456510-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T09:58:41Z,user_data=None,user_id='7f7875c0084c46fdb2e7b37e4fc44faf',uuid=8f62292f-5719-4b19-9188-3715b94493a7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.395 281956 DEBUG nova.network.os_vif_util [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Converting VIF {"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.397 281956 DEBUG nova.network.os_vif_util [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.398 281956 DEBUG os_vif [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.400 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.401 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.408 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap737e82a6-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap737e82a6-26, col_values=(('external_ids', {'iface-id': '737e82a6-2634-47df-b8a7-ec21a927cc3f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:21:74', 'vm-uuid': '8f62292f-5719-4b19-9188-3715b94493a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.425 281956 INFO os_vif [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26')
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.427 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 23 09:58:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:45.428 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 23 09:58:45 np0005532585.localdomain ceph-mon[300199]: pgmap v109: 177 pgs: 177 active+clean; 285 MiB data, 921 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 5.7 MiB/s wr, 365 op/s
Nov 23 09:58:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:46.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:47 np0005532585.localdomain ceph-mon[300199]: pgmap v110: 177 pgs: 177 active+clean; 285 MiB data, 921 MiB used, 41 GiB / 42 GiB avail; 5.8 MiB/s rd, 1.8 MiB/s wr, 265 op/s
Nov 23 09:58:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:47.451 281956 DEBUG nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 23 09:58:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:47.668 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Port 737e82a6-2634-47df-b8a7-ec21a927cc3f updated with migration profile {'migrating_to': 'np0005532585.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 23 09:58:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:47.671 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 23 09:58:47 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:47Z|00107|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:58:47 np0005532585.localdomain sshd[312028]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:58:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:47.845 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:47 np0005532585.localdomain sshd[312028]: Accepted publickey for nova from 172.17.0.106 port 48704 ssh2: ECDSA SHA256:wMzEbOO1fjjG+LWUd7n2rjnVQ6QTBNm58LW+/BVKalo
Nov 23 09:58:47 np0005532585.localdomain systemd-logind[761]: New session 72 of user nova.
Nov 23 09:58:47 np0005532585.localdomain systemd[1]: Created slice User Slice of UID 42436.
Nov 23 09:58:47 np0005532585.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 23 09:58:47 np0005532585.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 23 09:58:47 np0005532585.localdomain systemd[1]: Starting User Manager for UID 42436...
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Queued start job for default target Main User Target.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Created slice User Application Slice.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Reached target Paths.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Reached target Timers.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Starting D-Bus User Message Bus Socket...
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Starting Create User's Volatile Files and Directories...
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Listening on D-Bus User Message Bus Socket.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Reached target Sockets.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Finished Create User's Volatile Files and Directories.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Reached target Basic System.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Reached target Main User Target.
Nov 23 09:58:48 np0005532585.localdomain systemd[312032]: Startup finished in 136ms.
Nov 23 09:58:48 np0005532585.localdomain systemd[1]: Started User Manager for UID 42436.
Nov 23 09:58:48 np0005532585.localdomain systemd[1]: Started Session 72 of User nova.
Nov 23 09:58:48 np0005532585.localdomain sshd[312028]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Nov 23 09:58:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891928.3760] manager: (tap737e82a6-26): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Nov 23 09:58:48 np0005532585.localdomain kernel: device tap737e82a6-26 entered promiscuous mode
Nov 23 09:58:48 np0005532585.localdomain systemd-udevd[312061]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:58:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891928.4008] device (tap737e82a6-26): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 09:58:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891928.4019] device (tap737e82a6-26): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 23 09:58:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:48Z|00108|binding|INFO|Claiming lport 737e82a6-2634-47df-b8a7-ec21a927cc3f for this additional chassis.
Nov 23 09:58:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:48.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:48Z|00109|binding|INFO|737e82a6-2634-47df-b8a7-ec21a927cc3f: Claiming fa:16:3e:da:21:74 10.100.0.10
Nov 23 09:58:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:48Z|00110|binding|INFO|Claiming lport fd30dda9-c731-47dd-b319-ebcca717b708 for this additional chassis.
Nov 23 09:58:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:48Z|00111|binding|INFO|fd30dda9-c731-47dd-b319-ebcca717b708: Claiming fa:16:3e:4f:95:ad 19.80.0.95
Nov 23 09:58:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:48Z|00112|binding|INFO|Setting lport 737e82a6-2634-47df-b8a7-ec21a927cc3f ovn-installed in OVS
Nov 23 09:58:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:48.444 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:48 np0005532585.localdomain systemd-machined[84275]: New machine qemu-4-instance-0000000a.
Nov 23 09:58:48 np0005532585.localdomain systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Nov 23 09:58:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:48.792 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891928.7916596, 8f62292f-5719-4b19-9188-3715b94493a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:58:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:48.794 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] VM Started (Lifecycle Event)
Nov 23 09:58:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:48.827 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:58:49 np0005532585.localdomain ceph-mon[300199]: pgmap v111: 177 pgs: 177 active+clean; 324 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 6.1 MiB/s rd, 4.9 MiB/s wr, 351 op/s
Nov 23 09:58:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:49.507 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891929.5066593, 8f62292f-5719-4b19-9188-3715b94493a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:58:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:49.508 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] VM Resumed (Lifecycle Event)
Nov 23 09:58:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:49.660 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:58:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:49.663 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:58:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:49.681 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] During the sync_power process the instance has moved from host np0005532584.localdomain to host np0005532585.localdomain
Nov 23 09:58:49 np0005532585.localdomain sshd[312047]: Received disconnect from 172.17.0.106 port 48704:11: disconnected by user
Nov 23 09:58:49 np0005532585.localdomain sshd[312047]: Disconnected from user nova 172.17.0.106 port 48704
Nov 23 09:58:49 np0005532585.localdomain sshd[312028]: pam_unix(sshd:session): session closed for user nova
Nov 23 09:58:49 np0005532585.localdomain systemd[1]: session-72.scope: Deactivated successfully.
Nov 23 09:58:49 np0005532585.localdomain systemd-logind[761]: Session 72 logged out. Waiting for processes to exit.
Nov 23 09:58:49 np0005532585.localdomain systemd-logind[761]: Removed session 72.
Nov 23 09:58:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:50.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:50.976 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:51.223 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:51 np0005532585.localdomain ceph-mon[300199]: pgmap v112: 177 pgs: 177 active+clean; 324 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 3.1 MiB/s wr, 159 op/s
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00113|binding|INFO|Claiming lport 737e82a6-2634-47df-b8a7-ec21a927cc3f for this chassis.
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00114|binding|INFO|737e82a6-2634-47df-b8a7-ec21a927cc3f: Claiming fa:16:3e:da:21:74 10.100.0.10
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00115|binding|INFO|Claiming lport fd30dda9-c731-47dd-b319-ebcca717b708 for this chassis.
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00116|binding|INFO|fd30dda9-c731-47dd-b319-ebcca717b708: Claiming fa:16:3e:4f:95:ad 19.80.0.95
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00117|binding|INFO|Setting lport 737e82a6-2634-47df-b8a7-ec21a927cc3f up in Southbound
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00118|binding|INFO|Setting lport fd30dda9-c731-47dd-b319-ebcca717b708 up in Southbound
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.517 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:95:ad 19.80.0.95'], port_security=['fa:16:3e:4f:95:ad 19.80.0.95'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['737e82a6-2634-47df-b8a7-ec21a927cc3f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1587702031', 'neutron:cidrs': '19.80.0.95/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1587702031', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fd30dda9-c731-47dd-b319-ebcca717b708) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.521 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:21:74 10.100.0.10'], port_security=['fa:16:3e:da:21:74 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1925970765', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8f62292f-5719-4b19-9188-3715b94493a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d679e465-8656-4403-afa0-724657d33ec4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1925970765', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532584.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1a90e812-f218-49cd-a3ab-6bc1317ad730, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=737e82a6-2634-47df-b8a7-ec21a927cc3f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.523 160439 INFO neutron.agent.ovn.metadata.agent [-] Port fd30dda9-c731-47dd-b319-ebcca717b708 in datapath 903951dd-448c-4453-aa24-f24a53269074 bound to our chassis
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.528 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5a66dd4e-4808-453a-ba83-842df44989df IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.528 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 903951dd-448c-4453-aa24-f24a53269074
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.542 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b0940f3d-b7dd-4868-a864-db87aa764a12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.543 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap903951dd-41 in ovnmeta-903951dd-448c-4453-aa24-f24a53269074 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.545 160542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap903951dd-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.546 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[912bc28a-d0f3-4383-9de2-c48ad634f2b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.547 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb0ad20-1870-4ea7-b074-9ade84b42341]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.571 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[eb03155e-1f25-4173-81a6-13183f7a2cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.589 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f99fbc86-e079-4312-9bfc-4a028e12e6f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.622 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9e9d60-f443-4d7c-837f-8ee2232c61b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.629 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfb78d2-3f36-400a-9281-d625b1d840f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891931.6352] manager: (tap903951dd-40): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Nov 23 09:58:51 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:51.651 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-e0ec21a6-2619-4364-8f01-ea196be44fa4 req-03d7f78c-5d6d-441d-aa9e-d4756ed73a5d 73d8249924dd406db12ad13a4ddb31a1 758f3043280349e086a85b86f2668848 - - default default] This port is not SRIOV, skip binding for port 737e82a6-2634-47df-b8a7-ec21a927cc3f.
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.674 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[8630cf05-67eb-4c2e-8c61-da06debb00c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.678 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[3d48d7d6-1f27-453a-92e4-50077fcac4d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap903951dd-41: link becomes ready
Nov 23 09:58:51 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap903951dd-40: link becomes ready
Nov 23 09:58:51 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891931.7057] device (tap903951dd-40): carrier: link connected
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.715 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb91685-fc04-4320-8336-e72d33065122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.736 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f53c4354-5748-47e8-a42a-06789a604782]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap903951dd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:47:c5:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187481, 'reachable_time': 41132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312141, 'error': None, 'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.758 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8f85d0-d594-4763-a117-af6cfcf34260]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:c591'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1187481, 'tstamp': 1187481}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312142, 'error': None, 'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:51.771 281956 INFO nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Post operation of migration started
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.780 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc3506f-729f-4bed-b8ae-ef5d28cb4534]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap903951dd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:47:c5:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187481, 'reachable_time': 41132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312143, 'error': None, 'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.818 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6caeb36c-16b5-4089-a29a-92643a86edca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.895 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ee42f3da-33c0-4311-8162-91964fcda884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.897 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap903951dd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.898 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.899 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap903951dd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:51 np0005532585.localdomain kernel: device tap903951dd-40 entered promiscuous mode
Nov 23 09:58:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:51.907 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.912 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap903951dd-40, col_values=(('external_ids', {'iface-id': 'b83bb60d-d579-4f8d-9e2c-3885d238bb26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:51Z|00119|binding|INFO|Releasing lport b83bb60d-d579-4f8d-9e2c-3885d238bb26 from this chassis (sb_readonly=0)
Nov 23 09:58:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:51.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.919 160439 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/903951dd-448c-4453-aa24-f24a53269074.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/903951dd-448c-4453-aa24-f24a53269074.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.920 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e94265-500a-4b94-bef1-0cf1ac358aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.921 160439 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: global
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     log         /dev/log local0 debug
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     log-tag     haproxy-metadata-proxy-903951dd-448c-4453-aa24-f24a53269074
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     user        root
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     group       root
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     maxconn     1024
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     pidfile     /var/lib/neutron/external/pids/903951dd-448c-4453-aa24-f24a53269074.pid.haproxy
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     daemon
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: defaults
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     log global
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     mode http
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     option httplog
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     option dontlognull
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     option http-server-close
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     option forwardfor
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     retries                 3
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout http-request    30s
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout connect         30s
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout client          32s
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout server          32s
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout http-keep-alive 30s
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: listen listener
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     bind 169.254.169.254:80
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     server metadata /var/lib/neutron/metadata_proxy
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:     http-request add-header X-OVN-Network-ID 903951dd-448c-4453-aa24-f24a53269074
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 23 09:58:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:51.922 160439 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'env', 'PROCESS_TAG=haproxy-903951dd-448c-4453-aa24-f24a53269074', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/903951dd-448c-4453-aa24-f24a53269074.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 23 09:58:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:51.925 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.171 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.171 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquired lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.172 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 09:58:52 np0005532585.localdomain podman[312176]: 
Nov 23 09:58:52 np0005532585.localdomain podman[312176]: 2025-11-23 09:58:52.526321151 +0000 UTC m=+0.103009596 container create 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:58:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:58:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:58:52 np0005532585.localdomain podman[312176]: 2025-11-23 09:58:52.475092266 +0000 UTC m=+0.051780801 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:58:52 np0005532585.localdomain systemd[1]: Started libpod-conmon-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797.scope.
Nov 23 09:58:52 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:58:52 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79231015dd8393ee753048cdacd0bc727bddc8892a2b6bc6b9a822a34427453e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:58:52 np0005532585.localdomain podman[312176]: 2025-11-23 09:58:52.631071838 +0000 UTC m=+0.207760243 container init 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:52 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE]   (312212) : New worker (312223) forked
Nov 23 09:58:52 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE]   (312212) : Loading success.
Nov 23 09:58:52 np0005532585.localdomain podman[312191]: 2025-11-23 09:58:52.709594526 +0000 UTC m=+0.114378854 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 09:58:52 np0005532585.localdomain podman[312191]: 2025-11-23 09:58:52.72315992 +0000 UTC m=+0.127944278 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:58:52 np0005532585.localdomain podman[312190]: 2025-11-23 09:58:52.692294118 +0000 UTC m=+0.115254960 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:58:52 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:58:52 np0005532585.localdomain podman[312176]: 2025-11-23 09:58:52.74641564 +0000 UTC m=+0.323104035 container start 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.800 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 737e82a6-2634-47df-b8a7-ec21a927cc3f in datapath d679e465-8656-4403-afa0-724657d33ec4 unbound from our chassis
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.807 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d679e465-8656-4403-afa0-724657d33ec4
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.818 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4608bc7c-318a-4ef9-8571-5f799fb34b86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.819 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd679e465-81 in ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.821 160542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd679e465-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.821 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb27d9e-4ece-4195-b536-3d3fa20be1b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.822 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d877bc8f-d1bf-4c8c-869f-450af634f172]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain podman[312190]: 2025-11-23 09:58:52.827488095 +0000 UTC m=+0.250448927 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.831 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[c64901c1-fdf5-4b41-bb9f-a24ae1a3b360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.845 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[bb86428e-46b2-4268-a9c8-50a9cb485c92]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.876 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd89133-7970-45b4-8fdb-0fac505a397d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.883 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[77ce8a1e-a250-4867-bf0f-2228fbfe2645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain systemd-udevd[312134]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:58:52 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891932.8864] manager: (tapd679e465-80): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.895 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Updating instance_info_cache with network_info: [{"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.915 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Releasing lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.927 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[572f409b-5976-4b66-a59c-675cec450fbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.930 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[0cedfb8c-7dbd-4866-99fb-e229d8bd44ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.931 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.932 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.932 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:52.940 281956 INFO nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 23 09:58:52 np0005532585.localdomain virtqemud[203731]: Domain id=4 name='instance-0000000a' uuid=8f62292f-5719-4b19-9188-3715b94493a7 is tainted: custom-monitor
Nov 23 09:58:52 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891932.9655] device (tapd679e465-80): carrier: link connected
Nov 23 09:58:52 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapd679e465-81: link becomes ready
Nov 23 09:58:52 np0005532585.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapd679e465-80: link becomes ready
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.972 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[e86bd64d-45ca-4cac-8d4c-cc0bcb14c9af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:52.990 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[60aea4e3-13fb-4447-99dc-847bb732512f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd679e465-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:a8:02:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187607, 'reachable_time': 41578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312256, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.006 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c677d5-5f84-4a2c-9a10-efc24ffb8f73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:218'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1187607, 'tstamp': 1187607}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312257, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.024 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[81fe0e7b-95dc-4f28-9aa1-c848ab150578]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd679e465-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:a8:02:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187607, 'reachable_time': 41578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312258, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.062 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[eb03534f-bf3f-4b2c-ae72-0586e54a94f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.135 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b9aa55a2-e026-4086-adb8-a6e558c272cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.136 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd679e465-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.137 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.137 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd679e465-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:53 np0005532585.localdomain kernel: device tapd679e465-80 entered promiscuous mode
Nov 23 09:58:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:53.140 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.148 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd679e465-80, col_values=(('external_ids', {'iface-id': '9b50ca15-3b72-42c0-998b-33441ea57460'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:53.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:53Z|00120|binding|INFO|Releasing lport 9b50ca15-3b72-42c0-998b-33441ea57460 from this chassis (sb_readonly=0)
Nov 23 09:58:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:53.154 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.155 160439 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d679e465-8656-4403-afa0-724657d33ec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d679e465-8656-4403-afa0-724657d33ec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.157 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[01493be9-5832-4a19-bfd0-5adb629d48d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.157 160439 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: global
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     log         /dev/log local0 debug
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     log-tag     haproxy-metadata-proxy-d679e465-8656-4403-afa0-724657d33ec4
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     user        root
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     group       root
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     maxconn     1024
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     pidfile     /var/lib/neutron/external/pids/d679e465-8656-4403-afa0-724657d33ec4.pid.haproxy
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     daemon
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: defaults
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     log global
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     mode http
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     option httplog
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     option dontlognull
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     option http-server-close
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     option forwardfor
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     retries                 3
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout http-request    30s
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout connect         30s
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout client          32s
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout server          32s
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     timeout http-keep-alive 30s
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: listen listener
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     bind 169.254.169.254:80
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     server metadata /var/lib/neutron/metadata_proxy
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:     http-request add-header X-OVN-Network-ID d679e465-8656-4403-afa0-724657d33ec4
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 23 09:58:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:53.158 160439 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'env', 'PROCESS_TAG=haproxy-d679e465-8656-4403-afa0-724657d33ec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d679e465-8656-4403-afa0-724657d33ec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 23 09:58:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:53.163 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:53 np0005532585.localdomain ceph-mon[300199]: pgmap v113: 177 pgs: 177 active+clean; 324 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 3.1 MiB/s wr, 159 op/s
Nov 23 09:58:53 np0005532585.localdomain podman[312291]: 
Nov 23 09:58:53 np0005532585.localdomain podman[312291]: 2025-11-23 09:58:53.639190016 +0000 UTC m=+0.099888411 container create b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:53 np0005532585.localdomain systemd[1]: Started libpod-conmon-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34.scope.
Nov 23 09:58:53 np0005532585.localdomain podman[312291]: 2025-11-23 09:58:53.590069056 +0000 UTC m=+0.050767501 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 09:58:53 np0005532585.localdomain systemd[1]: tmp-crun.zegw9E.mount: Deactivated successfully.
Nov 23 09:58:53 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:58:53 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3ad05ab8f11dd49e19cdf6059cf6f5ce8d04692e252d6615b26d979227f460/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:58:53 np0005532585.localdomain podman[312291]: 2025-11-23 09:58:53.72647264 +0000 UTC m=+0.187171035 container init b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:53 np0005532585.localdomain podman[312291]: 2025-11-23 09:58:53.749019258 +0000 UTC m=+0.209717653 container start b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 09:58:53 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE]   (312309) : New worker (312311) forked
Nov 23 09:58:53 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE]   (312309) : Loading success.
Nov 23 09:58:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:53.952 281956 INFO nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 23 09:58:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:54Z|00121|binding|INFO|Releasing lport 9b50ca15-3b72-42c0-998b-33441ea57460 from this chassis (sb_readonly=0)
Nov 23 09:58:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:54Z|00122|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:58:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:54Z|00123|binding|INFO|Releasing lport b83bb60d-d579-4f8d-9e2c-3885d238bb26 from this chassis (sb_readonly=0)
Nov 23 09:58:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:54.300 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:58:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:58:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:54.959 281956 INFO nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 23 09:58:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:54.965 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:58:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:54.988 281956 DEBUG nova.objects.instance [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.065 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.066 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.066 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.067 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.067 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.068 281956 WARNING nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received unexpected event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with vm_state active and task_state None.
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.068 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.068 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.069 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.069 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.070 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.070 281956 WARNING nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received unexpected event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with vm_state active and task_state None.
Nov 23 09:58:55 np0005532585.localdomain ceph-mon[300199]: pgmap v114: 177 pgs: 177 active+clean; 359 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 5.1 MiB/s wr, 222 op/s
Nov 23 09:58:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:55.448 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.122 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.123 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.123 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.124 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.124 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.126 281956 INFO nova.compute.manager [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Terminating instance
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.128 281956 DEBUG nova.compute.manager [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 23 09:58:56 np0005532585.localdomain kernel: device tap737e82a6-26 left promiscuous mode
Nov 23 09:58:56 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891936.1896] device (tap737e82a6-26): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00124|binding|INFO|Releasing lport 737e82a6-2634-47df-b8a7-ec21a927cc3f from this chassis (sb_readonly=0)
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00125|binding|INFO|Setting lport 737e82a6-2634-47df-b8a7-ec21a927cc3f down in Southbound
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00126|binding|INFO|Releasing lport fd30dda9-c731-47dd-b319-ebcca717b708 from this chassis (sb_readonly=0)
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00127|binding|INFO|Setting lport fd30dda9-c731-47dd-b319-ebcca717b708 down in Southbound
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00128|binding|INFO|Removing iface tap737e82a6-26 ovn-installed in OVS
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00129|binding|INFO|Releasing lport 9b50ca15-3b72-42c0-998b-33441ea57460 from this chassis (sb_readonly=0)
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00130|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:58:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:56Z|00131|binding|INFO|Releasing lport b83bb60d-d579-4f8d-9e2c-3885d238bb26 from this chassis (sb_readonly=0)
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.207 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:95:ad 19.80.0.95'], port_security=['fa:16:3e:4f:95:ad 19.80.0.95'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['737e82a6-2634-47df-b8a7-ec21a927cc3f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1587702031', 'neutron:cidrs': '19.80.0.95/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1587702031', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fd30dda9-c731-47dd-b319-ebcca717b708) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.210 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:21:74 10.100.0.10'], port_security=['fa:16:3e:da:21:74 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1925970765', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8f62292f-5719-4b19-9188-3715b94493a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d679e465-8656-4403-afa0-724657d33ec4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1925970765', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532584.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1a90e812-f218-49cd-a3ab-6bc1317ad730, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=737e82a6-2634-47df-b8a7-ec21a927cc3f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.212 160439 INFO neutron.agent.ovn.metadata.agent [-] Port fd30dda9-c731-47dd-b319-ebcca717b708 in datapath 903951dd-448c-4453-aa24-f24a53269074 unbound from our chassis
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.215 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5a66dd4e-4808-453a-ba83-842df44989df IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.215 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 903951dd-448c-4453-aa24-f24a53269074, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.216 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5045b2-aabf-4dbc-8755-2ab6c5f1c4de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.217 160439 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-903951dd-448c-4453-aa24-f24a53269074 namespace which is not needed anymore
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.230 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 23 09:58:56 np0005532585.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 5.790s CPU time.
Nov 23 09:58:56 np0005532585.localdomain systemd-machined[84275]: Machine qemu-4-instance-0000000a terminated.
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.370 281956 INFO nova.virt.libvirt.driver [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Instance destroyed successfully.
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.370 281956 DEBUG nova.objects.instance [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lazy-loading 'resources' on Instance uuid 8f62292f-5719-4b19-9188-3715b94493a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.382 281956 DEBUG nova.virt.libvirt.vif [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-23T09:58:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1576780525',display_name='tempest-LiveMigrationTest-server-1576780525',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005532585.localdomain',hostname='tempest-livemigrationtest-server-1576780525',id=10,image_ref='c5806483-57a8-4254-b41b-254b888c8606',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-23T09:58:41Z,launched_on='np0005532584.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='253c88568a634476a6c1284eed6a9464',ramdisk_id='',reservation_id='r-dvg5v145',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c5806483-57a8-4254-b41b-254b888c8606',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1889456510',owner_user_name='tempest-LiveMigrationTest-1889456510-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T09:58:55Z,user_data=None,user_id='7f7875c0084c46fdb2e7b37e4fc44faf',uuid=8f62292f-5719-4b19-9188-3715b94493a7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.383 281956 DEBUG nova.network.os_vif_util [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Converting VIF {"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.384 281956 DEBUG nova.network.os_vif_util [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.384 281956 DEBUG os_vif [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.386 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.387 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap737e82a6-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.392 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.395 281956 INFO os_vif [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26')
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE]   (312212) : haproxy version is 2.8.14-c23fe91
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE]   (312212) : path to executable is /usr/sbin/haproxy
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [WARNING]  (312212) : Exiting Master process...
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [WARNING]  (312212) : Exiting Master process...
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [ALERT]    (312212) : Current worker (312223) exited with code 143 (Terminated)
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [WARNING]  (312212) : All workers exited. Exiting... (0)
Nov 23 09:58:56 np0005532585.localdomain systemd[1]: libpod-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797.scope: Deactivated successfully.
Nov 23 09:58:56 np0005532585.localdomain podman[312343]: 2025-11-23 09:58:56.426605923 +0000 UTC m=+0.085163990 container died 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 09:58:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3976893038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:56 np0005532585.localdomain podman[312343]: 2025-11-23 09:58:56.47467325 +0000 UTC m=+0.133231287 container cleanup 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:56 np0005532585.localdomain podman[312381]: 2025-11-23 09:58:56.522135179 +0000 UTC m=+0.086439760 container cleanup 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:58:56 np0005532585.localdomain systemd[1]: libpod-conmon-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797.scope: Deactivated successfully.
Nov 23 09:58:56 np0005532585.localdomain podman[312397]: 2025-11-23 09:58:56.574062585 +0000 UTC m=+0.077994942 container remove 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.583 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0c9bfab0-9338-4659-9da1-fecd14ec8c03]: (4, ('Sun Nov 23 09:58:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074 (6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797)\n6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797\nSun Nov 23 09:58:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074 (6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797)\n6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.586 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6bb5e1-7605-4afb-af8c-71332c2d4774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.588 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap903951dd-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.630 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain kernel: device tap903951dd-40 left promiscuous mode
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.645 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[96d712d3-34f2-4e8e-9bbd-0f2c55e0d4b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.663 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c798e4-4397-49d7-b5db-565084c6983f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.664 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[67e91309-f904-4333-91b3-4703744d4638]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.681 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b6523cdd-9bac-4635-b536-73335f5785a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187472, 'reachable_time': 38242, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312418, 'error': None, 'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.686 160573 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-903951dd-448c-4453-aa24-f24a53269074 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.686 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ec97f4-fd75-48ef-8d8f-d68384ddd09f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.687 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 737e82a6-2634-47df-b8a7-ec21a927cc3f in datapath d679e465-8656-4403-afa0-724657d33ec4 unbound from our chassis
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.689 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d679e465-8656-4403-afa0-724657d33ec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.690 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[39403141-b69a-455c-8c31-767a58302452]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.691 160439 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 namespace which is not needed anymore
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE]   (312309) : haproxy version is 2.8.14-c23fe91
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE]   (312309) : path to executable is /usr/sbin/haproxy
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [WARNING]  (312309) : Exiting Master process...
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [ALERT]    (312309) : Current worker (312311) exited with code 143 (Terminated)
Nov 23 09:58:56 np0005532585.localdomain neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [WARNING]  (312309) : All workers exited. Exiting... (0)
Nov 23 09:58:56 np0005532585.localdomain systemd[1]: libpod-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34.scope: Deactivated successfully.
Nov 23 09:58:56 np0005532585.localdomain podman[312434]: 2025-11-23 09:58:56.844740918 +0000 UTC m=+0.058306511 container died b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 09:58:56 np0005532585.localdomain podman[312434]: 2025-11-23 09:58:56.87593372 +0000 UTC m=+0.089499303 container cleanup b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 09:58:56 np0005532585.localdomain podman[312448]: 2025-11-23 09:58:56.888504565 +0000 UTC m=+0.045660905 container cleanup b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:58:56 np0005532585.localdomain systemd[1]: libpod-conmon-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34.scope: Deactivated successfully.
Nov 23 09:58:56 np0005532585.localdomain podman[312465]: 2025-11-23 09:58:56.944271637 +0000 UTC m=+0.055180996 container remove b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.948 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ee19c034-3c99-4a34-b1a0-cde714f50a78]: (4, ('Sun Nov 23 09:58:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 (b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34)\nb1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34\nSun Nov 23 09:58:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 (b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34)\nb1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.950 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[701bae2d-641d-4272-9ec1-08c28dd92bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.950 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd679e465-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.952 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain kernel: device tapd679e465-80 left promiscuous mode
Nov 23 09:58:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:56.956 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.960 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8f255f-8b68-49a1-8d46-7887819f4f4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.978 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c4be31-e14f-4da5-89ad-7af0b6857f88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.979 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[15063fe0-7228-4090-aae1-94c651ca5a91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.994 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[066a7b20-73be-4c51-ab2f-f85b26c4ebd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187597, 'reachable_time': 41009, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312482, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.996 160573 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 23 09:58:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:56.996 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[ef80d3b0-40b9-4127-bcec-bb60a1f71634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.096 281956 INFO nova.virt.libvirt.driver [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Deleting instance files /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7_del
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.097 281956 INFO nova.virt.libvirt.driver [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Deletion of /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7_del complete
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.142 281956 INFO nova.compute.manager [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Took 1.01 seconds to destroy the instance on the hypervisor.
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.143 281956 DEBUG oslo.service.loopingcall [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.143 281956 DEBUG nova.compute.manager [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.144 281956 DEBUG nova.network.neutron [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.161 281956 DEBUG nova.compute.manager [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-unplugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.161 281956 DEBUG oslo_concurrency.lockutils [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.162 281956 DEBUG oslo_concurrency.lockutils [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.162 281956 DEBUG oslo_concurrency.lockutils [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.163 281956 DEBUG nova.compute.manager [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-unplugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:58:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:57.163 281956 DEBUG nova.compute.manager [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-unplugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 23 09:58:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7b3ad05ab8f11dd49e19cdf6059cf6f5ce8d04692e252d6615b26d979227f460-merged.mount: Deactivated successfully.
Nov 23 09:58:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34-userdata-shm.mount: Deactivated successfully.
Nov 23 09:58:57 np0005532585.localdomain systemd[1]: run-netns-ovnmeta\x2dd679e465\x2d8656\x2d4403\x2dafa0\x2d724657d33ec4.mount: Deactivated successfully.
Nov 23 09:58:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-79231015dd8393ee753048cdacd0bc727bddc8892a2b6bc6b9a822a34427453e-merged.mount: Deactivated successfully.
Nov 23 09:58:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797-userdata-shm.mount: Deactivated successfully.
Nov 23 09:58:57 np0005532585.localdomain systemd[1]: run-netns-ovnmeta\x2d903951dd\x2d448c\x2d4453\x2daa24\x2df24a53269074.mount: Deactivated successfully.
Nov 23 09:58:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1794950639' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:58:57 np0005532585.localdomain ceph-mon[300199]: pgmap v115: 177 pgs: 177 active+clean; 359 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 707 KiB/s rd, 5.1 MiB/s wr, 148 op/s
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.213 281956 DEBUG nova.network.neutron [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.234 281956 INFO nova.compute.manager [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Took 1.09 seconds to deallocate network for instance.
Nov 23 09:58:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.288 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.289 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.291 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.336 281956 INFO nova.scheduler.client.report [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Deleted allocations for instance 8f62292f-5719-4b19-9188-3715b94493a7
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.405 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:58.503 281956 DEBUG nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 23 09:58:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:58:59.125 2 INFO neutron.agent.securitygroups_rpc [None req-f36f5a6d-ca31-44d9-bac1-0308580f3e95 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.222 281956 DEBUG nova.compute.manager [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.223 281956 DEBUG oslo_concurrency.lockutils [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.223 281956 DEBUG oslo_concurrency.lockutils [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.224 281956 DEBUG oslo_concurrency.lockutils [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.224 281956 DEBUG nova.compute.manager [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.225 281956 WARNING nova.compute.manager [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received unexpected event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with vm_state deleted and task_state None.
Nov 23 09:58:59 np0005532585.localdomain dnsmasq[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/addn_hosts - 0 addresses
Nov 23 09:58:59 np0005532585.localdomain dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/host
Nov 23 09:58:59 np0005532585.localdomain dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/opts
Nov 23 09:58:59 np0005532585.localdomain podman[312500]: 2025-11-23 09:58:59.393776298 +0000 UTC m=+0.064487159 container kill d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:58:59 np0005532585.localdomain ceph-mon[300199]: pgmap v116: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 1.0 MiB/s rd, 6.4 MiB/s wr, 224 op/s
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: tmp-crun.HTIyLm.mount: Deactivated successfully.
Nov 23 09:58:59 np0005532585.localdomain dnsmasq[311299]: exiting on receipt of SIGTERM
Nov 23 09:58:59 np0005532585.localdomain podman[312537]: 2025-11-23 09:58:59.83307353 +0000 UTC m=+0.069973367 container kill d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: Stopping User Manager for UID 42436...
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: libpod-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814.scope: Deactivated successfully.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Activating special unit Exit the Session...
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped target Main User Target.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped target Basic System.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped target Paths.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped target Sockets.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped target Timers.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Closed D-Bus User Message Bus Socket.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Stopped Create User's Volatile Files and Directories.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Removed slice User Application Slice.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Reached target Shutdown.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Finished Exit the Session.
Nov 23 09:58:59 np0005532585.localdomain systemd[312032]: Reached target Exit the Session.
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: Stopped User Manager for UID 42436.
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Nov 23 09:58:59 np0005532585.localdomain podman[312551]: 2025-11-23 09:58:59.916065574 +0000 UTC m=+0.065657976 container died d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 09:58:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:59.927 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5a66dd4e-4808-453a-ba83-842df44989df with type ""
Nov 23 09:58:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:59.928 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-903951dd-448c-4453-aa24-f24a53269074', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ca98d0dd-231a-46c7-80b8-a48c00a5696e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:58:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:59.929 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ca98d0dd-231a-46c7-80b8-a48c00a5696e in datapath 903951dd-448c-4453-aa24-f24a53269074 unbound from our chassis
Nov 23 09:58:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:59.930 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 903951dd-448c-4453-aa24-f24a53269074, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:58:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:58:59.931 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4efafa-4807-4f80-aafd-5cc9e4866fa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:58:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:59Z|00132|binding|INFO|Removing iface tapca98d0dd-23 ovn-installed in OVS
Nov 23 09:58:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:58:59Z|00133|binding|INFO|Removing lport ca98d0dd-231a-46c7-80b8-a48c00a5696e ovn-installed in OVS
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.934 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:58:59.938 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:58:59 np0005532585.localdomain podman[312551]: 2025-11-23 09:58:59.944028168 +0000 UTC m=+0.093620490 container cleanup d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:58:59 np0005532585.localdomain systemd[1]: libpod-conmon-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814.scope: Deactivated successfully.
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:58:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:58:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:59:00 np0005532585.localdomain podman[312553]: 2025-11-23 09:59:00.003825253 +0000 UTC m=+0.144643277 container remove d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 09:59:00 np0005532585.localdomain kernel: device tapca98d0dd-23 left promiscuous mode
Nov 23 09:59:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:00.021 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:00.038 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:00.056 263258 INFO neutron.agent.dhcp.agent [None req-0b758ae4-6a63-4b17-bf24-2157967c06ff - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:00.180 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-50b27a1dbab67616f24ed7868948ddd418eee50493d781bdc989237aff316f07-merged.mount: Deactivated successfully.
Nov 23 09:59:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:00 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d903951dd\x2d448c\x2d4453\x2daa24\x2df24a53269074.mount: Deactivated successfully.
Nov 23 09:59:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:00Z|00134|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:00.442 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2510104653' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 09:59:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2510104653' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 09:59:00 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:00.644 2 INFO neutron.agent.securitygroups_rpc [None req-2e659d4a-74ef-46b7-bd3b-2baf8d6d13fe 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']
Nov 23 09:59:00 np0005532585.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 23 09:59:00 np0005532585.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 15.918s CPU time.
Nov 23 09:59:00 np0005532585.localdomain systemd-machined[84275]: Machine qemu-3-instance-00000009 terminated.
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:01 np0005532585.localdomain ceph-mon[300199]: pgmap v117: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 655 KiB/s rd, 3.4 MiB/s wr, 139 op/s
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.521 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance shutdown successfully after 24 seconds.
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.528 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.529 281956 DEBUG nova.objects.instance [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.601 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Beginning cold snapshot process
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.828 281956 DEBUG nova.virt.libvirt.imagebackend [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] No parent info for c5806483-57a8-4254-b41b-254b888c8606; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Nov 23 09:59:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:01.873 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(d341425897f7472cb10ea988db862e04) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 23 09:59:01 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:59:02 np0005532585.localdomain podman[312636]: 2025-11-23 09:59:02.033243719 +0000 UTC m=+0.092476254 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 09:59:02 np0005532585.localdomain podman[312636]: 2025-11-23 09:59:02.052412754 +0000 UTC m=+0.111645299 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 09:59:02 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:59:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:02.135 263258 INFO neutron.agent.linux.ip_lib [None req-b12e4ae5-1c19-461a-a514-e1d88c3b8373 - - - - - -] Device tapb7d31e03-f6 cannot be used as it has no MAC address
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:02 np0005532585.localdomain kernel: device tapb7d31e03-f6 entered promiscuous mode
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:02 np0005532585.localdomain systemd-udevd[312581]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:59:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891942.1640] manager: (tapb7d31e03-f6): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Nov 23 09:59:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:02Z|00135|binding|INFO|Claiming lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 for this chassis.
Nov 23 09:59:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:02Z|00136|binding|INFO|b7d31e03-f6de-49f6-a46f-b6861bfb0ba8: Claiming unknown
Nov 23 09:59:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:02.178 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a693c1f03094401b2a83bfa038e2d85', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71365ad6-1587-4285-8187-e9f4a0e26a00, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b7d31e03-f6de-49f6-a46f-b6861bfb0ba8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:02.180 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 in datapath 0a868746-0c5d-4cb5-b569-e1ea427d7eaf bound to our chassis
Nov 23 09:59:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:02.183 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 25c5c7de-1918-4d73-bfc0-bdb457d5d80e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:59:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:02.183 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:59:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:02.184 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d4b6e8-58f5-4439-95d5-5d3b40bfc62c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:02Z|00137|binding|INFO|Setting lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 ovn-installed in OVS
Nov 23 09:59:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:02Z|00138|binding|INFO|Setting lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 up in Southbound
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.199 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.247 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.277 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e93 e93: 6 total, 6 up, 6 in
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.580 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] cloning vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk@d341425897f7472cb10ea988db862e04 to images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 23 09:59:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:02.767 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] flattening images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 23 09:59:03 np0005532585.localdomain podman[312790]: 
Nov 23 09:59:03 np0005532585.localdomain podman[312790]: 2025-11-23 09:59:03.12991842 +0000 UTC m=+0.080917341 container create 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:59:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:59:03 np0005532585.localdomain systemd[1]: Started libpod-conmon-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720.scope.
Nov 23 09:59:03 np0005532585.localdomain podman[312790]: 2025-11-23 09:59:03.099001676 +0000 UTC m=+0.050000637 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:59:03 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:59:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e82ddd04bc2b03081d5a523028e9859714aedc10be59a8137ed30fe358e9a28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:59:03 np0005532585.localdomain podman[312790]: 2025-11-23 09:59:03.218144413 +0000 UTC m=+0.169143334 container init 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:03 np0005532585.localdomain podman[312790]: 2025-11-23 09:59:03.227132288 +0000 UTC m=+0.178131199 container start 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: started, version 2.85 cachesize 150
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: DNS service limited to local subnets
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: warning: no upstream servers configured
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 0 addresses
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 09:59:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.312 263258 INFO neutron.agent.dhcp.agent [None req-f761acdd-2a00-45e0-a2f9-c6109c814a22 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:01Z, description=, device_id=8074bdc2-a02c-4214-ad81-7c41b83201d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b88c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6fa0>], id=a3dd1070-8964-4625-913d-1ee38ef5dacb, ip_allocation=immediate, mac_address=fa:16:3e:8b:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:59Z, description=, dns_domain=, id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-789670611, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=799, status=ACTIVE, subnets=['5b3a211b-df16-45c2-b823-d3f707dcbf68'], tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:00Z, vlan_transparent=None, network_id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, port_security_enabled=False, project_id=2a693c1f03094401b2a83bfa038e2d85, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=821, status=DOWN, tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:01Z on network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf
Nov 23 09:59:03 np0005532585.localdomain podman[312803]: 2025-11-23 09:59:03.319034713 +0000 UTC m=+0.137815268 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:59:03 np0005532585.localdomain podman[312803]: 2025-11-23 09:59:03.36148976 +0000 UTC m=+0.180270345 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 09:59:03 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:59:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.392 263258 INFO neutron.agent.dhcp.agent [None req-ba197019-39c6-4712-9ab1-e6e9bcda8962 - - - - - -] DHCP configuration for ports {'8b5b0b62-f6a4-474d-8fa1-6c8b82d9241f'} is completed
Nov 23 09:59:03 np0005532585.localdomain ceph-mon[300199]: osdmap e93: 6 total, 6 up, 6 in
Nov 23 09:59:03 np0005532585.localdomain ceph-mon[300199]: pgmap v119: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 786 KiB/s rd, 4.0 MiB/s wr, 166 op/s
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 1 addresses
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 09:59:03 np0005532585.localdomain podman[312846]: 2025-11-23 09:59:03.549325935 +0000 UTC m=+0.067727480 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 09:59:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:03.672 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] removing snapshot(d341425897f7472cb10ea988db862e04) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Nov 23 09:59:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.715 263258 INFO neutron.agent.dhcp.agent [None req-2202ad04-66a5-45ae-92b7-eb2ecb293a2c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:01Z, description=, device_id=8074bdc2-a02c-4214-ad81-7c41b83201d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b8e430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b8eee0>], id=a3dd1070-8964-4625-913d-1ee38ef5dacb, ip_allocation=immediate, mac_address=fa:16:3e:8b:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:59Z, description=, dns_domain=, id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-789670611, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=799, status=ACTIVE, subnets=['5b3a211b-df16-45c2-b823-d3f707dcbf68'], tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:00Z, vlan_transparent=None, network_id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, port_security_enabled=False, project_id=2a693c1f03094401b2a83bfa038e2d85, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=821, status=DOWN, tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:01Z on network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf
Nov 23 09:59:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.833 263258 INFO neutron.agent.dhcp.agent [None req-dd0ce42d-cbc2-4637-a8fc-03534d4f8634 - - - - - -] DHCP configuration for ports {'a3dd1070-8964-4625-913d-1ee38ef5dacb'} is completed
Nov 23 09:59:03 np0005532585.localdomain dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 1 addresses
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 09:59:03 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 09:59:03 np0005532585.localdomain podman[312904]: 2025-11-23 09:59:03.936767453 +0000 UTC m=+0.059503818 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:59:03 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:03.946 2 INFO neutron.agent.securitygroups_rpc [None req-9d56fe05-4ef8-4d55-837b-9cee7fc5dad7 4b677b000abe4b0687ff1afcd1016893 2a693c1f03094401b2a83bfa038e2d85 - - default default] Security group member updated ['e11e3507-78f9-4b55-80fe-2aa7bb5d486d']
Nov 23 09:59:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:04.173 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9440940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9440df0>], id=f2b26721-5c09-4af3-8bc2-da22f2612b11, ip_allocation=immediate, mac_address=fa:16:3e:54:fd:3f, name=tempest-FloatingIPNegativeTestJSON-1148406642, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:59Z, description=, dns_domain=, id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-789670611, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=799, status=ACTIVE, subnets=['5b3a211b-df16-45c2-b823-d3f707dcbf68'], tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:00Z, vlan_transparent=None, network_id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['e11e3507-78f9-4b55-80fe-2aa7bb5d486d'], standard_attr_id=832, status=DOWN, tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:03Z on network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf
Nov 23 09:59:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:04.194 263258 INFO neutron.agent.dhcp.agent [None req-89bdf3a4-76b0-403c-a197-85b4954cb932 - - - - - -] DHCP configuration for ports {'a3dd1070-8964-4625-913d-1ee38ef5dacb'} is completed
Nov 23 09:59:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:04Z|00139|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:04.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:04 np0005532585.localdomain dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 2 addresses
Nov 23 09:59:04 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 09:59:04 np0005532585.localdomain podman[312941]: 2025-11-23 09:59:04.41065974 +0000 UTC m=+0.061223440 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:04 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 09:59:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e94 e94: 6 total, 6 up, 6 in
Nov 23 09:59:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:04.607 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(snap) on rbd image(7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 23 09:59:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:04.691 263258 INFO neutron.agent.dhcp.agent [None req-42b58f33-73e3-4e72-b723-d8658e402274 - - - - - -] DHCP configuration for ports {'f2b26721-5c09-4af3-8bc2-da22f2612b11'} is completed
Nov 23 09:59:05 np0005532585.localdomain ceph-mon[300199]: osdmap e94: 6 total, 6 up, 6 in
Nov 23 09:59:05 np0005532585.localdomain ceph-mon[300199]: pgmap v121: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 6.3 MiB/s rd, 7.8 MiB/s wr, 217 op/s
Nov 23 09:59:05 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e95 e95: 6 total, 6 up, 6 in
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.323 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Snapshot image upload complete
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.323 281956 DEBUG nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.384 281956 INFO nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelve offloading
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.390 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.394 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.395 281956 DEBUG nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.397 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.398 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquired lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.398 281956 DEBUG nova.network.neutron [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.470 281956 DEBUG nova.network.neutron [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 23 09:59:06 np0005532585.localdomain ceph-mon[300199]: osdmap e95: 6 total, 6 up, 6 in
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.733 281956 DEBUG nova.network.neutron [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.747 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Releasing lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.755 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 09:59:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:06.756 281956 DEBUG nova.objects.instance [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'resources' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:07.449 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting instance files /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del
Nov 23 09:59:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:07.450 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deletion of /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del complete
Nov 23 09:59:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:07.560 281956 INFO nova.scheduler.client.report [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Deleted allocations for instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7
Nov 23 09:59:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:07.606 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:07.607 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:07.649 281956 DEBUG oslo_concurrency.processutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:07 np0005532585.localdomain ceph-mon[300199]: pgmap v123: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.8 MiB/s wr, 138 op/s
Nov 23 09:59:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:59:08 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/273386736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.111 281956 DEBUG oslo_concurrency.processutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.120 281956 DEBUG nova.compute.provider_tree [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.134 281956 DEBUG nova.scheduler.client.report [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.153 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.213 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 30.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:08.353 2 INFO neutron.agent.securitygroups_rpc [None req-0bae4724-8fa8-4216-8cb0-34bcdfbbc61a 4b677b000abe4b0687ff1afcd1016893 2a693c1f03094401b2a83bfa038e2d85 - - default default] Security group member updated ['e11e3507-78f9-4b55-80fe-2aa7bb5d486d']
Nov 23 09:59:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e96 e96: 6 total, 6 up, 6 in
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.551 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.551 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.552 281956 INFO nova.compute.manager [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Unshelving
Nov 23 09:59:08 np0005532585.localdomain dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 1 addresses
Nov 23 09:59:08 np0005532585.localdomain podman[313038]: 2025-11-23 09:59:08.595612704 +0000 UTC m=+0.066751969 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 09:59:08 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 09:59:08 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.642 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.643 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.646 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.661 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.678 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.679 281956 INFO nova.compute.claims [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Claim successful on node np0005532585.localdomain
Nov 23 09:59:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/273386736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:08 np0005532585.localdomain ceph-mon[300199]: osdmap e96: 6 total, 6 up, 6 in
Nov 23 09:59:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:08.812 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:59:09 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3173906222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:09Z|00140|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.245 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.252 281956 DEBUG nova.compute.provider_tree [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.268 281956 DEBUG nova.scheduler.client.report [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.287 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.291 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:09 np0005532585.localdomain dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 0 addresses
Nov 23 09:59:09 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 09:59:09 np0005532585.localdomain dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 09:59:09 np0005532585.localdomain podman[313098]: 2025-11-23 09:59:09.403012853 +0000 UTC m=+0.057490166 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.540 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.541 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquired lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.541 281956 DEBUG nova.network.neutron [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.577 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:09Z|00141|binding|INFO|Releasing lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 from this chassis (sb_readonly=0)
Nov 23 09:59:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:09Z|00142|binding|INFO|Setting lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 down in Southbound
Nov 23 09:59:09 np0005532585.localdomain kernel: device tapb7d31e03-f6 left promiscuous mode
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.587 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a693c1f03094401b2a83bfa038e2d85', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71365ad6-1587-4285-8187-e9f4a0e26a00, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b7d31e03-f6de-49f6-a46f-b6861bfb0ba8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.589 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 in datapath 0a868746-0c5d-4cb5-b569-e1ea427d7eaf unbound from our chassis
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.592 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:59:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:09.593 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[79823d54-92c3-4a36-ab3c-46fb31902cf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.602 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:09.618 281956 DEBUG nova.network.neutron [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 23 09:59:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e97 e97: 6 total, 6 up, 6 in
Nov 23 09:59:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3173906222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:09 np0005532585.localdomain ceph-mon[300199]: pgmap v125: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 7.8 MiB/s wr, 266 op/s
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.177 281956 DEBUG nova.network.neutron [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.192 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Releasing lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.194 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.195 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating image(s)
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.232 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.237 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.289 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.330 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.336 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "d1ec78b5f1b07f2e087c6afff3b972c481121dc6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.338 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "d1ec78b5f1b07f2e087c6afff3b972c481121dc6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.390 281956 DEBUG nova.virt.libvirt.imagebackend [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Image locations are: [{'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.477 281956 DEBUG nova.virt.libvirt.imagebackend [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Selected location: {'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.477 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] cloning images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb@snap to None/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 23 09:59:10 np0005532585.localdomain ceph-mon[300199]: osdmap e97: 6 total, 6 up, 6 in
Nov 23 09:59:10 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e98 e98: 6 total, 6 up, 6 in
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.808 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.809 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.814 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d5a967a-7bc8-43fd-928e-f2f11f2c1621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.809344', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f027764-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '4c3903669b08a5e6e74441ca00461a2b4d4080a6ad00ff24e8048adf1ddb409c'}]}, 'timestamp': '2025-11-23 09:59:10.815120', '_unique_id': '9228642f5fff473c96ababa3a3b718bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:10.820 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "d1ec78b5f1b07f2e087c6afff3b972c481121dc6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.482s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73002d66-fb2f-44b1-957a-ca981c4773b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.818306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f083bc2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '007384d2f4824aede4156938503e74e6763deb3cad7e4bf2800e75912e3be00e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.818306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f085170-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'e44f70d648f3fe59b7e609300be55e27241bbf51372b8735a012e9f4751abe55'}]}, 'timestamp': '2025-11-23 09:59:10.853179', '_unique_id': '61272adc7f6744fda9c0b7ce042812fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.857 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '960bfdfb-fb3f-40f3-8330-fed6ee8b3533', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.857699', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f092ad2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'fbbf1b18b5ee9ce2e5710acd8f3f5b58d2ccec53a758661c920f1842b27b4897'}]}, 'timestamp': '2025-11-23 09:59:10.859122', '_unique_id': 'c68026bb9ccd4e73a396823941fedef1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88a6350d-3028-42b5-aca3-cfaa08dd8420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.862800', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f09e6ca-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'f173d672340e88dd3904e8ad794a4c398ef9fa71542b0c9900163191192321ac'}]}, 'timestamp': '2025-11-23 09:59:10.863576', '_unique_id': '21faabe0b62f4f44afba4765778a8fa1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee846f90-3755-4d18-907c-b77e73ab670b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.867230', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0a88fa-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'abec897939af0d768316f63b73b4b2c0fdf1299ebc135961ce7420d03b2b5a27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.867230', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0a9b1a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '59f6df8d40d90ddccf35503f84b93df0e8c0a20be850f76689808d5528bb7b59'}]}, 'timestamp': '2025-11-23 09:59:10.868185', '_unique_id': '61d5fba8cf0142fe885c0b837174ef9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9010612-b30a-4cde-9ff8-2e9784892680', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.871719', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0b39a8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '4684e771ffdde6eff2e073f594d235701ce0d28b909111aaf8c6dcc9947a9c68'}]}, 'timestamp': '2025-11-23 09:59:10.872361', '_unique_id': 'fce7870049c14db18f12e688d6aaf673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.875 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dba06eda-2118-4997-a7d3-d145d335d797', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.875265', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0bc0c6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '0754996772b7d133605cf1c00ed7b64ad783b2b2c41cbc2334cb87499cc91ebd'}]}, 'timestamp': '2025-11-23 09:59:10.875612', '_unique_id': '6189c6a91826422fa5f969fef20f5953'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b21eaff-9513-4faa-9a75-2929f1fa6b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.877025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0c039c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '52ebe5008ff29e360310d72888c536bedbaecbb06db91516dc257217039a63a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.877025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0c0edc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '78a67f829c6f36702d8aa71b8e8efe1737a9fe3909efd30cec802c790b6b0f95'}]}, 'timestamp': '2025-11-23 09:59:10.877590', '_unique_id': 'de1ba1d27b254b7b891d91dbf5cb2234'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa968336-cc32-4d32-bb65-089a63f6cb4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.879488', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0c6440-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'e66528cf7217ddb64e14de845b83cdb24e00cb7cfbebe2db00cc64af4b8c5494'}]}, 'timestamp': '2025-11-23 09:59:10.879798', '_unique_id': '0dd49377b76040169dc7b38df6143457'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6d84d20-6d20-4691-b09b-39147c8b1cec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.881733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0e6092-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': '2f6ee57967d62b6e1b5e7ed73b3d333818ed02030d8c7beaaa851ebdfa7d8dd4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.881733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0e751e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': '5788e29f1702db7134227b417e3f5023c6910559d3ffc91f950be0e43ef8f573'}]}, 'timestamp': '2025-11-23 09:59:10.893333', '_unique_id': 'c7ae7e2e2e2846aea19be6cb8f74b8b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe63f4ca-ca83-429e-bee3-0e81d830dab4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.895434', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0ed45a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'f1950e038edfe449da496ec7cafd709ba50396d4c263ede93dd10411c31ef4ea'}]}, 'timestamp': '2025-11-23 09:59:10.895782', '_unique_id': 'e535fe8b0ec54cdba6ec508437981886'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51e28022-709a-4a85-9dc2-ddd652e61e63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.897448', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0f227a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '36f4275bc057a52a9888c2ca3c45e4ce065e1a7feb7ad30c700da5c1b3efca19'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.897448', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0f2ffe-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '3c9b86e8430bd35b24eea4504cd5723fcfcc628c99e878a6a084ebf6a111334a'}]}, 'timestamp': '2025-11-23 09:59:10.898160', '_unique_id': 'a7bf4cdb7edc49faaab610089ba150da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36ad4d64-4288-417b-9234-bfcc2ca57a43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.900332', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0f923c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '7b8e2f0e2f8b1b3c3b89eb624679d468bcfdbaa76f2dd73cf57a2b27a3d91103'}]}, 'timestamp': '2025-11-23 09:59:10.900640', '_unique_id': '33501276c2624320a8f879dbba250b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '339f231f-756d-4bce-b640-b9b46109912c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.902012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0fd440-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': '81519590db75c10b3e1ad8e4bbf8799db4b6e7518b40af49af1f9dc2acefd629'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.902012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0fde54-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': 'c21ef9526d8b7a753dd7cd5903fa4f1e90d18e4b1cd5bf984d1e3f84f8be5284'}]}, 'timestamp': '2025-11-23 09:59:10.902560', '_unique_id': '6c172542001742ec954a108a10b945a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9d80885-1bf6-4459-b128-03ffedb978b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.903885', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f101d56-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': 'b08a00c7477358e862aee5045f4548d884661ba464b9dfc6f1632a89086d738b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.903885', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f102724-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': 'b50b58605d2a71106b76f1f9c0325c41e9136a76af372572a6fda630a473ec71'}]}, 'timestamp': '2025-11-23 09:59:10.904424', '_unique_id': '6fa15a57a82e46faa6fdf9613c939acf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d4826b5-53a4-44f4-bc09-f4c7bb9c8ef7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:59:10.905738', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0f12e860-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.099510459, 'message_signature': 'cecee09bda2d392fe42f55850a793c70a210699641fa1f414ba3f6f1bdba2b39'}]}, 'timestamp': '2025-11-23 09:59:10.922550', '_unique_id': '2f3d0f88a5bd44f08eb9913787993c9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 15950000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f6734b7-f3df-4c28-adab-66ba7d41d35a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15950000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:59:10.924402', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0f133e64-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.099510459, 'message_signature': 'bf52a1e27c46778318090bf594f26bbeceedf3d0bff631fd0f6c7b485b04e06e'}]}, 'timestamp': '2025-11-23 09:59:10.924748', '_unique_id': 'a0773c5e33ca4d36a45b3ac13127185a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f080e37-3358-423e-aa62-5b60683c8169', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.926332', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f1389be-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '441bb17b322ab83fae1454090dcb0102ceb5f7b303ece49dd90fa80cc821e390'}]}, 'timestamp': '2025-11-23 09:59:10.926634', '_unique_id': '6875085e12ba4e14a68ab778f5584fde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18dcce94-e273-4497-8bcb-303751d4de9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.929014', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f1401fa-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'af584afff2a4adb748122b5dc4c82336a3b75970bcdc7cfcac660672608eb479'}]}, 'timestamp': '2025-11-23 09:59:10.929736', '_unique_id': '5b004c0248b04a6499685b7b532ae9a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b9e6183-a52c-4c42-9df1-a5b5b39ee686', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.931505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f14539e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'bfbfce4be5875210f9191ad9539e0b41c5778cb71c5e14505fc553a03f00367e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.931505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f1464e2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'c726d5885349747b7e0247e384233017662fa605b1c50771a5a2b327a847ed52'}]}, 'timestamp': '2025-11-23 09:59:10.932319', '_unique_id': '401b1a0067234ce58c2193cb3e909702'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fa5640d-601a-4478-8775-862e1c42158a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.934082', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f14b834-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '9005cfa4c5d4893cb3a542d4553f45f4b2ac95392583e40381baf85b98a9bce0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.934082', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f14c216-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '7fb417122aa73d3196daf0f3f1ab32ad9b09d73638b5ab58b4f7b394e4ead79b'}]}, 'timestamp': '2025-11-23 09:59:10.934606', '_unique_id': 'fc84eb5393ac4687bc67f29e453096c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 09:59:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging 
Nov 23 09:59:11 np0005532585.localdomain systemd[1]: tmp-crun.JgpR9K.mount: Deactivated successfully.
Nov 23 09:59:11 np0005532585.localdomain podman[313298]: 2025-11-23 09:59:11.063223671 +0000 UTC m=+0.073532935 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:59:11 np0005532585.localdomain dnsmasq[312818]: exiting on receipt of SIGTERM
Nov 23 09:59:11 np0005532585.localdomain systemd[1]: libpod-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720.scope: Deactivated successfully.
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.111 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:11 np0005532585.localdomain podman[313331]: 2025-11-23 09:59:11.135637292 +0000 UTC m=+0.054237952 container died 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:59:11 np0005532585.localdomain systemd[1]: tmp-crun.8OeMIG.mount: Deactivated successfully.
Nov 23 09:59:11 np0005532585.localdomain podman[313331]: 2025-11-23 09:59:11.161776177 +0000 UTC m=+0.080376817 container cleanup 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:59:11 np0005532585.localdomain systemd[1]: libpod-conmon-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720.scope: Deactivated successfully.
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.206 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] flattening vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 23 09:59:11 np0005532585.localdomain podman[313332]: 2025-11-23 09:59:11.216341838 +0000 UTC m=+0.127307223 container remove 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.261 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.262 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.367 281956 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763891936.366917, 8f62292f-5719-4b19-9188-3715b94493a7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.368 281956 INFO nova.compute.manager [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] VM Stopped (Lifecycle Event)
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.391 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.402 281956 DEBUG nova.compute.manager [None req-6ed1d632-97b7-40f4-819e-fe3170f6833f - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:11.574 263258 INFO neutron.agent.dhcp.agent [None req-019f614a-0716-4c02-9a77-9686dcef5e01 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:11.575 263258 INFO neutron.agent.dhcp.agent [None req-019f614a-0716-4c02-9a77-9686dcef5e01 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:11 np0005532585.localdomain ceph-mon[300199]: osdmap e98: 6 total, 6 up, 6 in
Nov 23 09:59:11 np0005532585.localdomain ceph-mon[300199]: pgmap v128: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 8.3 KiB/s wr, 133 op/s
Nov 23 09:59:11 np0005532585.localdomain podman[240668]: time="2025-11-23T09:59:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:59:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:59:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 09:59:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:59:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19234 "" "Go-http-client/1.1"
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.987 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Image rbd:vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.988 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.988 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Ensure instance console log exists: /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.988 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.989 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.989 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.991 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-11-23T09:58:37Z,direct_url=<?>,disk_format='raw',id=7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2005076685-shelved',owner='37a58b702f564a81ab5a59cf4201b4f0',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-11-23T09:59:06Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'image_id': 'c5806483-57a8-4254-b41b-254b888c8606'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.995 281956 WARNING nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.997 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.997 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.999 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 23 09:59:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:11.999 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.000 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.000 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T09:56:44Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43b374b4-75d9-47f9-aa6b-ddb1a45f7c04',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-11-23T09:58:37Z,direct_url=<?>,disk_format='raw',id=7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2005076685-shelved',owner='37a58b702f564a81ab5a59cf4201b4f0',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-11-23T09:59:06Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.000 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.002 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.002 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.002 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.003 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.003 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.003 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.016 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-2e82ddd04bc2b03081d5a523028e9859714aedc10be59a8137ed30fe358e9a28-merged.mount: Deactivated successfully.
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d0a868746\x2d0c5d\x2d4cb5\x2db569\x2de1ea427d7eaf.mount: Deactivated successfully.
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 09:59:12 np0005532585.localdomain sshd[313412]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.316 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.317 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.318 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.318 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:12.398 2 INFO neutron.agent.securitygroups_rpc [req-a16da276-a12d-4c8d-9117-64c33f913ca9 req-9efe9caa-9e38-4b50-8b4e-539fa928addc 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group member updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']
Nov 23 09:59:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:12.420 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e99 e99: 6 total, 6 up, 6 in
Nov 23 09:59:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 09:59:12 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3540974265' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.601 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.636 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.645 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:12 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 1 addresses
Nov 23 09:59:12 np0005532585.localdomain podman[313449]: 2025-11-23 09:59:12.7063986 +0000 UTC m=+0.062795445 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 09:59:12 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:59:12 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: tmp-crun.2v4RVR.mount: Deactivated successfully.
Nov 23 09:59:12 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:12Z|00143|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.807 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:12 np0005532585.localdomain podman[313466]: 2025-11-23 09:59:12.81999679 +0000 UTC m=+0.087305491 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Nov 23 09:59:12 np0005532585.localdomain podman[313465]: 2025-11-23 09:59:12.830880105 +0000 UTC m=+0.099320071 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.834 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.851 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:59:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:12.853 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 09:59:12 np0005532585.localdomain podman[313465]: 2025-11-23 09:59:12.864215282 +0000 UTC m=+0.132655288 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:59:12 np0005532585.localdomain podman[313464]: 2025-11-23 09:59:12.875580302 +0000 UTC m=+0.144635767 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:59:12 np0005532585.localdomain podman[313466]: 2025-11-23 09:59:12.903813882 +0000 UTC m=+0.171122583 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 23 09:59:12 np0005532585.localdomain podman[313464]: 2025-11-23 09:59:12.911476128 +0000 UTC m=+0.180531593 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:59:12 np0005532585.localdomain sshd[313412]: Invalid user postgres from 207.154.194.2 port 46040
Nov 23 09:59:12 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:59:13 np0005532585.localdomain sshd[313412]: Received disconnect from 207.154.194.2 port 46040:11: Bye Bye [preauth]
Nov 23 09:59:13 np0005532585.localdomain sshd[313412]: Disconnected from invalid user postgres 207.154.194.2 port 46040 [preauth]
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.097 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.100 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.126 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] End _get_guest_xml xml=<domain type="kvm">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <uuid>8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7</uuid>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <name>instance-00000009</name>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <memory>131072</memory>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <vcpu>1</vcpu>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <metadata>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-2005076685</nova:name>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:creationTime>2025-11-23 09:59:11</nova:creationTime>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:flavor name="m1.nano">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:memory>128</nova:memory>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:disk>1</nova:disk>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:swap>0</nova:swap>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:ephemeral>0</nova:ephemeral>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:vcpus>1</nova:vcpus>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       </nova:flavor>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:owner>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:user uuid="55581f20ed8d4bd8a61a81c525ca8141">tempest-UnshelveToHostMultiNodesTest-612486733-project-member</nova:user>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <nova:project uuid="37a58b702f564a81ab5a59cf4201b4f0">tempest-UnshelveToHostMultiNodesTest-612486733</nova:project>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       </nova:owner>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:root type="image" uuid="7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <nova:ports/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </nova:instance>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </metadata>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <sysinfo type="smbios">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <system>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <entry name="manufacturer">RDO</entry>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <entry name="product">OpenStack Compute</entry>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <entry name="serial">8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7</entry>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <entry name="uuid">8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7</entry>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <entry name="family">Virtual Machine</entry>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </system>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </sysinfo>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <os>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <boot dev="hd"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <smbios mode="sysinfo"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </os>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <features>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <acpi/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <apic/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <vmcoreinfo/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </features>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <clock offset="utc">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <timer name="pit" tickpolicy="delay"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <timer name="hpet" present="no"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </clock>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <cpu mode="host-model" match="exact">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <topology sockets="1" cores="1" threads="1"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </cpu>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   <devices>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <disk type="network" device="disk">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <driver type="raw" cache="none"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <source protocol="rbd" name="vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.103" port="6789"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.104" port="6789"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.105" port="6789"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       </source>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <auth username="openstack">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <secret type="ceph" uuid="46550e70-79cb-5f55-bf6d-1204b97e083b"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       </auth>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <target dev="vda" bus="virtio"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <disk type="network" device="cdrom">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <driver type="raw" cache="none"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <source protocol="rbd" name="vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.103" port="6789"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.104" port="6789"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <host name="172.18.0.105" port="6789"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       </source>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <auth username="openstack">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:         <secret type="ceph" uuid="46550e70-79cb-5f55-bf6d-1204b97e083b"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       </auth>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <target dev="sda" bus="sata"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </disk>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <serial type="pty">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <log file="/var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/console.log" append="off"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </serial>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <video>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <model type="virtio"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </video>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <input type="tablet" bus="usb"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <input type="keyboard" bus="usb"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <rng model="virtio">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <backend model="random">/dev/urandom</backend>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </rng>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="pci" model="pcie-root-port"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <controller type="usb" index="0"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     <memballoon model="virtio">
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:       <stats period="10"/>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:     </memballoon>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:   </devices>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: </domain>
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.181 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.182 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.183 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Using config drive
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.221 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.228 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.243 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.288 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'keypairs' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.366 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating config drive at /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.378 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkac12wxz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:13 np0005532585.localdomain ceph-mon[300199]: osdmap e99: 6 total, 6 up, 6 in
Nov 23 09:59:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3540974265' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:59:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2564275816' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:59:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3568239067' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:13 np0005532585.localdomain ceph-mon[300199]: pgmap v130: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 115 KiB/s rd, 9.9 KiB/s wr, 158 op/s
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.509 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkac12wxz" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.554 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.559 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.779 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:13.781 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting local config drive /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config because it was imported into RBD.
Nov 23 09:59:13 np0005532585.localdomain systemd-machined[84275]: New machine qemu-5-instance-00000009.
Nov 23 09:59:13 np0005532585.localdomain systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.209 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Removed pending event for 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.210 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891954.2086325, 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.211 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Resumed (Lifecycle Event)
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.215 281956 DEBUG nova.compute.manager [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.216 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.217 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.217 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.222 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance spawned successfully.
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.241 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.246 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.278 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.279 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event <LifecycleEvent: 1763891954.2100148, 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.279 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Started (Lifecycle Event)
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.298 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.302 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.320 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 23 09:59:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e100 e100: 6 total, 6 up, 6 in
Nov 23 09:59:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1089133868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:14 np0005532585.localdomain ceph-mon[300199]: osdmap e100: 6 total, 6 up, 6 in
Nov 23 09:59:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:14.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:15 np0005532585.localdomain dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 0 addresses
Nov 23 09:59:15 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host
Nov 23 09:59:15 np0005532585.localdomain podman[313689]: 2025-11-23 09:59:15.007323041 +0000 UTC m=+0.065866810 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:59:15 np0005532585.localdomain dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts
Nov 23 09:59:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:15.111 281956 DEBUG nova.compute.manager [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:15 np0005532585.localdomain kernel: device tapd6f3b7ff-1b left promiscuous mode
Nov 23 09:59:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:15Z|00144|binding|INFO|Releasing lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 from this chassis (sb_readonly=0)
Nov 23 09:59:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:15Z|00145|binding|INFO|Setting lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 down in Southbound
Nov 23 09:59:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:15.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:15.285 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0497de4959b2494e8036eb39226430d6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54e00d1b-ba48-40e5-8228-7e38f918fa79, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d6f3b7ff-1bfe-4568-bcbd-2732186dba70) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:15.290 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d6f3b7ff-1bfe-4568-bcbd-2732186dba70 in datapath c5d88dfa-0db8-489e-a45a-e843e31a3b26 unbound from our chassis
Nov 23 09:59:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:15.293 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5d88dfa-0db8-489e-a45a-e843e31a3b26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:59:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:15.295 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8ededb-7ca5-4552-8e45-818dad82f0b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:15.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:15.351 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 6.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/929396430' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:15 np0005532585.localdomain ceph-mon[300199]: pgmap v132: 177 pgs: 177 active+clean; 304 MiB data, 1020 MiB used, 41 GiB / 42 GiB avail; 8.6 MiB/s rd, 8.4 MiB/s wr, 375 op/s
Nov 23 09:59:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:16Z|00146|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.251 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.251 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.252 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.252 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.253 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.307 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.392 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1667305845' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.621 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.621 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.622 281956 INFO nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelving
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.649 281956 DEBUG nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 23 09:59:16 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:59:16 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3593812547' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.815 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.899 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.899 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.913 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:59:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:16.913 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.246 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.248 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11169MB free_disk=41.7004280090332GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.249 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.249 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:17 np0005532585.localdomain systemd[1]: tmp-crun.5DSdTd.mount: Deactivated successfully.
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.343 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.343 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.344 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.344 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 09:59:17 np0005532585.localdomain dnsmasq[310864]: exiting on receipt of SIGTERM
Nov 23 09:59:17 np0005532585.localdomain podman[313751]: 2025-11-23 09:59:17.346419028 +0000 UTC m=+0.088068274 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:59:17 np0005532585.localdomain systemd[1]: libpod-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf.scope: Deactivated successfully.
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.363 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.387 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.388 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.414 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 09:59:17 np0005532585.localdomain podman[313764]: 2025-11-23 09:59:17.425451873 +0000 UTC m=+0.064680584 container died 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.439 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 09:59:17 np0005532585.localdomain podman[313764]: 2025-11-23 09:59:17.462831474 +0000 UTC m=+0.102060195 container cleanup 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 09:59:17 np0005532585.localdomain systemd[1]: libpod-conmon-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf.scope: Deactivated successfully.
Nov 23 09:59:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e101 e101: 6 total, 6 up, 6 in
Nov 23 09:59:17 np0005532585.localdomain podman[313766]: 2025-11-23 09:59:17.51171123 +0000 UTC m=+0.141066497 container remove 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:59:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:17.580 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1793000182' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3593812547' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:17 np0005532585.localdomain ceph-mon[300199]: pgmap v133: 177 pgs: 177 active+clean; 304 MiB data, 1020 MiB used, 41 GiB / 42 GiB avail; 7.3 MiB/s rd, 7.1 MiB/s wr, 319 op/s
Nov 23 09:59:17 np0005532585.localdomain ceph-mon[300199]: osdmap e101: 6 total, 6 up, 6 in
Nov 23 09:59:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:17.656 263258 INFO neutron.agent.dhcp.agent [None req-fc0489cb-10d2-4fc6-807c-0937bd6825af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:17.867 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:59:18 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2514870888' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:18.030 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:18.035 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:59:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:18.053 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:59:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:18.091 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 09:59:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:18.092 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7e36441351384cf4ce607d42d9d2a693f2618154ca741337c33c1071de55a0ca-merged.mount: Deactivated successfully.
Nov 23 09:59:18 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:18 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dc5d88dfa\x2d0db8\x2d489e\x2da45a\x2de843e31a3b26.mount: Deactivated successfully.
Nov 23 09:59:18 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2514870888' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:18.807 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:19.093 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:19.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:19 np0005532585.localdomain ceph-mon[300199]: pgmap v135: 177 pgs: 177 active+clean; 225 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 10 MiB/s rd, 6.8 MiB/s wr, 457 op/s
Nov 23 09:59:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:20.208 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 09:59:20 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:20.230 2 INFO neutron.agent.securitygroups_rpc [req-e10db1c6-11f3-4ff7-8a20-47058bab960f req-be246029-4620-443a-8a27-dc66d74bf8a5 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['df6d8f7b-74cc-4864-a7e2-24c32662f7e1']
Nov 23 09:59:20 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:20.724 2 INFO neutron.agent.securitygroups_rpc [req-b63466c9-444c-4747-806e-6e70f6ca8dbf req-1b97f9eb-afe9-470d-a373-457d04103769 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['486481c0-58d7-474c-ac28-9109e6d75e3e']
Nov 23 09:59:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:21.162 263258 INFO neutron.agent.linux.ip_lib [None req-4e36a782-7349-49d3-83e9-cb06f4ff3168 - - - - - -] Device tapbaef0101-f3 cannot be used as it has no MAC address
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.217 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain kernel: device tapbaef0101-f3 entered promiscuous mode
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.225 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891961.2261] manager: (tapbaef0101-f3): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Nov 23 09:59:21 np0005532585.localdomain systemd-udevd[313826]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:59:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:21Z|00147|binding|INFO|Claiming lport baef0101-f381-4af9-b095-1f116c8d43cf for this chassis.
Nov 23 09:59:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:21Z|00148|binding|INFO|baef0101-f381-4af9-b095-1f116c8d43cf: Claiming unknown
Nov 23 09:59:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:21.238 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68f3d06d-7ad2-4e09-a769-ead39666c244, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=baef0101-f381-4af9-b095-1f116c8d43cf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:21.240 160439 INFO neutron.agent.ovn.metadata.agent [-] Port baef0101-f381-4af9-b095-1f116c8d43cf in datapath 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7 bound to our chassis
Nov 23 09:59:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:21.242 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 69bb29d4-192f-481a-aefe-e710b536360e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 09:59:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:21.243 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:59:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:21.244 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[577b8ee4-4aa1-482d-bd6f-393ea04d349c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:21Z|00149|binding|INFO|Setting lport baef0101-f381-4af9-b095-1f116c8d43cf ovn-installed in OVS
Nov 23 09:59:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:21Z|00150|binding|INFO|Setting lport baef0101-f381-4af9-b095-1f116c8d43cf up in Southbound
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.274 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.293 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbaef0101-f3: No such device
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.358 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:21.394 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:21 np0005532585.localdomain ceph-mon[300199]: pgmap v136: 177 pgs: 177 active+clean; 225 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 8.9 MiB/s rd, 5.8 MiB/s wr, 392 op/s
Nov 23 09:59:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:21.669 2 INFO neutron.agent.securitygroups_rpc [req-5b5e1aae-0130-44ef-b3c8-2b4a33b1f155 req-73054398-e6b7-4548-a319-a659c6c54985 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['9f0e447c-560b-475e-bb8e-29f8dd459211']
Nov 23 09:59:22 np0005532585.localdomain podman[313897]: 
Nov 23 09:59:22 np0005532585.localdomain podman[313897]: 2025-11-23 09:59:22.276130991 +0000 UTC m=+0.102893840 container create 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:59:22 np0005532585.localdomain systemd[1]: Started libpod-conmon-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96.scope.
Nov 23 09:59:22 np0005532585.localdomain podman[313897]: 2025-11-23 09:59:22.22873806 +0000 UTC m=+0.055500900 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:59:22 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:59:22 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d1bd68d3770234ddf8ac90b3c6d8de140a8a1263855d13554012024e6eb7216/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:59:22 np0005532585.localdomain podman[313897]: 2025-11-23 09:59:22.356349502 +0000 UTC m=+0.183112361 container init 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:59:22 np0005532585.localdomain podman[313897]: 2025-11-23 09:59:22.365002018 +0000 UTC m=+0.191764867 container start 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 09:59:22 np0005532585.localdomain dnsmasq[313916]: started, version 2.85 cachesize 150
Nov 23 09:59:22 np0005532585.localdomain dnsmasq[313916]: DNS service limited to local subnets
Nov 23 09:59:22 np0005532585.localdomain dnsmasq[313916]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:59:22 np0005532585.localdomain dnsmasq[313916]: warning: no upstream servers configured
Nov 23 09:59:22 np0005532585.localdomain dnsmasq-dhcp[313916]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:59:22 np0005532585.localdomain dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 0 addresses
Nov 23 09:59:22 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host
Nov 23 09:59:22 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts
Nov 23 09:59:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:22.428 263258 INFO neutron.agent.dhcp.agent [None req-6c086316-9d8e-4945-aca8-3830dd2bc335 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:21Z, description=, device_id=89783321-05cb-41ec-bfca-a08a32ddb0e0, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8be0400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c942d310>], id=31fc4967-c13f-4c2d-85f7-b02043b66946, ip_allocation=immediate, mac_address=fa:16:3e:e2:c2:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:17Z, description=, dns_domain=, id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2006905300, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19922, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=889, status=ACTIVE, subnets=['c95d6240-1bf0-485d-b67f-51fb71cd6afd'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:19Z, vlan_transparent=None, network_id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=957, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:21Z on network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7
Nov 23 09:59:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e102 e102: 6 total, 6 up, 6 in
Nov 23 09:59:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:22.509 263258 INFO neutron.agent.dhcp.agent [None req-b83a353c-8acf-4179-8fe0-c32627ec534e - - - - - -] DHCP configuration for ports {'f40396a1-ffcc-4210-912d-fa3a2a29cc54'} is completed
Nov 23 09:59:22 np0005532585.localdomain dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 1 addresses
Nov 23 09:59:22 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host
Nov 23 09:59:22 np0005532585.localdomain podman[313932]: 2025-11-23 09:59:22.688426922 +0000 UTC m=+0.063403904 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:59:22 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts
Nov 23 09:59:22 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:22.697 2 INFO neutron.agent.securitygroups_rpc [req-2f39ed52-ad77-4aa6-9471-651d11ecbf13 req-f42d6273-96eb-4a84-b6a8-20685191fd4a 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['9d3d4eb8-5be7-4867-b930-e62b16d22d58']
Nov 23 09:59:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:59:22 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:59:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:22.966 263258 INFO neutron.agent.dhcp.agent [None req-28ad30cc-102c-46f6-9e38-400cc098b6ec - - - - - -] DHCP configuration for ports {'31fc4967-c13f-4c2d-85f7-b02043b66946'} is completed
Nov 23 09:59:23 np0005532585.localdomain podman[313953]: 2025-11-23 09:59:23.022233135 +0000 UTC m=+0.080360156 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 09:59:23 np0005532585.localdomain podman[313953]: 2025-11-23 09:59:23.034155832 +0000 UTC m=+0.092282913 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:59:23 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:59:23 np0005532585.localdomain podman[313952]: 2025-11-23 09:59:23.100387983 +0000 UTC m=+0.159813814 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:59:23 np0005532585.localdomain podman[313952]: 2025-11-23 09:59:23.112847726 +0000 UTC m=+0.172273537 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:59:23 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:59:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:23.261 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:21Z, description=, device_id=89783321-05cb-41ec-bfca-a08a32ddb0e0, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c94268b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9426040>], id=31fc4967-c13f-4c2d-85f7-b02043b66946, ip_allocation=immediate, mac_address=fa:16:3e:e2:c2:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:17Z, description=, dns_domain=, id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2006905300, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19922, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=889, status=ACTIVE, subnets=['c95d6240-1bf0-485d-b67f-51fb71cd6afd'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:19Z, vlan_transparent=None, network_id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=957, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:21Z on network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7
Nov 23 09:59:23 np0005532585.localdomain ceph-mon[300199]: osdmap e102: 6 total, 6 up, 6 in
Nov 23 09:59:23 np0005532585.localdomain ceph-mon[300199]: pgmap v138: 177 pgs: 177 active+clean; 225 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 1.7 KiB/s wr, 130 op/s
Nov 23 09:59:23 np0005532585.localdomain systemd[1]: tmp-crun.K0MHoS.mount: Deactivated successfully.
Nov 23 09:59:23 np0005532585.localdomain dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 1 addresses
Nov 23 09:59:23 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host
Nov 23 09:59:23 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts
Nov 23 09:59:23 np0005532585.localdomain podman[314011]: 2025-11-23 09:59:23.52132715 +0000 UTC m=+0.075976541 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 09:59:23 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:23.530 2 INFO neutron.agent.securitygroups_rpc [req-9fd26688-f794-4469-9fd8-a5b40d60592d req-5ebc8662-650f-469d-8c45-5ce5c30495b8 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['b40df903-b9f3-4a1c-8419-71383dae71f9']
Nov 23 09:59:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:23.722 263258 INFO neutron.agent.dhcp.agent [None req-4551f10f-a18f-43fa-82a2-f6c777bb83cd - - - - - -] DHCP configuration for ports {'31fc4967-c13f-4c2d-85f7-b02043b66946'} is completed
Nov 23 09:59:23 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:23.997 2 INFO neutron.agent.securitygroups_rpc [req-e3ac8e09-5876-4e41-80c7-46043b4c6329 req-ca05e120-41d0-4e85-be51-0d5858a51936 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['b40df903-b9f3-4a1c-8419-71383dae71f9']
Nov 23 09:59:24 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:24.404 2 INFO neutron.agent.securitygroups_rpc [None req-55c9cfb2-f59a-42d6-ace6-61788e22f102 f30cb7ce3bac485ca16e284ef2514162 493833d8fb394637b29c3fb2052aca9c - - default default] Security group member updated ['6a5ca8fc-febe-492b-8ed6-1c2faceb11b7']
Nov 23 09:59:24 np0005532585.localdomain sshd[314033]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:59:24 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:24.911 2 INFO neutron.agent.securitygroups_rpc [None req-dfbbbdd6-9764-4926-ad6f-603dbba55323 f30cb7ce3bac485ca16e284ef2514162 493833d8fb394637b29c3fb2052aca9c - - default default] Security group member updated ['6a5ca8fc-febe-492b-8ed6-1c2faceb11b7']
Nov 23 09:59:24 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:24.922 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:24 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:24.929 2 INFO neutron.agent.securitygroups_rpc [req-3634fbe3-812b-4789-a7d3-12a0d9366017 req-05979dae-61e1-4fc8-b138-901b668995d3 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['b40df903-b9f3-4a1c-8419-71383dae71f9']
Nov 23 09:59:25 np0005532585.localdomain ceph-mon[300199]: pgmap v139: 177 pgs: 177 active+clean; 225 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 1.7 KiB/s wr, 130 op/s
Nov 23 09:59:25 np0005532585.localdomain sshd[314033]: Invalid user testftp from 175.126.166.172 port 51612
Nov 23 09:59:25 np0005532585.localdomain sshd[314033]: Received disconnect from 175.126.166.172 port 51612:11: Bye Bye [preauth]
Nov 23 09:59:25 np0005532585.localdomain sshd[314033]: Disconnected from invalid user testftp 175.126.166.172 port 51612 [preauth]
Nov 23 09:59:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:26.296 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:26.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:26.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:26.699 281956 DEBUG nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 23 09:59:27 np0005532585.localdomain ceph-mon[300199]: pgmap v140: 177 pgs: 177 active+clean; 225 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 1.4 KiB/s wr, 105 op/s
Nov 23 09:59:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:29 np0005532585.localdomain ceph-mon[300199]: pgmap v141: 177 pgs: 177 active+clean; 226 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 636 KiB/s rd, 44 KiB/s wr, 55 op/s
Nov 23 09:59:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:29.494 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:29 np0005532585.localdomain dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 0 addresses
Nov 23 09:59:29 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host
Nov 23 09:59:29 np0005532585.localdomain podman[314051]: 2025-11-23 09:59:29.550973295 +0000 UTC m=+0.065996154 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 09:59:29 np0005532585.localdomain dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts
Nov 23 09:59:29 np0005532585.localdomain systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 23 09:59:29 np0005532585.localdomain systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 13.612s CPU time.
Nov 23 09:59:29 np0005532585.localdomain systemd-machined[84275]: Machine qemu-5-instance-00000009 terminated.
Nov 23 09:59:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:29.889 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance shutdown successfully after 13 seconds.
Nov 23 09:59:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:29.900 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 09:59:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:29.901 281956 DEBUG nova.objects.instance [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:29Z|00151|binding|INFO|Releasing lport baef0101-f381-4af9-b095-1f116c8d43cf from this chassis (sb_readonly=0)
Nov 23 09:59:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:29.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:29Z|00152|binding|INFO|Setting lport baef0101-f381-4af9-b095-1f116c8d43cf down in Southbound
Nov 23 09:59:29 np0005532585.localdomain kernel: device tapbaef0101-f3 left promiscuous mode
Nov 23 09:59:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:29.931 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68f3d06d-7ad2-4e09-a769-ead39666c244, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=baef0101-f381-4af9-b095-1f116c8d43cf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:29.933 160439 INFO neutron.agent.ovn.metadata.agent [-] Port baef0101-f381-4af9-b095-1f116c8d43cf in datapath 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7 unbound from our chassis
Nov 23 09:59:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:29.935 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:59:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:29.936 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae44200-ad19-4b77-b8c3-465981eda5cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:29.945 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:59:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:59:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:59:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:29.991 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Beginning cold snapshot process
Nov 23 09:59:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:59:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:59:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:30.170 281956 DEBUG nova.virt.libvirt.imagebackend [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] No parent info for c5806483-57a8-4254-b41b-254b888c8606; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Nov 23 09:59:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:30.224 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(d0131164664c4853ad3a327c704f4dc4) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 23 09:59:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e103 e103: 6 total, 6 up, 6 in
Nov 23 09:59:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:30.586 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] cloning vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk@d0131164664c4853ad3a327c704f4dc4 to images/b6d724dc-26d8-4b53-bc02-990c8b280c9a clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 23 09:59:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:30.764 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] flattening images/b6d724dc-26d8-4b53-bc02-990c8b280c9a flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 23 09:59:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:30.841 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:31.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:31.397 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:31 np0005532585.localdomain ceph-mon[300199]: osdmap e103: 6 total, 6 up, 6 in
Nov 23 09:59:31 np0005532585.localdomain ceph-mon[300199]: pgmap v143: 177 pgs: 177 active+clean; 226 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 719 KiB/s rd, 50 KiB/s wr, 62 op/s
Nov 23 09:59:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:31.640 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:31.695 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] removing snapshot(d0131164664c4853ad3a327c704f4dc4) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Nov 23 09:59:31 np0005532585.localdomain systemd[1]: tmp-crun.7CVRpJ.mount: Deactivated successfully.
Nov 23 09:59:31 np0005532585.localdomain dnsmasq[313916]: exiting on receipt of SIGTERM
Nov 23 09:59:31 np0005532585.localdomain systemd[1]: libpod-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96.scope: Deactivated successfully.
Nov 23 09:59:31 np0005532585.localdomain podman[314218]: 2025-11-23 09:59:31.777031851 +0000 UTC m=+0.059892446 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:59:31 np0005532585.localdomain podman[314230]: 2025-11-23 09:59:31.846670906 +0000 UTC m=+0.059127852 container died 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:59:31 np0005532585.localdomain podman[314230]: 2025-11-23 09:59:31.886213574 +0000 UTC m=+0.098670490 container cleanup 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 09:59:31 np0005532585.localdomain systemd[1]: libpod-conmon-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96.scope: Deactivated successfully.
Nov 23 09:59:31 np0005532585.localdomain podman[314234]: 2025-11-23 09:59:31.925988699 +0000 UTC m=+0.127336873 container remove 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:59:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:32.155 263258 INFO neutron.agent.dhcp.agent [None req-253b9119-1abe-4cc0-951b-ba25bb348876 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 09:59:32 np0005532585.localdomain podman[314260]: 2025-11-23 09:59:32.287061333 +0000 UTC m=+0.075626751 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 09:59:32 np0005532585.localdomain podman[314260]: 2025-11-23 09:59:32.293858522 +0000 UTC m=+0.082423920 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:32 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 09:59:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:32.486 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e104 e104: 6 total, 6 up, 6 in
Nov 23 09:59:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:32.604 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(snap) on rbd image(b6d724dc-26d8-4b53-bc02-990c8b280c9a) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 23 09:59:32 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:32Z|00153|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:32 np0005532585.localdomain systemd[1]: tmp-crun.wySTIM.mount: Deactivated successfully.
Nov 23 09:59:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4d1bd68d3770234ddf8ac90b3c6d8de140a8a1263855d13554012024e6eb7216-merged.mount: Deactivated successfully.
Nov 23 09:59:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:32 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d1fc969d8\x2d6d2f\x2d49ab\x2d83f5\x2de28a94f4b4e7.mount: Deactivated successfully.
Nov 23 09:59:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:32.856 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:33 np0005532585.localdomain ceph-mon[300199]: osdmap e104: 6 total, 6 up, 6 in
Nov 23 09:59:33 np0005532585.localdomain ceph-mon[300199]: pgmap v145: 177 pgs: 177 active+clean; 226 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 796 KiB/s rd, 55 KiB/s wr, 68 op/s
Nov 23 09:59:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e105 e105: 6 total, 6 up, 6 in
Nov 23 09:59:33 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 09:59:34 np0005532585.localdomain systemd[1]: tmp-crun.PxFlut.mount: Deactivated successfully.
Nov 23 09:59:34 np0005532585.localdomain podman[314294]: 2025-11-23 09:59:34.041962763 +0000 UTC m=+0.097869986 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 09:59:34 np0005532585.localdomain podman[314294]: 2025-11-23 09:59:34.054535011 +0000 UTC m=+0.110442214 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 09:59:34 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 09:59:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:34Z|00154|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.245 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.412 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Snapshot image upload complete
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.412 281956 DEBUG nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.483 281956 INFO nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelve offloading
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.489 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.490 281956 DEBUG nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.492 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.492 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquired lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.492 281956 DEBUG nova.network.neutron [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 09:59:34 np0005532585.localdomain sudo[314318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 09:59:34 np0005532585.localdomain sudo[314318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:59:34 np0005532585.localdomain sudo[314318]: pam_unix(sudo:session): session closed for user root
Nov 23 09:59:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:34.549 281956 DEBUG nova.network.neutron [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 23 09:59:34 np0005532585.localdomain ceph-mon[300199]: osdmap e105: 6 total, 6 up, 6 in
Nov 23 09:59:34 np0005532585.localdomain sudo[314336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 09:59:34 np0005532585.localdomain sudo[314336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.040 281956 DEBUG nova.network.neutron [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.059 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Releasing lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.068 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.069 281956 DEBUG nova.objects.instance [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'resources' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 09:59:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:35.092 263258 INFO neutron.agent.linux.ip_lib [None req-6c47639e-df62-4c96-bb42-44d566d06402 - - - - - -] Device tap53af77be-60 cannot be used as it has no MAC address
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.119 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:35 np0005532585.localdomain kernel: device tap53af77be-60 entered promiscuous mode
Nov 23 09:59:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:35Z|00155|binding|INFO|Claiming lport 53af77be-60e9-4895-8dfb-1340d4308442 for this chassis.
Nov 23 09:59:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:35Z|00156|binding|INFO|53af77be-60e9-4895-8dfb-1340d4308442: Claiming unknown
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.128 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:35 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891975.1339] manager: (tap53af77be-60): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Nov 23 09:59:35 np0005532585.localdomain systemd-udevd[314400]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:59:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:35.138 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86837c26-d38c-4c74-814e-bcb8777abcca, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=53af77be-60e9-4895-8dfb-1340d4308442) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:35.139 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 53af77be-60e9-4895-8dfb-1340d4308442 in datapath 43d7b2ef-542d-499e-b6e4-ba4caed0547d bound to our chassis
Nov 23 09:59:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:35.140 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43d7b2ef-542d-499e-b6e4-ba4caed0547d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:59:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:35.142 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[230163d0-3ebd-41b3-8ee4-fb3af32ca5a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:35Z|00157|binding|INFO|Setting lport 53af77be-60e9-4895-8dfb-1340d4308442 ovn-installed in OVS
Nov 23 09:59:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:35Z|00158|binding|INFO|Setting lport 53af77be-60e9-4895-8dfb-1340d4308442 up in Southbound
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.150 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.166 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.199 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:35 np0005532585.localdomain sudo[314336]: pam_unix(sudo:session): session closed for user root
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.218 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:35 np0005532585.localdomain sudo[314437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 09:59:35 np0005532585.localdomain sudo[314437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 09:59:35 np0005532585.localdomain ceph-mon[300199]: pgmap v147: 177 pgs: 177 active+clean; 307 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 7.8 MiB/s wr, 198 op/s
Nov 23 09:59:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 09:59:35 np0005532585.localdomain sudo[314437]: pam_unix(sudo:session): session closed for user root
Nov 23 09:59:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 09:59:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:59:35 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 09:59:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e106 e106: 6 total, 6 up, 6 in
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.788 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting instance files /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.789 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deletion of /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del complete
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.870 281956 INFO nova.scheduler.client.report [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Deleted allocations for instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.914 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.915 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 09:59:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:35.967 281956 DEBUG oslo_concurrency.processutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 09:59:36 np0005532585.localdomain podman[314488]: 
Nov 23 09:59:36 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:36Z|00159|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:36 np0005532585.localdomain podman[314488]: 2025-11-23 09:59:36.125062054 +0000 UTC m=+0.122282238 container create 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6.scope.
Nov 23 09:59:36 np0005532585.localdomain systemd[1]: tmp-crun.t5m8YR.mount: Deactivated successfully.
Nov 23 09:59:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:59:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd8876e7ddd7dafa11908fbfb871118f8df1d821738197ffd26d3d2606b14a04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:59:36 np0005532585.localdomain podman[314488]: 2025-11-23 09:59:36.095654598 +0000 UTC m=+0.092874822 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:59:36 np0005532585.localdomain podman[314488]: 2025-11-23 09:59:36.202643374 +0000 UTC m=+0.199863598 container init 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 09:59:36 np0005532585.localdomain podman[314488]: 2025-11-23 09:59:36.212127566 +0000 UTC m=+0.209347780 container start 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 09:59:36 np0005532585.localdomain dnsmasq[314524]: started, version 2.85 cachesize 150
Nov 23 09:59:36 np0005532585.localdomain dnsmasq[314524]: DNS service limited to local subnets
Nov 23 09:59:36 np0005532585.localdomain dnsmasq[314524]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:59:36 np0005532585.localdomain dnsmasq[314524]: warning: no upstream servers configured
Nov 23 09:59:36 np0005532585.localdomain dnsmasq-dhcp[314524]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 09:59:36 np0005532585.localdomain dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 0 addresses
Nov 23 09:59:36 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host
Nov 23 09:59:36 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.304 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:36.350 263258 INFO neutron.agent.dhcp.agent [None req-5ff18459-bf3c-415c-a5fe-548681ab746d - - - - - -] DHCP configuration for ports {'79df25e1-9d21-4fd1-bbcc-556f66257d24'} is completed
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 09:59:36 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1805846685' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.517 281956 DEBUG oslo_concurrency.processutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.524 281956 DEBUG nova.compute.provider_tree [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.540 281956 DEBUG nova.scheduler.client.report [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.559 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:36.597 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 09:59:36 np0005532585.localdomain ceph-mon[300199]: osdmap e106: 6 total, 6 up, 6 in
Nov 23 09:59:36 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1805846685' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e107 e107: 6 total, 6 up, 6 in
Nov 23 09:59:37 np0005532585.localdomain ceph-mon[300199]: pgmap v149: 177 pgs: 177 active+clean; 307 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 7.8 MiB/s wr, 198 op/s
Nov 23 09:59:37 np0005532585.localdomain ceph-mon[300199]: osdmap e107: 6 total, 6 up, 6 in
Nov 23 09:59:38 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:38.238 2 INFO neutron.agent.securitygroups_rpc [None req-0fc01b46-65ca-4975-99ff-e6e4d0974af8 32512604c08f4fa48e6e985a3f6cd6d1 79509bc833494f3598e01347dc55dea9 - - default default] Security group member updated ['cfab2162-6afe-48a0-9f05-cee7f160244c']
Nov 23 09:59:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:38.268 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:37Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c603d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9440a90>], id=83c25346-d3af-4486-a725-9089387b3a54, ip_allocation=immediate, mac_address=fa:16:3e:f6:34:69, name=tempest-RoutersTest-1314517952, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:32Z, description=, dns_domain=, id=43d7b2ef-542d-499e-b6e4-ba4caed0547d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-484078650, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19673, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1021, status=ACTIVE, subnets=['d042e155-fe30-481b-9077-8053dec275b7'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:34Z, vlan_transparent=None, network_id=43d7b2ef-542d-499e-b6e4-ba4caed0547d, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cfab2162-6afe-48a0-9f05-cee7f160244c'], standard_attr_id=1045, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:37Z on network 43d7b2ef-542d-499e-b6e4-ba4caed0547d
Nov 23 09:59:38 np0005532585.localdomain dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 1 addresses
Nov 23 09:59:38 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host
Nov 23 09:59:38 np0005532585.localdomain podman[314544]: 2025-11-23 09:59:38.54269937 +0000 UTC m=+0.058040259 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:59:38 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts
Nov 23 09:59:38 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3217501975' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 09:59:39 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:39.110 263258 INFO neutron.agent.dhcp.agent [None req-6b9e32d1-9fec-4ff5-9e3b-4f5e27f68e4f - - - - - -] DHCP configuration for ports {'83c25346-d3af-4486-a725-9089387b3a54'} is completed
Nov 23 09:59:39 np0005532585.localdomain ceph-mon[300199]: pgmap v151: 177 pgs: 177 active+clean; 226 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 7.8 MiB/s wr, 295 op/s
Nov 23 09:59:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e108 e108: 6 total, 6 up, 6 in
Nov 23 09:59:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:40.847 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:41.307 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:41.402 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:41.505 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:37Z, description=, device_id=ec85bb4c-cbc6-4764-81ae-e806625613bc, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8baefd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8baef40>], id=83c25346-d3af-4486-a725-9089387b3a54, ip_allocation=immediate, mac_address=fa:16:3e:f6:34:69, name=tempest-RoutersTest-1314517952, network_id=43d7b2ef-542d-499e-b6e4-ba4caed0547d, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['cfab2162-6afe-48a0-9f05-cee7f160244c'], standard_attr_id=1045, status=ACTIVE, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:40Z on network 43d7b2ef-542d-499e-b6e4-ba4caed0547d
Nov 23 09:59:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e109 e109: 6 total, 6 up, 6 in
Nov 23 09:59:41 np0005532585.localdomain ceph-mon[300199]: osdmap e108: 6 total, 6 up, 6 in
Nov 23 09:59:41 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/561663991' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:59:41 np0005532585.localdomain ceph-mon[300199]: pgmap v153: 177 pgs: 177 active+clean; 226 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 4.7 KiB/s wr, 96 op/s
Nov 23 09:59:41 np0005532585.localdomain dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 1 addresses
Nov 23 09:59:41 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host
Nov 23 09:59:41 np0005532585.localdomain podman[314579]: 2025-11-23 09:59:41.708628478 +0000 UTC m=+0.056909074 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 09:59:41 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts
Nov 23 09:59:41 np0005532585.localdomain podman[240668]: time="2025-11-23T09:59:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 09:59:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:59:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 09:59:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:09:59:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19244 "" "Go-http-client/1.1"
Nov 23 09:59:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:42.044 263258 INFO neutron.agent.dhcp.agent [None req-40f0cdb7-bf0d-4e20-9098-f3703e5f37e8 - - - - - -] DHCP configuration for ports {'83c25346-d3af-4486-a725-9089387b3a54'} is completed
Nov 23 09:59:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e110 e110: 6 total, 6 up, 6 in
Nov 23 09:59:42 np0005532585.localdomain ceph-mon[300199]: osdmap e109: 6 total, 6 up, 6 in
Nov 23 09:59:42 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/694137683' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 09:59:42 np0005532585.localdomain ceph-mon[300199]: osdmap e110: 6 total, 6 up, 6 in
Nov 23 09:59:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 09:59:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 09:59:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 09:59:43 np0005532585.localdomain systemd[1]: tmp-crun.HyIUQy.mount: Deactivated successfully.
Nov 23 09:59:43 np0005532585.localdomain podman[314603]: 2025-11-23 09:59:43.034032397 +0000 UTC m=+0.080020495 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 09:59:43 np0005532585.localdomain podman[314602]: 2025-11-23 09:59:43.132927414 +0000 UTC m=+0.183069431 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm)
Nov 23 09:59:43 np0005532585.localdomain podman[314603]: 2025-11-23 09:59:43.153465577 +0000 UTC m=+0.199453625 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 09:59:43 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 09:59:43 np0005532585.localdomain podman[314602]: 2025-11-23 09:59:43.168260152 +0000 UTC m=+0.218402079 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 23 09:59:43 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 09:59:43 np0005532585.localdomain podman[314601]: 2025-11-23 09:59:43.240174468 +0000 UTC m=+0.292341947 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 23 09:59:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:43 np0005532585.localdomain podman[314601]: 2025-11-23 09:59:43.276424035 +0000 UTC m=+0.328591554 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 09:59:43 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 09:59:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:43.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:43 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:43.338 2 INFO neutron.agent.securitygroups_rpc [None req-6713e12c-736e-4c48-95b8-a64782f68ffc 32512604c08f4fa48e6e985a3f6cd6d1 79509bc833494f3598e01347dc55dea9 - - default default] Security group member updated ['cfab2162-6afe-48a0-9f05-cee7f160244c']
Nov 23 09:59:43 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:43.518 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:43 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:43.519 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 09:59:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:43.519 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:43 np0005532585.localdomain dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 0 addresses
Nov 23 09:59:43 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host
Nov 23 09:59:43 np0005532585.localdomain dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts
Nov 23 09:59:43 np0005532585.localdomain podman[314679]: 2025-11-23 09:59:43.635862007 +0000 UTC m=+0.050560318 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 09:59:43 np0005532585.localdomain ceph-mon[300199]: pgmap v156: 177 pgs: 177 active+clean; 226 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 4.8 KiB/s wr, 98 op/s
Nov 23 09:59:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e111 e111: 6 total, 6 up, 6 in
Nov 23 09:59:43 np0005532585.localdomain kernel: device tap53af77be-60 left promiscuous mode
Nov 23 09:59:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:43.814 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:43 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:43Z|00160|binding|INFO|Releasing lport 53af77be-60e9-4895-8dfb-1340d4308442 from this chassis (sb_readonly=0)
Nov 23 09:59:43 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:43Z|00161|binding|INFO|Setting lport 53af77be-60e9-4895-8dfb-1340d4308442 down in Southbound
Nov 23 09:59:43 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:43.826 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86837c26-d38c-4c74-814e-bcb8777abcca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=53af77be-60e9-4895-8dfb-1340d4308442) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:43 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:43.828 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 53af77be-60e9-4895-8dfb-1340d4308442 in datapath 43d7b2ef-542d-499e-b6e4-ba4caed0547d unbound from our chassis
Nov 23 09:59:43 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:43.830 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43d7b2ef-542d-499e-b6e4-ba4caed0547d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 09:59:43 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:43.832 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[90b19db1-4349-4782-b683-3241ab06f033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:43.836 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:43.838 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:44Z|00162|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:44.429 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:44 np0005532585.localdomain ceph-mon[300199]: osdmap e111: 6 total, 6 up, 6 in
Nov 23 09:59:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:44.889 281956 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763891969.8882997, 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 09:59:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:44.890 281956 INFO nova.compute.manager [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Stopped (Lifecycle Event)
Nov 23 09:59:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:44.907 281956 DEBUG nova.compute.manager [None req-afe46cb7-374a-4fa0-a5a9-fefa134b5545 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 09:59:45 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:45Z|00163|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:45.564 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:45 np0005532585.localdomain dnsmasq[314524]: exiting on receipt of SIGTERM
Nov 23 09:59:45 np0005532585.localdomain podman[314717]: 2025-11-23 09:59:45.650533459 +0000 UTC m=+0.067222811 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:45 np0005532585.localdomain systemd[1]: libpod-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6.scope: Deactivated successfully.
Nov 23 09:59:45 np0005532585.localdomain podman[314731]: 2025-11-23 09:59:45.730956037 +0000 UTC m=+0.062653260 container died 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:59:45 np0005532585.localdomain ceph-mon[300199]: pgmap v158: 177 pgs: 177 active+clean; 226 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 12 MiB/s rd, 10 MiB/s wr, 422 op/s
Nov 23 09:59:45 np0005532585.localdomain podman[314731]: 2025-11-23 09:59:45.768859974 +0000 UTC m=+0.100557127 container cleanup 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 09:59:45 np0005532585.localdomain systemd[1]: libpod-conmon-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6.scope: Deactivated successfully.
Nov 23 09:59:45 np0005532585.localdomain podman[314732]: 2025-11-23 09:59:45.807687481 +0000 UTC m=+0.131324817 container remove 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 09:59:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:46.038 263258 INFO neutron.agent.dhcp.agent [None req-b49c767d-f68a-4ced-bc43-09b6b17f7157 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:46.040 263258 INFO neutron.agent.dhcp.agent [None req-b49c767d-f68a-4ced-bc43-09b6b17f7157 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:46.310 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:46.404 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:46.521 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:46.633 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fd8876e7ddd7dafa11908fbfb871118f8df1d821738197ffd26d3d2606b14a04-merged.mount: Deactivated successfully.
Nov 23 09:59:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:46 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d43d7b2ef\x2d542d\x2d499e\x2db6e4\x2dba4caed0547d.mount: Deactivated successfully.
Nov 23 09:59:47 np0005532585.localdomain ceph-mon[300199]: pgmap v159: 177 pgs: 177 active+clean; 226 MiB data, 983 MiB used, 41 GiB / 42 GiB avail; 9.2 MiB/s rd, 7.8 MiB/s wr, 329 op/s
Nov 23 09:59:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 e112: 6 total, 6 up, 6 in
Nov 23 09:59:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:48 np0005532585.localdomain ceph-mon[300199]: osdmap e112: 6 total, 6 up, 6 in
Nov 23 09:59:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1107415024' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 09:59:48 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:48.868 263258 INFO neutron.agent.linux.ip_lib [None req-088f6986-4544-4c9b-a0ae-3cd17c55f952 - - - - - -] Device tap2f7157df-cb cannot be used as it has no MAC address
Nov 23 09:59:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:48.889 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:48 np0005532585.localdomain kernel: device tap2f7157df-cb entered promiscuous mode
Nov 23 09:59:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:48Z|00164|binding|INFO|Claiming lport 2f7157df-cb90-4428-a3df-6f057985d5d6 for this chassis.
Nov 23 09:59:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891988.8975] manager: (tap2f7157df-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Nov 23 09:59:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:48Z|00165|binding|INFO|2f7157df-cb90-4428-a3df-6f057985d5d6: Claiming unknown
Nov 23 09:59:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:48.897 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:48 np0005532585.localdomain systemd-udevd[314770]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:59:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:48.911 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=2f7157df-cb90-4428-a3df-6f057985d5d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:48.913 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2f7157df-cb90-4428-a3df-6f057985d5d6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 09:59:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:48.916 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:59:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:48.917 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2662e08b-6888-4279-bf50-157caa3294a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:48Z|00166|binding|INFO|Setting lport 2f7157df-cb90-4428-a3df-6f057985d5d6 ovn-installed in OVS
Nov 23 09:59:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:48Z|00167|binding|INFO|Setting lport 2f7157df-cb90-4428-a3df-6f057985d5d6 up in Southbound
Nov 23 09:59:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:48.936 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap2f7157df-cb: No such device
Nov 23 09:59:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:48.976 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:49.003 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:49 np0005532585.localdomain ceph-mon[300199]: pgmap v161: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 10 MiB/s rd, 6.8 MiB/s wr, 409 op/s
Nov 23 09:59:49 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:49.578 2 INFO neutron.agent.securitygroups_rpc [None req-061cdcce-87b3-4fab-8b64-8613c3b5bd77 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 09:59:49 np0005532585.localdomain podman[314841]: 
Nov 23 09:59:49 np0005532585.localdomain podman[314841]: 2025-11-23 09:59:49.818170896 +0000 UTC m=+0.094208734 container create 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 09:59:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1.scope.
Nov 23 09:59:49 np0005532585.localdomain podman[314841]: 2025-11-23 09:59:49.77252743 +0000 UTC m=+0.048565288 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:59:49 np0005532585.localdomain systemd[1]: tmp-crun.dThqOZ.mount: Deactivated successfully.
Nov 23 09:59:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:59:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef3addd800f914b793d519bed727ded0e4b934dd69c77876f8acbef5e0fac26a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:59:49 np0005532585.localdomain podman[314841]: 2025-11-23 09:59:49.89362004 +0000 UTC m=+0.169657878 container init 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:49 np0005532585.localdomain podman[314841]: 2025-11-23 09:59:49.903790803 +0000 UTC m=+0.179828641 container start 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 09:59:49 np0005532585.localdomain dnsmasq[314859]: started, version 2.85 cachesize 150
Nov 23 09:59:49 np0005532585.localdomain dnsmasq[314859]: DNS service limited to local subnets
Nov 23 09:59:49 np0005532585.localdomain dnsmasq[314859]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:59:49 np0005532585.localdomain dnsmasq[314859]: warning: no upstream servers configured
Nov 23 09:59:49 np0005532585.localdomain dnsmasq-dhcp[314859]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 09:59:49 np0005532585.localdomain dnsmasq[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 09:59:49 np0005532585.localdomain dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 09:59:49 np0005532585.localdomain dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 09:59:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:49.958 263258 INFO neutron.agent.dhcp.agent [None req-088f6986-4544-4c9b-a0ae-3cd17c55f952 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:48Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b88850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b88c70>], id=4172c62e-88e1-41e0-99ba-63eaec4dfdc4, ip_allocation=immediate, mac_address=fa:16:3e:be:35:69, name=tempest-NetworksTestDHCPv6-1192500850, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['bac168bc-6669-4cb8-b775-3a9746bd36ef'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:47Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1114, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:49Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 09:59:50 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:50.058 263258 INFO neutron.agent.dhcp.agent [None req-ec6d5fd7-efb9-4eac-8775-4c7e8371b66c - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 09:59:50 np0005532585.localdomain dnsmasq[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 09:59:50 np0005532585.localdomain dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 09:59:50 np0005532585.localdomain podman[314878]: 2025-11-23 09:59:50.156251021 +0000 UTC m=+0.055097799 container kill 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 09:59:50 np0005532585.localdomain dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 09:59:50 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:50.411 263258 INFO neutron.agent.dhcp.agent [None req-ebc4ae50-499d-47a7-b809-f0574156c4cb - - - - - -] DHCP configuration for ports {'4172c62e-88e1-41e0-99ba-63eaec4dfdc4'} is completed
Nov 23 09:59:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:50.520 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 09:59:50 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:50.742 2 INFO neutron.agent.securitygroups_rpc [None req-5782673d-3ad2-4525-b0b7-33b67eb33956 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 09:59:51 np0005532585.localdomain dnsmasq[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 09:59:51 np0005532585.localdomain dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 09:59:51 np0005532585.localdomain podman[314914]: 2025-11-23 09:59:51.306916956 +0000 UTC m=+0.059124212 container kill 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 09:59:51 np0005532585.localdomain dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 09:59:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:51.313 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:51.406 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:51 np0005532585.localdomain ceph-mon[300199]: pgmap v162: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 8.8 MiB/s rd, 5.8 MiB/s wr, 351 op/s
Nov 23 09:59:52 np0005532585.localdomain dnsmasq[314859]: exiting on receipt of SIGTERM
Nov 23 09:59:52 np0005532585.localdomain systemd[1]: libpod-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1.scope: Deactivated successfully.
Nov 23 09:59:52 np0005532585.localdomain podman[314953]: 2025-11-23 09:59:52.138396109 +0000 UTC m=+0.058989258 container kill 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 09:59:52 np0005532585.localdomain podman[314967]: 2025-11-23 09:59:52.208999764 +0000 UTC m=+0.055205201 container died 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:59:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:52 np0005532585.localdomain podman[314967]: 2025-11-23 09:59:52.243716704 +0000 UTC m=+0.089922091 container cleanup 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:52 np0005532585.localdomain systemd[1]: libpod-conmon-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1.scope: Deactivated successfully.
Nov 23 09:59:52 np0005532585.localdomain podman[314969]: 2025-11-23 09:59:52.294025384 +0000 UTC m=+0.133762522 container remove 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:59:52 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:52Z|00168|binding|INFO|Releasing lport 2f7157df-cb90-4428-a3df-6f057985d5d6 from this chassis (sb_readonly=0)
Nov 23 09:59:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:52.348 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:52 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:52Z|00169|binding|INFO|Setting lport 2f7157df-cb90-4428-a3df-6f057985d5d6 down in Southbound
Nov 23 09:59:52 np0005532585.localdomain kernel: device tap2f7157df-cb left promiscuous mode
Nov 23 09:59:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:52.357 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=2f7157df-cb90-4428-a3df-6f057985d5d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:52.359 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2f7157df-cb90-4428-a3df-6f057985d5d6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 09:59:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:52.360 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:59:52 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:52.361 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[643148b5-c0dc-4802-a22e-9a9f1fb61ea2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:52.372 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:52 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:52.641 2 INFO neutron.agent.securitygroups_rpc [None req-633bd2af-73f5-42be-a8e1-16475aa1b324 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 09:59:53 np0005532585.localdomain sshd[314999]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:59:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ef3addd800f914b793d519bed727ded0e4b934dd69c77876f8acbef5e0fac26a-merged.mount: Deactivated successfully.
Nov 23 09:59:53 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 09:59:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 09:59:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 09:59:53 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:53.178 2 INFO neutron.agent.securitygroups_rpc [None req-33fb7598-4629-4242-b064-9d05bdc1e723 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 09:59:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:53 np0005532585.localdomain podman[315002]: 2025-11-23 09:59:53.256694688 +0000 UTC m=+0.087652571 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 09:59:53 np0005532585.localdomain podman[315002]: 2025-11-23 09:59:53.275235669 +0000 UTC m=+0.106193552 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 09:59:53 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 09:59:53 np0005532585.localdomain podman[315001]: 2025-11-23 09:59:53.373456555 +0000 UTC m=+0.206956966 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 09:59:53 np0005532585.localdomain sshd[314999]: Invalid user builder from 107.172.15.139 port 60128
Nov 23 09:59:53 np0005532585.localdomain podman[315001]: 2025-11-23 09:59:53.417363508 +0000 UTC m=+0.250863939 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 23 09:59:53 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:53.417 263258 INFO neutron.agent.linux.ip_lib [None req-cabf86b4-993a-41b4-af7e-6ccd00ca4a4e - - - - - -] Device tapddb7852b-7c cannot be used as it has no MAC address
Nov 23 09:59:53 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 09:59:53 np0005532585.localdomain sshd[314999]: Received disconnect from 107.172.15.139 port 60128:11: Bye Bye [preauth]
Nov 23 09:59:53 np0005532585.localdomain sshd[314999]: Disconnected from invalid user builder 107.172.15.139 port 60128 [preauth]
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.476 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:53 np0005532585.localdomain kernel: device tapddb7852b-7c entered promiscuous mode
Nov 23 09:59:53 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891993.4838] manager: (tapddb7852b-7c): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.484 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:53Z|00170|binding|INFO|Claiming lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a for this chassis.
Nov 23 09:59:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:53Z|00171|binding|INFO|ddb7852b-7c36-4aa5-8295-8df19c4d8b4a: Claiming unknown
Nov 23 09:59:53 np0005532585.localdomain systemd-udevd[315052]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:59:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:53Z|00172|binding|INFO|Setting lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a ovn-installed in OVS
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:53Z|00173|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 09:59:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:53.502 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ddb7852b-7c36-4aa5-8295-8df19c4d8b4a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:53.505 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ddb7852b-7c36-4aa5-8295-8df19c4d8b4a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 09:59:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:53.507 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:59:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:53.508 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3de220b8-09c9-4932-a40a-a53f25ff5ace]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.519 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:53 np0005532585.localdomain ceph-mon[300199]: pgmap v163: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 7.3 MiB/s rd, 4.9 MiB/s wr, 291 op/s
Nov 23 09:59:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:53Z|00174|binding|INFO|Setting lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a up in Southbound
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.545 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.560 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:53 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapddb7852b-7c: No such device
Nov 23 09:59:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:53.590 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:54 np0005532585.localdomain podman[315124]: 
Nov 23 09:59:54 np0005532585.localdomain podman[315124]: 2025-11-23 09:59:54.414278439 +0000 UTC m=+0.089922861 container create 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 09:59:54 np0005532585.localdomain systemd[1]: Started libpod-conmon-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533.scope.
Nov 23 09:59:54 np0005532585.localdomain podman[315124]: 2025-11-23 09:59:54.37183137 +0000 UTC m=+0.047475812 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 09:59:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 09:59:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4637a3d824d286df4a1b1fcf765afe447a8e7356de1248847cdf31f132b25d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 09:59:54 np0005532585.localdomain podman[315124]: 2025-11-23 09:59:54.489721603 +0000 UTC m=+0.165366025 container init 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 09:59:54 np0005532585.localdomain podman[315124]: 2025-11-23 09:59:54.500302208 +0000 UTC m=+0.175946620 container start 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:59:54 np0005532585.localdomain dnsmasq[315143]: started, version 2.85 cachesize 150
Nov 23 09:59:54 np0005532585.localdomain dnsmasq[315143]: DNS service limited to local subnets
Nov 23 09:59:54 np0005532585.localdomain dnsmasq[315143]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 09:59:54 np0005532585.localdomain dnsmasq[315143]: warning: no upstream servers configured
Nov 23 09:59:54 np0005532585.localdomain dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 09:59:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:59:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:59:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 09:59:54 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:54.737 263258 INFO neutron.agent.dhcp.agent [None req-b5c71915-d143-4e32-b3d1-cfe8e7c6f766 - - - - - -] DHCP configuration for ports {'f4a3c9d5-826f-4752-9fc8-6210e097bf26', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 09:59:54 np0005532585.localdomain sshd[315169]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 09:59:54 np0005532585.localdomain dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 09:59:54 np0005532585.localdomain podman[315159]: 2025-11-23 09:59:54.871875565 +0000 UTC m=+0.063275330 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 09:59:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:55.008 263258 INFO neutron.agent.dhcp.agent [None req-fc438f45-a2d7-4875-80ac-5c05f6b61b09 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:52Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c94401c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9440f10>], id=f4a3c9d5-826f-4752-9fc8-6210e097bf26, ip_allocation=immediate, mac_address=fa:16:3e:00:d6:91, name=tempest-NetworksTestDHCPv6-954743884, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['2ac68f23-53c8-4938-b68e-b9f69a6a44c5'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:52Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1142, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:52Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 09:59:55 np0005532585.localdomain dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 09:59:55 np0005532585.localdomain podman[315199]: 2025-11-23 09:59:55.174768716 +0000 UTC m=+0.058615747 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 09:59:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:55.193 263258 INFO neutron.agent.dhcp.agent [None req-3a18f5df-3aea-4672-b539-9e53d1b559a1 - - - - - -] DHCP configuration for ports {'ddb7852b-7c36-4aa5-8295-8df19c4d8b4a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 09:59:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:55.418 263258 INFO neutron.agent.dhcp.agent [None req-c425ca3f-6b9b-418b-9451-0fa7302be87d - - - - - -] DHCP configuration for ports {'f4a3c9d5-826f-4752-9fc8-6210e097bf26'} is completed
Nov 23 09:59:55 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:55.432 2 INFO neutron.agent.securitygroups_rpc [None req-4a1b8c07-dec2-4711-92de-c07233183ccc 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 09:59:55 np0005532585.localdomain ceph-mon[300199]: pgmap v164: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 1.6 MiB/s rd, 1.4 KiB/s wr, 83 op/s
Nov 23 09:59:55 np0005532585.localdomain dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 09:59:55 np0005532585.localdomain podman[315237]: 2025-11-23 09:59:55.501727858 +0000 UTC m=+0.061657781 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 09:59:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:56.166 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:54Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4d040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4db50>], id=38a31dac-5d6f-422a-b220-42a6c32d9bbc, ip_allocation=immediate, mac_address=fa:16:3e:9d:c2:b1, name=tempest-NetworksTestDHCPv6-934745237, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['5d113a57-a631-4e71-848c-68cfe31da68d'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:54Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1152, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:54Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 09:59:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:56.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:56 np0005532585.localdomain dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 09:59:56 np0005532585.localdomain podman[315276]: 2025-11-23 09:59:56.340834707 +0000 UTC m=+0.062282400 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 09:59:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:56.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:56.580 263258 INFO neutron.agent.dhcp.agent [None req-8610995d-fcc2-4559-b6ae-ad54e6c521ee - - - - - -] DHCP configuration for ports {'38a31dac-5d6f-422a-b220-42a6c32d9bbc'} is completed
Nov 23 09:59:56 np0005532585.localdomain sshd[315169]: Invalid user hacluster from 185.156.73.233 port 19304
Nov 23 09:59:56 np0005532585.localdomain sshd[315169]: Connection closed by invalid user hacluster 185.156.73.233 port 19304 [preauth]
Nov 23 09:59:57 np0005532585.localdomain ceph-mon[300199]: pgmap v165: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 1.6 MiB/s rd, 1.4 KiB/s wr, 83 op/s
Nov 23 09:59:57 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 09:59:57.988 2 INFO neutron.agent.securitygroups_rpc [None req-d239cbef-5e7b-4e18-8195-2f02667a16df 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 09:59:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:58.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:58 np0005532585.localdomain dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 09:59:58 np0005532585.localdomain podman[315314]: 2025-11-23 09:59:58.189254929 +0000 UTC m=+0.064041234 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 09:59:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 09:59:58 np0005532585.localdomain dnsmasq[315143]: exiting on receipt of SIGTERM
Nov 23 09:59:58 np0005532585.localdomain systemd[1]: tmp-crun.NJhgVy.mount: Deactivated successfully.
Nov 23 09:59:58 np0005532585.localdomain systemd[1]: libpod-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533.scope: Deactivated successfully.
Nov 23 09:59:58 np0005532585.localdomain podman[315353]: 2025-11-23 09:59:58.836323642 +0000 UTC m=+0.069451730 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:58 np0005532585.localdomain podman[315365]: 2025-11-23 09:59:58.913780429 +0000 UTC m=+0.065203201 container died 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 09:59:58 np0005532585.localdomain podman[315365]: 2025-11-23 09:59:58.945815515 +0000 UTC m=+0.097238267 container cleanup 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 09:59:58 np0005532585.localdomain systemd[1]: libpod-conmon-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533.scope: Deactivated successfully.
Nov 23 09:59:59 np0005532585.localdomain podman[315372]: 2025-11-23 09:59:59.000385437 +0000 UTC m=+0.138069655 container remove 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 09:59:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:59Z|00175|binding|INFO|Releasing lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a from this chassis (sb_readonly=0)
Nov 23 09:59:59 np0005532585.localdomain kernel: device tapddb7852b-7c left promiscuous mode
Nov 23 09:59:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:59Z|00176|binding|INFO|Setting lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a down in Southbound
Nov 23 09:59:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:59.057 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.074 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ddb7852b-7c36-4aa5-8295-8df19c4d8b4a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.075 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ddb7852b-7c36-4aa5-8295-8df19c4d8b4a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.076 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.077 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[82370e26-4b2c-4dd4-9376-e6340f32cfdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:59.082 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c4637a3d824d286df4a1b1fcf765afe447a8e7356de1248847cdf31f132b25d9-merged.mount: Deactivated successfully.
Nov 23 09:59:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533-userdata-shm.mount: Deactivated successfully.
Nov 23 09:59:59 np0005532585.localdomain ceph-mon[300199]: pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 1.2 KiB/s wr, 70 op/s
Nov 23 09:59:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:59.496 263258 INFO neutron.agent.dhcp.agent [None req-5d1e6f94-57c1-4b29-97f5-d5053daeb570 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 09:59:59 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 09:59:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 09:59:59.902 263258 INFO neutron.agent.linux.ip_lib [None req-168fae75-3402-4a90-b826-664c5b0144bc - - - - - -] Device tape328d85c-9f cannot be used as it has no MAC address
Nov 23 09:59:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:59.925 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:59 np0005532585.localdomain kernel: device tape328d85c-9f entered promiscuous mode
Nov 23 09:59:59 np0005532585.localdomain NetworkManager[5975]: <info>  [1763891999.9337] manager: (tape328d85c-9f): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Nov 23 09:59:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:59Z|00177|binding|INFO|Claiming lport e328d85c-9f4d-4a9c-9609-f789abfbba67 for this chassis.
Nov 23 09:59:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:59Z|00178|binding|INFO|e328d85c-9f4d-4a9c-9609-f789abfbba67: Claiming unknown
Nov 23 09:59:59 np0005532585.localdomain systemd-udevd[315404]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 09:59:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:59.942 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.949 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd38528c3feb64b31add54cee7508cb83', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02c6b597-012a-4c61-8f3c-0e9a58e1964d, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=e328d85c-9f4d-4a9c-9609-f789abfbba67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.951 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e328d85c-9f4d-4a9c-9609-f789abfbba67 in datapath 7292f404-64ea-4ef3-b81e-f698709e4eec bound to our chassis
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.953 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7292f404-64ea-4ef3-b81e-f698709e4eec or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 09:59:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 09:59:59.954 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cb35318d-b8a3-4a87-992c-74c6a610421e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 09:59:59 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 09:59:59 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 09:59:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:59Z|00179|binding|INFO|Setting lport e328d85c-9f4d-4a9c-9609-f789abfbba67 ovn-installed in OVS
Nov 23 09:59:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T09:59:59Z|00180|binding|INFO|Setting lport e328d85c-9f4d-4a9c-9609-f789abfbba67 up in Southbound
Nov 23 09:59:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 09:59:59.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 09:59:59 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 09:59:59 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 09:59:59 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   09:59:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 09:59:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 09:59:59 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 10:00:00 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 10:00:00 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape328d85c-9f: No such device
Nov 23 10:00:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:00.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:00.048 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:00 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:00.277 2 INFO neutron.agent.securitygroups_rpc [None req-07e0e453-973e-4bb8-a740-58232b03adf2 e78ebdfe612745638abad47217c77d70 a40d996843764f32a4281f01703f5aee - - default default] Security group member updated ['e81e3952-d0ad-411e-a904-c021d2ed129c']
Nov 23 10:00:00 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 10:00:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2092452904' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:00:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2092452904' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:00:00 np0005532585.localdomain podman[315475]: 
Nov 23 10:00:00 np0005532585.localdomain podman[315475]: 2025-11-23 10:00:00.852064279 +0000 UTC m=+0.084086292 container create 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:00 np0005532585.localdomain systemd[1]: Started libpod-conmon-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537.scope.
Nov 23 10:00:00 np0005532585.localdomain podman[315475]: 2025-11-23 10:00:00.810135277 +0000 UTC m=+0.042157290 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:00 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc1d3c6a45bfeaa6a92f58a5f600b24b7eb89e79f1a1294714d9a566d06522ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:00 np0005532585.localdomain podman[315475]: 2025-11-23 10:00:00.942970919 +0000 UTC m=+0.174992902 container init 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:00:00 np0005532585.localdomain podman[315475]: 2025-11-23 10:00:00.951930534 +0000 UTC m=+0.183952507 container start 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:00:00 np0005532585.localdomain dnsmasq[315494]: started, version 2.85 cachesize 150
Nov 23 10:00:00 np0005532585.localdomain dnsmasq[315494]: DNS service limited to local subnets
Nov 23 10:00:00 np0005532585.localdomain dnsmasq[315494]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:00 np0005532585.localdomain dnsmasq[315494]: warning: no upstream servers configured
Nov 23 10:00:00 np0005532585.localdomain dnsmasq-dhcp[315494]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:00:00 np0005532585.localdomain dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 0 addresses
Nov 23 10:00:00 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host
Nov 23 10:00:00 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts
Nov 23 10:00:01 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:01.072 263258 INFO neutron.agent.linux.ip_lib [None req-5d7b1cd6-bebe-493a-a524-d593c0794e40 - - - - - -] Device tap52f1f0cb-3e cannot be used as it has no MAC address
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.096 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain kernel: device tap52f1f0cb-3e entered promiscuous mode
Nov 23 10:00:01 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892001.1032] manager: (tap52f1f0cb-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Nov 23 10:00:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:01Z|00181|binding|INFO|Claiming lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c for this chassis.
Nov 23 10:00:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:01Z|00182|binding|INFO|52f1f0cb-3ec5-4ffc-9a9d-29176c63170c: Claiming unknown
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.103 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:01Z|00183|binding|INFO|Setting lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c ovn-installed in OVS
Nov 23 10:00:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:01Z|00184|binding|INFO|Setting lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c up in Southbound
Nov 23 10:00:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:01.119 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=52f1f0cb-3ec5-4ffc-9a9d-29176c63170c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.121 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:01.123 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:01.125 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:01 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:01.125 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e4953a1c-3986-478d-b6b3-ee287c275edd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.141 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:01.173 263258 INFO neutron.agent.dhcp.agent [None req-a7bd10e3-34c5-4365-8cc0-2551d9d95195 - - - - - -] DHCP configuration for ports {'8be851d6-7884-4472-9a21-8b074f4b4419'} is completed
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.320 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:01.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:01 np0005532585.localdomain ceph-mon[300199]: pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:01 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:01.668 2 INFO neutron.agent.securitygroups_rpc [None req-68228748-d1d1-4e93-958e-faf2dd4d659b 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:01 np0005532585.localdomain podman[315558]: 2025-11-23 10:00:01.998723792 +0000 UTC m=+0.089783197 container create 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:00:02 np0005532585.localdomain systemd[1]: Started libpod-conmon-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6.scope.
Nov 23 10:00:02 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e345bc7b1352ed7e306c74f9fd2b5a68f452619862709da970e829614cf1996d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:02 np0005532585.localdomain podman[315558]: 2025-11-23 10:00:01.953522369 +0000 UTC m=+0.044581874 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:02 np0005532585.localdomain podman[315558]: 2025-11-23 10:00:02.059254947 +0000 UTC m=+0.150314352 container init 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:00:02 np0005532585.localdomain podman[315558]: 2025-11-23 10:00:02.0674649 +0000 UTC m=+0.158524325 container start 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: started, version 2.85 cachesize 150
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: DNS service limited to local subnets
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: warning: no upstream servers configured
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:02.107 2 INFO neutron.agent.securitygroups_rpc [None req-ba51cecc-c6ac-46c9-99a7-bae53245da97 e78ebdfe612745638abad47217c77d70 a40d996843764f32a4281f01703f5aee - - default default] Security group member updated ['e81e3952-d0ad-411e-a904-c021d2ed129c']
Nov 23 10:00:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.123 263258 INFO neutron.agent.dhcp.agent [None req-5d7b1cd6-bebe-493a-a524-d593c0794e40 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:00Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9403fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8be6d60>], id=b9f1c825-1100-4db2-a98a-7d9238cb1fc7, ip_allocation=immediate, mac_address=fa:16:3e:46:c3:1d, name=tempest-NetworksTestDHCPv6-99455620, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['2a51fc6e-7d45-4225-8d84-a35a19a90f22'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:59Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1189, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:01Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.144 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.225 263258 INFO neutron.agent.dhcp.agent [None req-13352e43-82c6-489f-bfa7-5e9abe7e047f - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:02 np0005532585.localdomain podman[315593]: 2025-11-23 10:00:02.324964501 +0000 UTC m=+0.064122755 container kill 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.571 263258 INFO neutron.agent.dhcp.agent [None req-397dccf5-f60e-49c7-b245-9a8c97791b23 - - - - - -] DHCP configuration for ports {'b9f1c825-1100-4db2-a98a-7d9238cb1fc7'} is completed
Nov 23 10:00:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:02Z|00185|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:00:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:02.744 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:02.745 2 INFO neutron.agent.securitygroups_rpc [None req-be23f1ad-34a8-40d2-b634-8fb333cbd4a1 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:02 np0005532585.localdomain podman[315615]: 2025-11-23 10:00:02.807652451 +0000 UTC m=+0.106887143 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:00:02 np0005532585.localdomain podman[315615]: 2025-11-23 10:00:02.823375945 +0000 UTC m=+0.122610617 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:00:02 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:00:02 np0005532585.localdomain dnsmasq[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:02 np0005532585.localdomain dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:02 np0005532585.localdomain podman[315649]: 2025-11-23 10:00:02.94752511 +0000 UTC m=+0.064558770 container kill 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:00:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:03 np0005532585.localdomain ceph-mon[300199]: pgmap v168: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:03 np0005532585.localdomain podman[315688]: 2025-11-23 10:00:03.829930573 +0000 UTC m=+0.057929496 container kill 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:03 np0005532585.localdomain dnsmasq[315576]: exiting on receipt of SIGTERM
Nov 23 10:00:03 np0005532585.localdomain systemd[1]: libpod-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6.scope: Deactivated successfully.
Nov 23 10:00:03 np0005532585.localdomain podman[315702]: 2025-11-23 10:00:03.888723824 +0000 UTC m=+0.048135835 container died 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:00:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e345bc7b1352ed7e306c74f9fd2b5a68f452619862709da970e829614cf1996d-merged.mount: Deactivated successfully.
Nov 23 10:00:03 np0005532585.localdomain podman[315702]: 2025-11-23 10:00:03.925510787 +0000 UTC m=+0.084922748 container cleanup 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:00:03 np0005532585.localdomain systemd[1]: libpod-conmon-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6.scope: Deactivated successfully.
Nov 23 10:00:03 np0005532585.localdomain podman[315704]: 2025-11-23 10:00:03.970442791 +0000 UTC m=+0.121947818 container remove 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:00:04 np0005532585.localdomain kernel: device tap52f1f0cb-3e left promiscuous mode
Nov 23 10:00:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:04.017 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:04Z|00186|binding|INFO|Releasing lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c from this chassis (sb_readonly=0)
Nov 23 10:00:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:04Z|00187|binding|INFO|Setting lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c down in Southbound
Nov 23 10:00:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:04.034 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=52f1f0cb-3ec5-4ffc-9a9d-29176c63170c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:04.036 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:00:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:04.036 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:04.038 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:04.039 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[490d65d2-0839-464f-9946-62df02652edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:04.306 263258 INFO neutron.agent.dhcp.agent [None req-1e2c7cbf-448b-437e-99cc-11c814803bad - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:00:04 np0005532585.localdomain podman[315731]: 2025-11-23 10:00:04.407797694 +0000 UTC m=+0.079854232 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:00:04 np0005532585.localdomain podman[315731]: 2025-11-23 10:00:04.422413464 +0000 UTC m=+0.094469992 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:00:04 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:00:04 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:00:05 np0005532585.localdomain ceph-mon[300199]: pgmap v169: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:05 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:05.740 263258 INFO neutron.agent.linux.ip_lib [None req-6377a058-1dee-49ef-a9ac-2ba6ba735af3 - - - - - -] Device tap1361841c-06 cannot be used as it has no MAC address
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:05 np0005532585.localdomain kernel: device tap1361841c-06 entered promiscuous mode
Nov 23 10:00:05 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892005.7720] manager: (tap1361841c-06): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Nov 23 10:00:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:05Z|00188|binding|INFO|Claiming lport 1361841c-06ad-4d69-bc3f-025652255be1 for this chassis.
Nov 23 10:00:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:05Z|00189|binding|INFO|1361841c-06ad-4d69-bc3f-025652255be1: Claiming unknown
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.774 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:05 np0005532585.localdomain systemd-udevd[315762]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:05Z|00190|binding|INFO|Setting lport 1361841c-06ad-4d69-bc3f-025652255be1 ovn-installed in OVS
Nov 23 10:00:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:05Z|00191|binding|INFO|Setting lport 1361841c-06ad-4d69-bc3f-025652255be1 up in Southbound
Nov 23 10:00:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:05.783 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1361841c-06ad-4d69-bc3f-025652255be1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:05.785 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1361841c-06ad-4d69-bc3f-025652255be1 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.785 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:05.786 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:05.787 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2741a8aa-8841-49c8-a4a5-960ee92e1173]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.809 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.816 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.845 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:05.871 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:06.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:06.410 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:06 np0005532585.localdomain podman[315818]: 
Nov 23 10:00:06 np0005532585.localdomain podman[315818]: 2025-11-23 10:00:06.680265168 +0000 UTC m=+0.077048684 container create a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:06 np0005532585.localdomain systemd[1]: Started libpod-conmon-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9.scope.
Nov 23 10:00:06 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:06.734 2 INFO neutron.agent.securitygroups_rpc [None req-43a1a170-ce3b-4775-9849-731eb3e4f92f 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:06 np0005532585.localdomain systemd[1]: tmp-crun.JgwRav.mount: Deactivated successfully.
Nov 23 10:00:06 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:06 np0005532585.localdomain podman[315818]: 2025-11-23 10:00:06.647198069 +0000 UTC m=+0.043981625 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:06 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8e400ebade1d5745f819f7e784ec959ce2672dd5f201c317d18abdd769f4f96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:06 np0005532585.localdomain podman[315818]: 2025-11-23 10:00:06.762203752 +0000 UTC m=+0.158987298 container init a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:00:06 np0005532585.localdomain podman[315818]: 2025-11-23 10:00:06.773526941 +0000 UTC m=+0.170310467 container start a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:00:06 np0005532585.localdomain dnsmasq[315837]: started, version 2.85 cachesize 150
Nov 23 10:00:06 np0005532585.localdomain dnsmasq[315837]: DNS service limited to local subnets
Nov 23 10:00:06 np0005532585.localdomain dnsmasq[315837]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:06 np0005532585.localdomain dnsmasq[315837]: warning: no upstream servers configured
Nov 23 10:00:06 np0005532585.localdomain dnsmasq[315837]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:06.864 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:05Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bdbd30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bdbeb0>], id=748d6a3f-f1c6-434f-b00a-7c0ef0e7aba9, ip_allocation=immediate, mac_address=fa:16:3e:a7:25:4d, name=tempest-NetworksTestDHCPv6-1735144909, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['ef90d8b7-7256-412e-b516-d1aad7604a50'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:04Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1203, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:06Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:06.913 263258 INFO neutron.agent.dhcp.agent [None req-e60af20a-8387-4929-a046-275fef5541b1 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:07 np0005532585.localdomain dnsmasq[315837]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:07 np0005532585.localdomain podman[315854]: 2025-11-23 10:00:07.053613329 +0000 UTC m=+0.062879758 container kill a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:00:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:07.216 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:07.280 263258 INFO neutron.agent.dhcp.agent [None req-614445ea-dfe2-427b-b0a8-794db535cf75 - - - - - -] DHCP configuration for ports {'748d6a3f-f1c6-434f-b00a-7c0ef0e7aba9'} is completed
Nov 23 10:00:07 np0005532585.localdomain ceph-mon[300199]: pgmap v170: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:08.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:08.411 2 INFO neutron.agent.securitygroups_rpc [None req-c2842295-e0a9-464d-87c0-40db2e234814 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:08 np0005532585.localdomain dnsmasq[315837]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:08 np0005532585.localdomain podman[315894]: 2025-11-23 10:00:08.651261045 +0000 UTC m=+0.051500478 container kill a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:00:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:08.911 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:08Z, description=, device_id=6c959180-536e-4cbb-a6e5-3082c340988b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd130>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b502b0>], id=4a56a8b3-653a-448c-b174-df8ff4b0cddd, ip_allocation=immediate, mac_address=fa:16:3e:2c:29:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:56Z, description=, dns_domain=, id=7292f404-64ea-4ef3-b81e-f698709e4eec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1432555221-network, port_security_enabled=True, project_id=d38528c3feb64b31add54cee7508cb83, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62314, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1166, status=ACTIVE, subnets=['6dac9eaf-88ad-4550-8301-df230ecdba75'], tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T09:59:58Z, vlan_transparent=None, network_id=7292f404-64ea-4ef3-b81e-f698709e4eec, port_security_enabled=False, project_id=d38528c3feb64b31add54cee7508cb83, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1211, status=DOWN, tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T10:00:08Z on network 7292f404-64ea-4ef3-b81e-f698709e4eec
Nov 23 10:00:09 np0005532585.localdomain dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 1 addresses
Nov 23 10:00:09 np0005532585.localdomain podman[315932]: 2025-11-23 10:00:09.139562587 +0000 UTC m=+0.056765159 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:00:09 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host
Nov 23 10:00:09 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts
Nov 23 10:00:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:00:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:00:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:09.300 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:00:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:09.378 263258 INFO neutron.agent.dhcp.agent [None req-976ea8c9-fe78-4fd1-a780-eaea2a2a7b47 - - - - - -] DHCP configuration for ports {'4a56a8b3-653a-448c-b174-df8ff4b0cddd'} is completed
Nov 23 10:00:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:09.383 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:09 np0005532585.localdomain ceph-mon[300199]: pgmap v171: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:09.841 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:08Z, description=, device_id=6c959180-536e-4cbb-a6e5-3082c340988b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afb100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afb040>], id=4a56a8b3-653a-448c-b174-df8ff4b0cddd, ip_allocation=immediate, mac_address=fa:16:3e:2c:29:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:56Z, description=, dns_domain=, id=7292f404-64ea-4ef3-b81e-f698709e4eec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1432555221-network, port_security_enabled=True, project_id=d38528c3feb64b31add54cee7508cb83, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62314, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1166, status=ACTIVE, subnets=['6dac9eaf-88ad-4550-8301-df230ecdba75'], tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T09:59:58Z, vlan_transparent=None, network_id=7292f404-64ea-4ef3-b81e-f698709e4eec, port_security_enabled=False, project_id=d38528c3feb64b31add54cee7508cb83, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1211, status=DOWN, tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T10:00:08Z on network 7292f404-64ea-4ef3-b81e-f698709e4eec
Nov 23 10:00:10 np0005532585.localdomain systemd[1]: tmp-crun.Iw6nSX.mount: Deactivated successfully.
Nov 23 10:00:10 np0005532585.localdomain dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 1 addresses
Nov 23 10:00:10 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host
Nov 23 10:00:10 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts
Nov 23 10:00:10 np0005532585.localdomain podman[315981]: 2025-11-23 10:00:10.128738979 +0000 UTC m=+0.066321205 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 10:00:10 np0005532585.localdomain dnsmasq[315837]: exiting on receipt of SIGTERM
Nov 23 10:00:10 np0005532585.localdomain podman[315998]: 2025-11-23 10:00:10.184473506 +0000 UTC m=+0.051890130 container kill a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:00:10 np0005532585.localdomain systemd[1]: libpod-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9.scope: Deactivated successfully.
Nov 23 10:00:10 np0005532585.localdomain podman[316015]: 2025-11-23 10:00:10.259243259 +0000 UTC m=+0.054759568 container died a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:10.350 263258 INFO neutron.agent.dhcp.agent [None req-1470576d-d2ca-4f1e-8345-83ef73dfaf8e - - - - - -] DHCP configuration for ports {'4a56a8b3-653a-448c-b174-df8ff4b0cddd'} is completed
Nov 23 10:00:10 np0005532585.localdomain podman[316015]: 2025-11-23 10:00:10.366831153 +0000 UTC m=+0.162347452 container remove a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:00:10 np0005532585.localdomain systemd[1]: libpod-conmon-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9.scope: Deactivated successfully.
Nov 23 10:00:10 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:10Z|00192|binding|INFO|Releasing lport 1361841c-06ad-4d69-bc3f-025652255be1 from this chassis (sb_readonly=0)
Nov 23 10:00:10 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:10Z|00193|binding|INFO|Setting lport 1361841c-06ad-4d69-bc3f-025652255be1 down in Southbound
Nov 23 10:00:10 np0005532585.localdomain kernel: device tap1361841c-06 left promiscuous mode
Nov 23 10:00:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:10.382 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:10.397 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1361841c-06ad-4d69-bc3f-025652255be1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:10.398 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1361841c-06ad-4d69-bc3f-025652255be1 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:00:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:10.399 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:10.400 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc12948-2404-42e8-b030-bfb1f3359faa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:10.401 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:10.728 263258 INFO neutron.agent.dhcp.agent [None req-01df078b-a456-43c1-8df4-ab8e3d09ebe5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:11 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a8e400ebade1d5745f819f7e784ec959ce2672dd5f201c317d18abdd769f4f96-merged.mount: Deactivated successfully.
Nov 23 10:00:11 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:00:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:11.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:11.412 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:11.439 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:11 np0005532585.localdomain ceph-mon[300199]: pgmap v172: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:00:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:00:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:00:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 10:00:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:11.912 263258 INFO neutron.agent.linux.ip_lib [None req-9abd4672-52f3-4ee0-bc5e-3d41ce6ff1f9 - - - - - -] Device tape1ef4443-64 cannot be used as it has no MAC address
Nov 23 10:00:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:11.940 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:11 np0005532585.localdomain kernel: device tape1ef4443-64 entered promiscuous mode
Nov 23 10:00:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:11Z|00194|binding|INFO|Claiming lport e1ef4443-64c5-4e20-a52c-3034596e7966 for this chassis.
Nov 23 10:00:11 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892011.9488] manager: (tape1ef4443-64): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Nov 23 10:00:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:11Z|00195|binding|INFO|e1ef4443-64c5-4e20-a52c-3034596e7966: Claiming unknown
Nov 23 10:00:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:00:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19238 "" "Go-http-client/1.1"
Nov 23 10:00:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:11.950 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:11 np0005532585.localdomain systemd-udevd[316058]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:11Z|00196|binding|INFO|Setting lport e1ef4443-64c5-4e20-a52c-3034596e7966 ovn-installed in OVS
Nov 23 10:00:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:11.960 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=e1ef4443-64c5-4e20-a52c-3034596e7966) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:11Z|00197|binding|INFO|Setting lport e1ef4443-64c5-4e20-a52c-3034596e7966 up in Southbound
Nov 23 10:00:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:11.962 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:11.964 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e1ef4443-64c5-4e20-a52c-3034596e7966 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:11.965 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:11.966 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[efe1fad7-83b9-4162-9a15-a9b4c93ca24f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:11.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:12 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:12 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:12 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:12 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:12 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape1ef4443-64: No such device
Nov 23 10:00:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:12.036 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:12.053 2 INFO neutron.agent.securitygroups_rpc [None req-311a0dd8-e6e7-491c-ad02-2876db83aabe 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:12.064 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:12 np0005532585.localdomain podman[316129]: 
Nov 23 10:00:12 np0005532585.localdomain podman[316129]: 2025-11-23 10:00:12.876435323 +0000 UTC m=+0.081943745 container create 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:00:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522.scope.
Nov 23 10:00:12 np0005532585.localdomain podman[316129]: 2025-11-23 10:00:12.83835188 +0000 UTC m=+0.043860302 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:12 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9db4e9f89b2d873d6f9e5ee936e95cdb8a3e819e23918b397cc465f572f0c23e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:12 np0005532585.localdomain podman[316129]: 2025-11-23 10:00:12.957206602 +0000 UTC m=+0.162715024 container init 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:12 np0005532585.localdomain podman[316129]: 2025-11-23 10:00:12.966307552 +0000 UTC m=+0.171815974 container start 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:12 np0005532585.localdomain dnsmasq[316147]: started, version 2.85 cachesize 150
Nov 23 10:00:12 np0005532585.localdomain dnsmasq[316147]: DNS service limited to local subnets
Nov 23 10:00:12 np0005532585.localdomain dnsmasq[316147]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:12 np0005532585.localdomain dnsmasq[316147]: warning: no upstream servers configured
Nov 23 10:00:12 np0005532585.localdomain dnsmasq-dhcp[316147]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:00:12 np0005532585.localdomain dnsmasq[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:12 np0005532585.localdomain dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:12 np0005532585.localdomain dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:13.025 263258 INFO neutron.agent.dhcp.agent [None req-9abd4672-52f3-4ee0-bc5e-3d41ce6ff1f9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:11Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3ca00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afb670>], id=8ff2de60-ed9d-46a1-a09b-0f3c4a9e36af, ip_allocation=immediate, mac_address=fa:16:3e:d5:50:83, name=tempest-NetworksTestDHCPv6-1256092217, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['02326955-b246-4007-9c83-62554b760510'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:10Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1232, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:11Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:13 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:13.109 2 INFO neutron.agent.securitygroups_rpc [None req-e5debdc8-a6b2-4239-b0b7-2d251ec66c55 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:13.124 263258 INFO neutron.agent.dhcp.agent [None req-79dc8a8d-cdb6-4c30-8c66-e0cb306034b7 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:13 np0005532585.localdomain dnsmasq[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:13 np0005532585.localdomain dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:13 np0005532585.localdomain podman[316166]: 2025-11-23 10:00:13.203274402 +0000 UTC m=+0.064112146 container kill 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:13 np0005532585.localdomain dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:13.441 263258 INFO neutron.agent.dhcp.agent [None req-b4e79595-aeb3-4100-becc-38a590576024 - - - - - -] DHCP configuration for ports {'8ff2de60-ed9d-46a1-a09b-0f3c4a9e36af'} is completed
Nov 23 10:00:13 np0005532585.localdomain dnsmasq[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:13 np0005532585.localdomain dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:13 np0005532585.localdomain dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:13 np0005532585.localdomain podman[316204]: 2025-11-23 10:00:13.509768143 +0000 UTC m=+0.048671690 container kill 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:13 np0005532585.localdomain ceph-mon[300199]: pgmap v173: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:00:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:00:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:00:13 np0005532585.localdomain podman[316224]: 2025-11-23 10:00:13.78526571 +0000 UTC m=+0.087767465 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 10:00:13 np0005532585.localdomain podman[316226]: 2025-11-23 10:00:13.850381026 +0000 UTC m=+0.148940580 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:00:13 np0005532585.localdomain podman[316226]: 2025-11-23 10:00:13.862235151 +0000 UTC m=+0.160794695 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 23 10:00:13 np0005532585.localdomain podman[316224]: 2025-11-23 10:00:13.872724595 +0000 UTC m=+0.175226390 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 10:00:13 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:00:13 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:00:13 np0005532585.localdomain podman[316225]: 2025-11-23 10:00:13.762247501 +0000 UTC m=+0.066462179 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:00:13 np0005532585.localdomain podman[316225]: 2025-11-23 10:00:13.945168966 +0000 UTC m=+0.249383674 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:13 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:00:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:14.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:14.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:00:14 np0005532585.localdomain dnsmasq[316147]: exiting on receipt of SIGTERM
Nov 23 10:00:14 np0005532585.localdomain podman[316302]: 2025-11-23 10:00:14.233737435 +0000 UTC m=+0.061556887 container kill 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:00:14 np0005532585.localdomain systemd[1]: libpod-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522.scope: Deactivated successfully.
Nov 23 10:00:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:14.297 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 23 10:00:14 np0005532585.localdomain podman[316316]: 2025-11-23 10:00:14.304752153 +0000 UTC m=+0.056377847 container died 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:00:14 np0005532585.localdomain systemd[1]: tmp-crun.9af0m0.mount: Deactivated successfully.
Nov 23 10:00:14 np0005532585.localdomain podman[316316]: 2025-11-23 10:00:14.339947707 +0000 UTC m=+0.091573371 container cleanup 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:00:14 np0005532585.localdomain systemd[1]: libpod-conmon-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522.scope: Deactivated successfully.
Nov 23 10:00:14 np0005532585.localdomain podman[316318]: 2025-11-23 10:00:14.391556487 +0000 UTC m=+0.135227246 container remove 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:14.436 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:14Z|00198|binding|INFO|Releasing lport e1ef4443-64c5-4e20-a52c-3034596e7966 from this chassis (sb_readonly=0)
Nov 23 10:00:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:14Z|00199|binding|INFO|Setting lport e1ef4443-64c5-4e20-a52c-3034596e7966 down in Southbound
Nov 23 10:00:14 np0005532585.localdomain kernel: device tape1ef4443-64 left promiscuous mode
Nov 23 10:00:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:14.448 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=e1ef4443-64c5-4e20-a52c-3034596e7966) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:14.450 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e1ef4443-64c5-4e20-a52c-3034596e7966 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:00:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:14.452 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:14.453 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[74025d70-b89d-4382-8ebd-7c68163d87b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:14.461 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:14.756 263258 INFO neutron.agent.dhcp.agent [None req-3ba73744-a7bb-4a1f-9f69-b6a0b3c24a92 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:14 np0005532585.localdomain dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 0 addresses
Nov 23 10:00:14 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host
Nov 23 10:00:14 np0005532585.localdomain podman[316364]: 2025-11-23 10:00:14.875769274 +0000 UTC m=+0.067338166 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:14 np0005532585.localdomain dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts
Nov 23 10:00:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9db4e9f89b2d873d6f9e5ee936e95cdb8a3e819e23918b397cc465f572f0c23e-merged.mount: Deactivated successfully.
Nov 23 10:00:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:14 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:00:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:15.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:15.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:00:15 np0005532585.localdomain ceph-mon[300199]: pgmap v174: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:15.552 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:15Z|00200|binding|INFO|Releasing lport e328d85c-9f4d-4a9c-9609-f789abfbba67 from this chassis (sb_readonly=0)
Nov 23 10:00:15 np0005532585.localdomain kernel: device tape328d85c-9f left promiscuous mode
Nov 23 10:00:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:15Z|00201|binding|INFO|Setting lport e328d85c-9f4d-4a9c-9609-f789abfbba67 down in Southbound
Nov 23 10:00:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:15.566 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd38528c3feb64b31add54cee7508cb83', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02c6b597-012a-4c61-8f3c-0e9a58e1964d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=e328d85c-9f4d-4a9c-9609-f789abfbba67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:15.568 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e328d85c-9f4d-4a9c-9609-f789abfbba67 in datapath 7292f404-64ea-4ef3-b81e-f698709e4eec unbound from our chassis
Nov 23 10:00:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:15.571 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7292f404-64ea-4ef3-b81e-f698709e4eec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:00:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:15.572 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[15329bf2-b781-4e7c-a538-7507f500eabb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:15.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.238 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.239 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.391 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4139638832' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1141783378' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:16.502 263258 INFO neutron.agent.linux.ip_lib [None req-42d2d7f7-1adb-4eaa-9dcb-87d671731a21 - - - - - -] Device tap95b29c7c-cf cannot be used as it has no MAC address
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.528 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain kernel: device tap95b29c7c-cf entered promiscuous mode
Nov 23 10:00:16 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892016.5362] manager: (tap95b29c7c-cf): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.537 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:16Z|00202|binding|INFO|Claiming lport 95b29c7c-cf48-409a-9165-0c602a1e631c for this chassis.
Nov 23 10:00:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:16Z|00203|binding|INFO|95b29c7c-cf48-409a-9165-0c602a1e631c: Claiming unknown
Nov 23 10:00:16 np0005532585.localdomain systemd-udevd[316417]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:16.546 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=95b29c7c-cf48-409a-9165-0c602a1e631c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:16Z|00204|binding|INFO|Setting lport 95b29c7c-cf48-409a-9165-0c602a1e631c ovn-installed in OVS
Nov 23 10:00:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:16Z|00205|binding|INFO|Setting lport 95b29c7c-cf48-409a-9165-0c602a1e631c up in Southbound
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:16.550 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 95b29c7c-cf48-409a-9165-0c602a1e631c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:16.551 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:16.552 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3fcb38-1b8c-44bd-939c-e0e0163eeb29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.606 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.615 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:16 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:00:16 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1277874800' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.694 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.777 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:00:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:16.778 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:00:16 np0005532585.localdomain sshd[316458]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.027 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.029 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11204MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.029 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.030 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.095 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.134 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:00:17 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:17.160 2 INFO neutron.agent.securitygroups_rpc [None req-eec5559f-af9d-40c0-b1ad-486b576202fe 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:17 np0005532585.localdomain podman[316513]: 
Nov 23 10:00:17 np0005532585.localdomain podman[316513]: 2025-11-23 10:00:17.421493995 +0000 UTC m=+0.092437288 container create ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:00:17 np0005532585.localdomain systemd[1]: Started libpod-conmon-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d.scope.
Nov 23 10:00:17 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:17 np0005532585.localdomain podman[316513]: 2025-11-23 10:00:17.375349924 +0000 UTC m=+0.046293287 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:17 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34fe63b57edfb9e1b264cb2cb286bcf45ccef5cee20574124af5a76e14a96835/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:17 np0005532585.localdomain podman[316513]: 2025-11-23 10:00:17.487486868 +0000 UTC m=+0.158430171 container init ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2257090387' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1277874800' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4073821418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:17 np0005532585.localdomain ceph-mon[300199]: pgmap v175: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:17 np0005532585.localdomain podman[316513]: 2025-11-23 10:00:17.49629143 +0000 UTC m=+0.167234733 container start ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:00:17 np0005532585.localdomain dnsmasq[316531]: started, version 2.85 cachesize 150
Nov 23 10:00:17 np0005532585.localdomain dnsmasq[316531]: DNS service limited to local subnets
Nov 23 10:00:17 np0005532585.localdomain dnsmasq[316531]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:17 np0005532585.localdomain dnsmasq[316531]: warning: no upstream servers configured
Nov 23 10:00:17 np0005532585.localdomain dnsmasq-dhcp[316531]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:00:17 np0005532585.localdomain dnsmasq[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:17 np0005532585.localdomain dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:17 np0005532585.localdomain dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:17.559 263258 INFO neutron.agent.dhcp.agent [None req-42d2d7f7-1adb-4eaa-9dcb-87d671731a21 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:16Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b58970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b58c40>], id=dc44aaa3-710b-4f8f-a729-299c89225fa7, ip_allocation=immediate, mac_address=fa:16:3e:9c:57:f5, name=tempest-NetworksTestDHCPv6-1200518540, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['479278d7-0738-4be2-9ede-7f789b35986a'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:14Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1249, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:16Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:00:17 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/458760552' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.582 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.586 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.603 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.625 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:00:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:17.625 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:00:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:17.626 263258 INFO neutron.agent.dhcp.agent [None req-d6c0039d-cf8b-4514-ab61-ee8f413fb5e8 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:17 np0005532585.localdomain sshd[316458]: Received disconnect from 207.154.194.2 port 56162:11: Bye Bye [preauth]
Nov 23 10:00:17 np0005532585.localdomain sshd[316458]: Disconnected from authenticating user root 207.154.194.2 port 56162 [preauth]
Nov 23 10:00:17 np0005532585.localdomain podman[316551]: 2025-11-23 10:00:17.72128411 +0000 UTC m=+0.039846148 container kill ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:00:17 np0005532585.localdomain dnsmasq[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:17 np0005532585.localdomain dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:17 np0005532585.localdomain dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:18 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:18.212 263258 INFO neutron.agent.dhcp.agent [None req-8a1c29c8-3381-4308-a43f-4e71fce393e7 - - - - - -] DHCP configuration for ports {'dc44aaa3-710b-4f8f-a729-299c89225fa7'} is completed
Nov 23 10:00:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:18 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/458760552' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:00:19 np0005532585.localdomain ceph-mon[300199]: pgmap v176: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:20 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:20.060 2 INFO neutron.agent.securitygroups_rpc [None req-0b580d26-bcb7-428b-9b48-acb40f408d24 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:20 np0005532585.localdomain dnsmasq[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:20 np0005532585.localdomain podman[316588]: 2025-11-23 10:00:20.25103484 +0000 UTC m=+0.046637367 container kill ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:00:20 np0005532585.localdomain dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:20 np0005532585.localdomain dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:20.624 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:20.625 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:00:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:21.393 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:21.415 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:21Z|00206|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:21 np0005532585.localdomain podman[316625]: 2025-11-23 10:00:21.483585768 +0000 UTC m=+0.063201497 container kill ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:21 np0005532585.localdomain dnsmasq[316531]: exiting on receipt of SIGTERM
Nov 23 10:00:21 np0005532585.localdomain systemd[1]: libpod-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d.scope: Deactivated successfully.
Nov 23 10:00:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:21.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:21 np0005532585.localdomain ceph-mon[300199]: pgmap v177: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:21 np0005532585.localdomain podman[316640]: 2025-11-23 10:00:21.547559289 +0000 UTC m=+0.044941015 container died ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:00:21 np0005532585.localdomain systemd[1]: tmp-crun.A74vZz.mount: Deactivated successfully.
Nov 23 10:00:21 np0005532585.localdomain podman[316640]: 2025-11-23 10:00:21.602360038 +0000 UTC m=+0.099741704 container remove ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:00:21 np0005532585.localdomain systemd[1]: libpod-conmon-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d.scope: Deactivated successfully.
Nov 23 10:00:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:21.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:21 np0005532585.localdomain kernel: device tap95b29c7c-cf left promiscuous mode
Nov 23 10:00:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:21Z|00207|binding|INFO|Releasing lport 95b29c7c-cf48-409a-9165-0c602a1e631c from this chassis (sb_readonly=0)
Nov 23 10:00:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:21Z|00208|binding|INFO|Setting lport 95b29c7c-cf48-409a-9165-0c602a1e631c down in Southbound
Nov 23 10:00:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:21.639 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=95b29c7c-cf48-409a-9165-0c602a1e631c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:21.640 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:21.642 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 95b29c7c-cf48-409a-9165-0c602a1e631c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:00:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:21.643 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:21.644 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[94039926-b881-455c-a027-10e0be496add]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:21.946 263258 INFO neutron.agent.dhcp.agent [None req-4e4a752d-f6ad-424f-8706-c4fa15a972ea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:22 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:22Z|00209|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:22.120 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:22 np0005532585.localdomain dnsmasq[315494]: exiting on receipt of SIGTERM
Nov 23 10:00:22 np0005532585.localdomain podman[316684]: 2025-11-23 10:00:22.317318122 +0000 UTC m=+0.058906825 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: libpod-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537.scope: Deactivated successfully.
Nov 23 10:00:22 np0005532585.localdomain podman[316698]: 2025-11-23 10:00:22.38994865 +0000 UTC m=+0.059272777 container died 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:00:22 np0005532585.localdomain podman[316698]: 2025-11-23 10:00:22.421649706 +0000 UTC m=+0.090973783 container cleanup 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: libpod-conmon-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537.scope: Deactivated successfully.
Nov 23 10:00:22 np0005532585.localdomain podman[316705]: 2025-11-23 10:00:22.477048193 +0000 UTC m=+0.133242646 container remove 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-34fe63b57edfb9e1b264cb2cb286bcf45ccef5cee20574124af5a76e14a96835-merged.mount: Deactivated successfully.
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-dc1d3c6a45bfeaa6a92f58a5f600b24b7eb89e79f1a1294714d9a566d06522ea-merged.mount: Deactivated successfully.
Nov 23 10:00:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:23 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d7292f404\x2d64ea\x2d4ef3\x2db81e\x2df698709e4eec.mount: Deactivated successfully.
Nov 23 10:00:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.014 263258 INFO neutron.agent.dhcp.agent [None req-f3721c64-3324-4463-b4c0-bae26c902645 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.015 263258 INFO neutron.agent.dhcp.agent [None req-f3721c64-3324-4463-b4c0-bae26c902645 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.055 263258 INFO neutron.agent.linux.ip_lib [None req-2b84a56f-eea8-4921-9446-c7ce5ebcdec5 - - - - - -] Device tap5f699be7-b1 cannot be used as it has no MAC address
Nov 23 10:00:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:23.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:23 np0005532585.localdomain kernel: device tap5f699be7-b1 entered promiscuous mode
Nov 23 10:00:23 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892023.0851] manager: (tap5f699be7-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Nov 23 10:00:23 np0005532585.localdomain systemd-udevd[316736]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:23Z|00210|binding|INFO|Claiming lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f for this chassis.
Nov 23 10:00:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:23Z|00211|binding|INFO|5f699be7-b1a4-4b0a-9b05-aa2b5159827f: Claiming unknown
Nov 23 10:00:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:23.089 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:23.116 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=5f699be7-b1a4-4b0a-9b05-aa2b5159827f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:23.117 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 5f699be7-b1a4-4b0a-9b05-aa2b5159827f in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:23.118 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:23.119 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3c0f35-ca88-453a-8bc3-cc2bdfccb15c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:23Z|00212|binding|INFO|Setting lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f ovn-installed in OVS
Nov 23 10:00:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:23Z|00213|binding|INFO|Setting lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f up in Southbound
Nov 23 10:00:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:23.135 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap5f699be7-b1: No such device
Nov 23 10:00:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:23.171 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:23.194 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.472 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:23 np0005532585.localdomain ceph-mon[300199]: pgmap v178: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:00:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:00:24 np0005532585.localdomain podman[316808]: 2025-11-23 10:00:24.044961793 +0000 UTC m=+0.095884385 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 23 10:00:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:24.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:24 np0005532585.localdomain podman[316808]: 2025-11-23 10:00:24.083317785 +0000 UTC m=+0.134240337 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:00:24 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:00:24 np0005532585.localdomain podman[316807]: 
Nov 23 10:00:24 np0005532585.localdomain podman[316807]: 2025-11-23 10:00:24.114768854 +0000 UTC m=+0.173928069 container create 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:24 np0005532585.localdomain podman[316814]: 2025-11-23 10:00:24.138521225 +0000 UTC m=+0.186705423 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:00:24 np0005532585.localdomain podman[316814]: 2025-11-23 10:00:24.145210292 +0000 UTC m=+0.193394490 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:00:24 np0005532585.localdomain podman[316807]: 2025-11-23 10:00:24.051737461 +0000 UTC m=+0.110896676 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:24 np0005532585.localdomain systemd[1]: Started libpod-conmon-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339.scope.
Nov 23 10:00:24 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:00:24 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:24 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/038226a67fa7d74c36919df9cd9edb6225722632a91f4877537f142348f28939/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:24 np0005532585.localdomain podman[316807]: 2025-11-23 10:00:24.19259137 +0000 UTC m=+0.251750595 container init 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:24 np0005532585.localdomain podman[316807]: 2025-11-23 10:00:24.204033383 +0000 UTC m=+0.263192648 container start 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:24 np0005532585.localdomain dnsmasq[316868]: started, version 2.85 cachesize 150
Nov 23 10:00:24 np0005532585.localdomain dnsmasq[316868]: DNS service limited to local subnets
Nov 23 10:00:24 np0005532585.localdomain dnsmasq[316868]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:24 np0005532585.localdomain dnsmasq[316868]: warning: no upstream servers configured
Nov 23 10:00:24 np0005532585.localdomain dnsmasq-dhcp[316868]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:00:24 np0005532585.localdomain dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:24 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:24 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:24 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:24.402 263258 INFO neutron.agent.dhcp.agent [None req-e4604f50-da11-412e-9d24-d5397ebfd058 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:24 np0005532585.localdomain dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:24 np0005532585.localdomain podman[316887]: 2025-11-23 10:00:24.53667651 +0000 UTC m=+0.059242946 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:00:24 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:24 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:24 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:24.837 263258 INFO neutron.agent.dhcp.agent [None req-f455726f-5d44-445b-81f1-fb5d4bd8cce1 - - - - - -] DHCP configuration for ports {'5f699be7-b1a4-4b0a-9b05-aa2b5159827f', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:25 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:25.376 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:25Z, description=, device_id=146cb74b-0e80-472c-b385-97b8554397ad, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f760>], id=b1a93d84-9bcf-438f-92ff-bbd336b9d0db, ip_allocation=immediate, mac_address=fa:16:3e:b2:74:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['53ae3a3e-05c5-40f2-93cb-3cdbe8c6e451'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:23Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1289, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:25Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:25 np0005532585.localdomain ceph-mon[300199]: pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:25 np0005532585.localdomain dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:25 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:25 np0005532585.localdomain systemd[1]: tmp-crun.62q2SZ.mount: Deactivated successfully.
Nov 23 10:00:25 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:25 np0005532585.localdomain podman[316927]: 2025-11-23 10:00:25.556923681 +0000 UTC m=+0.048905618 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:00:25 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:25.805 263258 INFO neutron.agent.dhcp.agent [None req-14b67171-52bf-45e3-8b04-4010502dfe11 - - - - - -] DHCP configuration for ports {'b1a93d84-9bcf-438f-92ff-bbd336b9d0db'} is completed
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.397 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.416 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:26.518 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:25Z, description=, device_id=146cb74b-0e80-472c-b385-97b8554397ad, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6310>], id=b1a93d84-9bcf-438f-92ff-bbd336b9d0db, ip_allocation=immediate, mac_address=fa:16:3e:b2:74:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['53ae3a3e-05c5-40f2-93cb-3cdbe8c6e451'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:23Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1289, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:25Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.631 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:26.655 263258 INFO neutron.agent.linux.ip_lib [None req-76d4543b-402d-4d69-89ce-e1dc8ae635c4 - - - - - -] Device tap10e1f965-56 cannot be used as it has no MAC address
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain kernel: device tap10e1f965-56 entered promiscuous mode
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.685 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892026.6867] manager: (tap10e1f965-56): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Nov 23 10:00:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:26Z|00214|binding|INFO|Claiming lport 10e1f965-5681-42dc-916c-83e697ea474c for this chassis.
Nov 23 10:00:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:26Z|00215|binding|INFO|10e1f965-5681-42dc-916c-83e697ea474c: Claiming unknown
Nov 23 10:00:26 np0005532585.localdomain systemd-udevd[316986]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:26.706 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10754024-8e92-4669-8b3d-be0210470d0a, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=10e1f965-5681-42dc-916c-83e697ea474c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:26.709 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 10e1f965-5681-42dc-916c-83e697ea474c in datapath 30192eb7-6210-4b4d-956f-dbc64d7c0b7c bound to our chassis
Nov 23 10:00:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:26.710 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:26.711 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a802c4-b012-46ac-a240-73181f698402]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:26 np0005532585.localdomain systemd[1]: tmp-crun.12YjOm.mount: Deactivated successfully.
Nov 23 10:00:26 np0005532585.localdomain podman[316971]: 2025-11-23 10:00:26.728634816 +0000 UTC m=+0.082001988 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:26 np0005532585.localdomain dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:26 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:26 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:26Z|00216|binding|INFO|Setting lport 10e1f965-5681-42dc-916c-83e697ea474c ovn-installed in OVS
Nov 23 10:00:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:26Z|00217|binding|INFO|Setting lport 10e1f965-5681-42dc-916c-83e697ea474c up in Southbound
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.740 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.774 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.803 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:26.926 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:26.947 263258 INFO neutron.agent.dhcp.agent [None req-1d6e31aa-1ed7-411e-8e51-3408088f4bd1 - - - - - -] DHCP configuration for ports {'b1a93d84-9bcf-438f-92ff-bbd336b9d0db'} is completed
Nov 23 10:00:27 np0005532585.localdomain ceph-mon[300199]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:27 np0005532585.localdomain podman[317050]: 2025-11-23 10:00:27.677836937 +0000 UTC m=+0.094113600 container create b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:00:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f.scope.
Nov 23 10:00:27 np0005532585.localdomain podman[317050]: 2025-11-23 10:00:27.631768338 +0000 UTC m=+0.048045071 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:27 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0701b612469235a6b7433f4e52ad0b8dcaf36306964dddada708c650d1295ed6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:27 np0005532585.localdomain podman[317050]: 2025-11-23 10:00:27.751788325 +0000 UTC m=+0.168064998 container init b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:00:27 np0005532585.localdomain podman[317050]: 2025-11-23 10:00:27.765975072 +0000 UTC m=+0.182251745 container start b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:00:27 np0005532585.localdomain dnsmasq[317068]: started, version 2.85 cachesize 150
Nov 23 10:00:27 np0005532585.localdomain dnsmasq[317068]: DNS service limited to local subnets
Nov 23 10:00:27 np0005532585.localdomain dnsmasq[317068]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:27 np0005532585.localdomain dnsmasq[317068]: warning: no upstream servers configured
Nov 23 10:00:27 np0005532585.localdomain dnsmasq-dhcp[317068]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:00:27 np0005532585.localdomain dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 0 addresses
Nov 23 10:00:27 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host
Nov 23 10:00:27 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts
Nov 23 10:00:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:27.895 263258 INFO neutron.agent.dhcp.agent [None req-5460cacc-060a-4e29-b443-a26ca8b60dac - - - - - -] DHCP configuration for ports {'89a49f8b-be13-4044-a6f3-e04e5a35b524'} is completed
Nov 23 10:00:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:28 np0005532585.localdomain systemd[1]: tmp-crun.kFOWr3.mount: Deactivated successfully.
Nov 23 10:00:29 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:29.000 2 INFO neutron.agent.securitygroups_rpc [None req-7e066fee-86c7-4254-86ea-1e6408304d23 e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']
Nov 23 10:00:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:29.176 263258 INFO neutron.agent.linux.ip_lib [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Device tap40554683-ca cannot be used as it has no MAC address
Nov 23 10:00:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:29Z|00218|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:29.243 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:29 np0005532585.localdomain kernel: device tap40554683-ca entered promiscuous mode
Nov 23 10:00:29 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892029.2525] manager: (tap40554683-ca): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Nov 23 10:00:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:29Z|00219|binding|INFO|Claiming lport 40554683-cae1-4516-8207-1a403f64812d for this chassis.
Nov 23 10:00:29 np0005532585.localdomain systemd-udevd[316988]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:29Z|00220|binding|INFO|40554683-cae1-4516-8207-1a403f64812d: Claiming unknown
Nov 23 10:00:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:29.252 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:29.273 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb02ad4f92a44de895fb1e96459d4a8d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=646e4187-302b-4a7e-8182-fdc259bed00e, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=40554683-cae1-4516-8207-1a403f64812d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:29.276 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 40554683-cae1-4516-8207-1a403f64812d in datapath de8825f6-e682-4aa7-96e0-b4e3b7280d13 bound to our chassis
Nov 23 10:00:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:29.277 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network de8825f6-e682-4aa7-96e0-b4e3b7280d13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:29.278 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1ca5f5-c3cd-4be3-bd67-a47a76813bef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:29.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:29Z|00221|binding|INFO|Setting lport 40554683-cae1-4516-8207-1a403f64812d ovn-installed in OVS
Nov 23 10:00:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:29Z|00222|binding|INFO|Setting lport 40554683-cae1-4516-8207-1a403f64812d up in Southbound
Nov 23 10:00:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:29.298 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:29.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:29.355 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:29 np0005532585.localdomain ceph-mon[300199]: pgmap v181: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:29.694 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b33df0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4da00>], id=1ff81b8b-de5b-48b0-b2f2-303ad670977c, ip_allocation=immediate, mac_address=fa:16:3e:81:2c:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:24Z, description=, dns_domain=, id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-585029312, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4326, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1285, status=ACTIVE, subnets=['6432ddf9-5778-4055-86ee-5aa10ffd470f'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:25Z, vlan_transparent=None, network_id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1330, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:29Z on network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c
Nov 23 10:00:29 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:29.800 2 INFO neutron.agent.securitygroups_rpc [None req-28d507eb-defd-436d-9561-198db6b19aa5 e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']
Nov 23 10:00:29 np0005532585.localdomain dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:29 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:29 np0005532585.localdomain dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:29 np0005532585.localdomain podman[317125]: 2025-11-23 10:00:29.807718418 +0000 UTC m=+0.064254271 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:00:29 np0005532585.localdomain systemd[1]: tmp-crun.INd9hk.mount: Deactivated successfully.
Nov 23 10:00:29 np0005532585.localdomain dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 1 addresses
Nov 23 10:00:29 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host
Nov 23 10:00:29 np0005532585.localdomain podman[317161]: 2025-11-23 10:00:29.958472402 +0000 UTC m=+0.071800663 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:29 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:00:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:00:30 np0005532585.localdomain podman[317205]: 
Nov 23 10:00:30 np0005532585.localdomain podman[317205]: 2025-11-23 10:00:30.16131862 +0000 UTC m=+0.085437482 container create d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.165 263258 INFO neutron.agent.dhcp.agent [None req-e9c3f30c-1390-4658-9425-0ac2a8c7f35b - - - - - -] DHCP configuration for ports {'1ff81b8b-de5b-48b0-b2f2-303ad670977c'} is completed
Nov 23 10:00:30 np0005532585.localdomain systemd[1]: Started libpod-conmon-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883.scope.
Nov 23 10:00:30 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:30 np0005532585.localdomain podman[317205]: 2025-11-23 10:00:30.116621184 +0000 UTC m=+0.040740076 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:30 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c90e5066c28a5f27a8a00727318e8ddaad558b249ed18b61f33f5561371dec20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:30 np0005532585.localdomain podman[317205]: 2025-11-23 10:00:30.22784263 +0000 UTC m=+0.151961472 container init d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:00:30 np0005532585.localdomain podman[317205]: 2025-11-23 10:00:30.237520618 +0000 UTC m=+0.161639450 container start d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: started, version 2.85 cachesize 150
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: DNS service limited to local subnets
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: warning: no upstream servers configured
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 0 addresses
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.300 263258 INFO neutron.agent.dhcp.agent [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b336d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b33a60>], id=540485dd-5323-420d-bb5e-137503a34ac6, ip_allocation=immediate, mac_address=fa:16:3e:67:34:8c, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1268709717, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:26Z, description=, dns_domain=, id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1098943766, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1297, status=ACTIVE, subnets=['ea3934cc-3b39-4b5f-922b-bfb1b349d5c3'], tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:27Z, vlan_transparent=None, network_id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['19c917c4-9016-402d-a3b3-6d8ae59f7b74'], standard_attr_id=1328, status=DOWN, tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:28Z on network de8825f6-e682-4aa7-96e0-b4e3b7280d13
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.405 263258 INFO neutron.agent.dhcp.agent [None req-8fd16439-bd5b-4cd7-952a-2d9208a9f1c0 - - - - - -] DHCP configuration for ports {'9b03d173-5707-4af9-badd-14643f8c8076'} is completed
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 1 addresses
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts
Nov 23 10:00:30 np0005532585.localdomain podman[317244]: 2025-11-23 10:00:30.480493703 +0000 UTC m=+0.078780208 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.599 263258 INFO neutron.agent.dhcp.agent [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac5760>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac5580>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac5640>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac5730>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac5610>], id=c210641c-12c4-499a-9e67-1f37df4d3cc6, ip_allocation=immediate, mac_address=fa:16:3e:30:12:10, name=tempest-ExtraDHCPOptionsIpV6TestJSON-2007863999, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:26Z, description=, dns_domain=, id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1098943766, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1297, status=ACTIVE, subnets=['ea3934cc-3b39-4b5f-922b-bfb1b349d5c3'], tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:27Z, vlan_transparent=None, network_id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['19c917c4-9016-402d-a3b3-6d8ae59f7b74'], standard_attr_id=1333, status=DOWN, tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:29Z on network de8825f6-e682-4aa7-96e0-b4e3b7280d13
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.616 263258 INFO neutron.agent.linux.dhcp [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.616 263258 INFO neutron.agent.linux.dhcp [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.617 263258 INFO neutron.agent.linux.dhcp [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.646 263258 INFO neutron.agent.dhcp.agent [None req-a2a271ba-36e4-45ab-9c06-5efc2e7ada79 - - - - - -] DHCP configuration for ports {'540485dd-5323-420d-bb5e-137503a34ac6'} is completed
Nov 23 10:00:30 np0005532585.localdomain dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 2 addresses
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host
Nov 23 10:00:30 np0005532585.localdomain podman[317280]: 2025-11-23 10:00:30.772647962 +0000 UTC m=+0.059319638 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:00:30 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts
Nov 23 10:00:30 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:30Z|00223|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:30 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:30.867 2 INFO neutron.agent.securitygroups_rpc [None req-d4d902e4-381c-4584-8a85-837df381e7eb e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']
Nov 23 10:00:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:30.879 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.966 263258 INFO neutron.agent.dhcp.agent [None req-861641be-c57a-48f6-9827-eaad0158157f - - - - - -] DHCP configuration for ports {'c210641c-12c4-499a-9e67-1f37df4d3cc6'} is completed
Nov 23 10:00:31 np0005532585.localdomain dnsmasq[316868]: exiting on receipt of SIGTERM
Nov 23 10:00:31 np0005532585.localdomain podman[317320]: 2025-11-23 10:00:31.012224973 +0000 UTC m=+0.047561196 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:31 np0005532585.localdomain systemd[1]: libpod-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339.scope: Deactivated successfully.
Nov 23 10:00:31 np0005532585.localdomain podman[317347]: 2025-11-23 10:00:31.074856912 +0000 UTC m=+0.046481262 container died 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:00:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-038226a67fa7d74c36919df9cd9edb6225722632a91f4877537f142348f28939-merged.mount: Deactivated successfully.
Nov 23 10:00:31 np0005532585.localdomain podman[317347]: 2025-11-23 10:00:31.115478264 +0000 UTC m=+0.087102614 container remove 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:00:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:31Z|00224|binding|INFO|Releasing lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f from this chassis (sb_readonly=0)
Nov 23 10:00:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:31Z|00225|binding|INFO|Setting lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f down in Southbound
Nov 23 10:00:31 np0005532585.localdomain kernel: device tap5f699be7-b1 left promiscuous mode
Nov 23 10:00:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:31.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:31 np0005532585.localdomain systemd[1]: libpod-conmon-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339.scope: Deactivated successfully.
Nov 23 10:00:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:31.147 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=5f699be7-b1a4-4b0a-9b05-aa2b5159827f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:31.149 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 5f699be7-b1a4-4b0a-9b05-aa2b5159827f in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:00:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:31.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:31.151 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:31.152 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6e0631-a481-46d3-a6fc-a01bf2847ce3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:31 np0005532585.localdomain dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 1 addresses
Nov 23 10:00:31 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host
Nov 23 10:00:31 np0005532585.localdomain podman[317369]: 2025-11-23 10:00:31.18416833 +0000 UTC m=+0.099420813 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:31 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts
Nov 23 10:00:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:31.400 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:31.416 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.426 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdde20>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cddd00>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd730>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd280>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8d36940>], id=540485dd-5323-420d-bb5e-137503a34ac6, ip_allocation=immediate, mac_address=fa:16:3e:67:34:8c, name=tempest-new-port-name-1833756636, network_id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['19c917c4-9016-402d-a3b3-6d8ae59f7b74'], standard_attr_id=1328, status=DOWN, tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:31Z on network de8825f6-e682-4aa7-96e0-b4e3b7280d13
Nov 23 10:00:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.443 263258 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Nov 23 10:00:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.443 263258 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Nov 23 10:00:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.444 263258 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Nov 23 10:00:31 np0005532585.localdomain ceph-mon[300199]: pgmap v182: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:31 np0005532585.localdomain dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 1 addresses
Nov 23 10:00:31 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host
Nov 23 10:00:31 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts
Nov 23 10:00:31 np0005532585.localdomain podman[317413]: 2025-11-23 10:00:31.610566645 +0000 UTC m=+0.056849362 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:00:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.722 263258 INFO neutron.agent.dhcp.agent [None req-5fc02e9a-e7d0-4fcc-8829-1fe7178b80d3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.801 263258 INFO neutron.agent.dhcp.agent [None req-8ccb48da-010f-42b8-a029-20f96fd88d86 - - - - - -] DHCP configuration for ports {'540485dd-5323-420d-bb5e-137503a34ac6'} is completed
Nov 23 10:00:31 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:00:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:00:33 np0005532585.localdomain podman[317433]: 2025-11-23 10:00:33.016930659 +0000 UTC m=+0.072251016 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 23 10:00:33 np0005532585.localdomain podman[317433]: 2025-11-23 10:00:33.031475668 +0000 UTC m=+0.086796105 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3)
Nov 23 10:00:33 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:00:33 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:33.119 2 INFO neutron.agent.securitygroups_rpc [None req-ddca7662-4ee1-4886-8e69-87187e050157 e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']
Nov 23 10:00:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:33.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:33 np0005532585.localdomain dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 0 addresses
Nov 23 10:00:33 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host
Nov 23 10:00:33 np0005532585.localdomain dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts
Nov 23 10:00:33 np0005532585.localdomain podman[317468]: 2025-11-23 10:00:33.342995295 +0000 UTC m=+0.048272718 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:00:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:33.516 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b0c460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b0c340>], id=1ff81b8b-de5b-48b0-b2f2-303ad670977c, ip_allocation=immediate, mac_address=fa:16:3e:81:2c:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:24Z, description=, dns_domain=, id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-585029312, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4326, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1285, status=ACTIVE, subnets=['6432ddf9-5778-4055-86ee-5aa10ffd470f'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:25Z, vlan_transparent=None, network_id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1330, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:29Z on network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c
Nov 23 10:00:33 np0005532585.localdomain ceph-mon[300199]: pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:33 np0005532585.localdomain podman[317506]: 2025-11-23 10:00:33.757135092 +0000 UTC m=+0.061401662 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:00:33 np0005532585.localdomain dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 1 addresses
Nov 23 10:00:33 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host
Nov 23 10:00:33 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts
Nov 23 10:00:34 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:34.005 263258 INFO neutron.agent.dhcp.agent [None req-ff2d3634-b698-4615-8b78-141045b1502b - - - - - -] DHCP configuration for ports {'1ff81b8b-de5b-48b0-b2f2-303ad670977c'} is completed
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:34.136 263258 INFO neutron.agent.linux.ip_lib [None req-2bfd3498-e812-4d4c-858f-0433cc608f09 - - - - - -] Device tap41e3cb72-6d cannot be used as it has no MAC address
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain kernel: device tap41e3cb72-6d entered promiscuous mode
Nov 23 10:00:34 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892034.1657] manager: (tap41e3cb72-6d): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Nov 23 10:00:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:34Z|00226|binding|INFO|Claiming lport 41e3cb72-6db6-4670-834e-e198cab9488d for this chassis.
Nov 23 10:00:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:34Z|00227|binding|INFO|41e3cb72-6db6-4670-834e-e198cab9488d: Claiming unknown
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain systemd-udevd[317538]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:34Z|00228|binding|INFO|Setting lport 41e3cb72-6db6-4670-834e-e198cab9488d up in Southbound
Nov 23 10:00:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:34Z|00229|binding|INFO|Setting lport 41e3cb72-6db6-4670-834e-e198cab9488d ovn-installed in OVS
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.174 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.174 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=41e3cb72-6db6-4670-834e-e198cab9488d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.176 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 41e3cb72-6db6-4670-834e-e198cab9488d in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.178 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.180 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[efbdee0e-35b4-4b54-8fc5-f29bf4061ca3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.182 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.218 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain podman[317568]: 2025-11-23 10:00:34.470246859 +0000 UTC m=+0.053272422 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:00:34 np0005532585.localdomain dnsmasq[317225]: exiting on receipt of SIGTERM
Nov 23 10:00:34 np0005532585.localdomain systemd[1]: libpod-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883.scope: Deactivated successfully.
Nov 23 10:00:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:00:34 np0005532585.localdomain podman[317584]: 2025-11-23 10:00:34.53811428 +0000 UTC m=+0.053162439 container died d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:00:34 np0005532585.localdomain podman[317593]: 2025-11-23 10:00:34.592881167 +0000 UTC m=+0.090542080 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:00:34 np0005532585.localdomain podman[317593]: 2025-11-23 10:00:34.604310429 +0000 UTC m=+0.101971372 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:00:34 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:00:34 np0005532585.localdomain podman[317584]: 2025-11-23 10:00:34.621494649 +0000 UTC m=+0.136542788 container cleanup d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:34 np0005532585.localdomain systemd[1]: libpod-conmon-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883.scope: Deactivated successfully.
Nov 23 10:00:34 np0005532585.localdomain podman[317586]: 2025-11-23 10:00:34.670140178 +0000 UTC m=+0.177937223 container remove d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:34Z|00230|binding|INFO|Releasing lport 40554683-cae1-4516-8207-1a403f64812d from this chassis (sb_readonly=0)
Nov 23 10:00:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:34Z|00231|binding|INFO|Setting lport 40554683-cae1-4516-8207-1a403f64812d down in Southbound
Nov 23 10:00:34 np0005532585.localdomain kernel: device tap40554683-ca left promiscuous mode
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.689 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb02ad4f92a44de895fb1e96459d4a8d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=646e4187-302b-4a7e-8182-fdc259bed00e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=40554683-cae1-4516-8207-1a403f64812d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.690 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 40554683-cae1-4516-8207-1a403f64812d in datapath de8825f6-e682-4aa7-96e0-b4e3b7280d13 unbound from our chassis
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.691 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network de8825f6-e682-4aa7-96e0-b4e3b7280d13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:34.692 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4abf6be1-73b5-4385-a0c2-d90eca52a02e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:34.702 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:34 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:34.915 263258 INFO neutron.agent.dhcp.agent [None req-22c76af9-6271-43a7-bbdf-9fee24ca917d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c90e5066c28a5f27a8a00727318e8ddaad558b249ed18b61f33f5561371dec20-merged.mount: Deactivated successfully.
Nov 23 10:00:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:35 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dde8825f6\x2de682\x2d4aa7\x2d96e0\x2db4e3b7280d13.mount: Deactivated successfully.
Nov 23 10:00:35 np0005532585.localdomain podman[317671]: 
Nov 23 10:00:35 np0005532585.localdomain podman[317671]: 2025-11-23 10:00:35.082102398 +0000 UTC m=+0.076241640 container create 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:00:35 np0005532585.localdomain systemd[1]: Started libpod-conmon-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11.scope.
Nov 23 10:00:35 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:35 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2c0ddcbf3ea6dc0d7c80eec26c782e2f344a162226e3650f4c2b5cc4fa9ba34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.148 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:35 np0005532585.localdomain podman[317671]: 2025-11-23 10:00:35.149823774 +0000 UTC m=+0.143963116 container init 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:35 np0005532585.localdomain podman[317671]: 2025-11-23 10:00:35.053801746 +0000 UTC m=+0.047940978 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:35 np0005532585.localdomain podman[317671]: 2025-11-23 10:00:35.160432491 +0000 UTC m=+0.154571733 container start 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:00:35 np0005532585.localdomain dnsmasq[317689]: started, version 2.85 cachesize 150
Nov 23 10:00:35 np0005532585.localdomain dnsmasq[317689]: DNS service limited to local subnets
Nov 23 10:00:35 np0005532585.localdomain dnsmasq[317689]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:35 np0005532585.localdomain dnsmasq[317689]: warning: no upstream servers configured
Nov 23 10:00:35 np0005532585.localdomain dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.369 263258 INFO neutron.agent.dhcp.agent [None req-ddb9fd59-5e2e-4a2a-a2d2-c9374d4d8974 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:35 np0005532585.localdomain ceph-mon[300199]: pgmap v184: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.550 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:34Z, description=, device_id=5835bf8d-282b-4320-8c00-799baca6538f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b68d90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b683a0>], id=034be82f-816e-4b81-b415-e36e7e90dc5c, ip_allocation=immediate, mac_address=fa:16:3e:e9:53:86, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['b2e5987d-a159-4cfc-adef-6fac29f80001'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:32Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1359, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:34Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:35 np0005532585.localdomain dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:35 np0005532585.localdomain podman[317709]: 2025-11-23 10:00:35.713995024 +0000 UTC m=+0.056301136 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:35 np0005532585.localdomain sudo[317719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:00:35 np0005532585.localdomain sudo[317719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:00:35 np0005532585.localdomain sudo[317719]: pam_unix(sudo:session): session closed for user root
Nov 23 10:00:35 np0005532585.localdomain sudo[317745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:00:35 np0005532585.localdomain sudo[317745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:00:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.928 263258 INFO neutron.agent.dhcp.agent [None req-ceca03b5-dec5-4220-9670-62a5f6da3a5b - - - - - -] DHCP configuration for ports {'034be82f-816e-4b81-b415-e36e7e90dc5c'} is completed
Nov 23 10:00:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:36.192 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:36.402 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:36.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:36 np0005532585.localdomain sudo[317745]: pam_unix(sudo:session): session closed for user root
Nov 23 10:00:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:00:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:00:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:00:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:00:36 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:36Z|00232|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:36 np0005532585.localdomain sudo[317800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:00:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:36.866 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:36 np0005532585.localdomain sudo[317800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:00:36 np0005532585.localdomain sudo[317800]: pam_unix(sudo:session): session closed for user root
Nov 23 10:00:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:36.890 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:34Z, description=, device_id=5835bf8d-282b-4320-8c00-799baca6538f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9409a00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c942dd30>], id=034be82f-816e-4b81-b415-e36e7e90dc5c, ip_allocation=immediate, mac_address=fa:16:3e:e9:53:86, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['b2e5987d-a159-4cfc-adef-6fac29f80001'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:32Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1359, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:34Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:37 np0005532585.localdomain dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:00:37 np0005532585.localdomain podman[317837]: 2025-11-23 10:00:37.099173475 +0000 UTC m=+0.067721438 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:00:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:37.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:37.376 263258 INFO neutron.agent.dhcp.agent [None req-03a3e59d-8c52-4a5a-8577-139971d06a05 - - - - - -] DHCP configuration for ports {'034be82f-816e-4b81-b415-e36e7e90dc5c'} is completed
Nov 23 10:00:37 np0005532585.localdomain sshd[317858]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:00:37 np0005532585.localdomain ceph-mon[300199]: pgmap v185: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:38.603 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:38 np0005532585.localdomain sshd[317858]: Invalid user sas from 182.61.18.212 port 48522
Nov 23 10:00:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:38.881 263258 INFO neutron.agent.linux.ip_lib [None req-d9cb8e3a-f670-49c1-82c8-8f0ecfbbdd8b - - - - - -] Device tapb74e35ad-94 cannot be used as it has no MAC address
Nov 23 10:00:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:38.904 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:38 np0005532585.localdomain kernel: device tapb74e35ad-94 entered promiscuous mode
Nov 23 10:00:38 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892038.9138] manager: (tapb74e35ad-94): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Nov 23 10:00:38 np0005532585.localdomain systemd-udevd[317870]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:38 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:38Z|00233|binding|INFO|Claiming lport b74e35ad-94a2-4d4d-af80-3b2024099e6d for this chassis.
Nov 23 10:00:38 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:38Z|00234|binding|INFO|b74e35ad-94a2-4d4d-af80-3b2024099e6d: Claiming unknown
Nov 23 10:00:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:38.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:38 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:38.936 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a45cfb38-a270-4929-a93b-8d89273d60d1, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b74e35ad-94a2-4d4d-af80-3b2024099e6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:38 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:38.938 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b74e35ad-94a2-4d4d-af80-3b2024099e6d in datapath bcc66174-371f-4faf-83f1-5de56d4886ad bound to our chassis
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:38.940 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bcc66174-371f-4faf-83f1-5de56d4886ad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:38 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:38.941 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea07802-45dd-4794-8f20-cb35bee0b15c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:38Z|00235|binding|INFO|Setting lport b74e35ad-94a2-4d4d-af80-3b2024099e6d ovn-installed in OVS
Nov 23 10:00:38 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:38Z|00236|binding|INFO|Setting lport b74e35ad-94a2-4d4d-af80-3b2024099e6d up in Southbound
Nov 23 10:00:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:38.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapb74e35ad-94: No such device
Nov 23 10:00:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:38.981 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:39.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:39 np0005532585.localdomain sshd[317858]: Received disconnect from 182.61.18.212 port 48522:11: Bye Bye [preauth]
Nov 23 10:00:39 np0005532585.localdomain sshd[317858]: Disconnected from invalid user sas 182.61.18.212 port 48522 [preauth]
Nov 23 10:00:39 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:00:39 np0005532585.localdomain ceph-mon[300199]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:40 np0005532585.localdomain podman[317941]: 
Nov 23 10:00:40 np0005532585.localdomain podman[317941]: 2025-11-23 10:00:40.507995745 +0000 UTC m=+0.102615611 container create c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:40 np0005532585.localdomain podman[317941]: 2025-11-23 10:00:40.457549071 +0000 UTC m=+0.052168957 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:40 np0005532585.localdomain systemd[1]: Started libpod-conmon-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736.scope.
Nov 23 10:00:40 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:40 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c21502b2347b802ca27076444fd406a39385c92b799f42a0977ebddef8e75100/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:40 np0005532585.localdomain podman[317941]: 2025-11-23 10:00:40.605640303 +0000 UTC m=+0.200260159 container init c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:00:40 np0005532585.localdomain podman[317941]: 2025-11-23 10:00:40.615442816 +0000 UTC m=+0.210062672 container start c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:00:40 np0005532585.localdomain dnsmasq[317960]: started, version 2.85 cachesize 150
Nov 23 10:00:40 np0005532585.localdomain dnsmasq[317960]: DNS service limited to local subnets
Nov 23 10:00:40 np0005532585.localdomain dnsmasq[317960]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:40 np0005532585.localdomain dnsmasq[317960]: warning: no upstream servers configured
Nov 23 10:00:40 np0005532585.localdomain dnsmasq-dhcp[317960]: DHCP, static leases only on 10.101.0.0, lease time 1d
Nov 23 10:00:40 np0005532585.localdomain dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 0 addresses
Nov 23 10:00:40 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host
Nov 23 10:00:40 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts
Nov 23 10:00:40 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:40.740 263258 INFO neutron.agent.dhcp.agent [None req-d83f04ea-9b4d-4a4d-851b-9e329226e750 - - - - - -] DHCP configuration for ports {'e5dc106a-af41-420f-ba83-3fdcb4306cd4'} is completed
Nov 23 10:00:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:40.990 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:41.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:41.420 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:41 np0005532585.localdomain ceph-mon[300199]: pgmap v187: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:41 np0005532585.localdomain dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:41 np0005532585.localdomain podman[317980]: 2025-11-23 10:00:41.860451949 +0000 UTC m=+0.069597955 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:41 np0005532585.localdomain systemd[1]: tmp-crun.lC5SNP.mount: Deactivated successfully.
Nov 23 10:00:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:00:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:00:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:00:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159228 "" "Go-http-client/1.1"
Nov 23 10:00:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:00:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20175 "" "Go-http-client/1.1"
Nov 23 10:00:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:42.034 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:42 np0005532585.localdomain kernel: device tap41e3cb72-6d left promiscuous mode
Nov 23 10:00:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:42Z|00237|binding|INFO|Releasing lport 41e3cb72-6db6-4670-834e-e198cab9488d from this chassis (sb_readonly=0)
Nov 23 10:00:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:42Z|00238|binding|INFO|Setting lport 41e3cb72-6db6-4670-834e-e198cab9488d down in Southbound
Nov 23 10:00:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:42.042 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=41e3cb72-6db6-4670-834e-e198cab9488d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:42.044 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 41e3cb72-6db6-4670-834e-e198cab9488d in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:00:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:42.046 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:00:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:42.047 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3ab9f0-1102-4e44-8df5-922ee22a89ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:42.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:43 np0005532585.localdomain podman[318018]: 2025-11-23 10:00:43.197001602 +0000 UTC m=+0.063832576 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:00:43 np0005532585.localdomain dnsmasq[317689]: exiting on receipt of SIGTERM
Nov 23 10:00:43 np0005532585.localdomain systemd[1]: libpod-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11.scope: Deactivated successfully.
Nov 23 10:00:43 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:43Z|00239|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:43.249 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:43 np0005532585.localdomain podman[318031]: 2025-11-23 10:00:43.262276364 +0000 UTC m=+0.046380591 container died 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:43 np0005532585.localdomain podman[318031]: 2025-11-23 10:00:43.300400617 +0000 UTC m=+0.084504814 container cleanup 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:43 np0005532585.localdomain systemd[1]: libpod-conmon-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11.scope: Deactivated successfully.
Nov 23 10:00:43 np0005532585.localdomain podman[318032]: 2025-11-23 10:00:43.334538809 +0000 UTC m=+0.118049577 container remove 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:00:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:43 np0005532585.localdomain ceph-mon[300199]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:43.592 263258 INFO neutron.agent.dhcp.agent [None req-f2c9f3b1-fd81-47d9-b1b6-70801fc4861a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:00:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:00:43 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:00:44 np0005532585.localdomain podman[318061]: 2025-11-23 10:00:44.02633656 +0000 UTC m=+0.079144919 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_id=edpm, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:00:44 np0005532585.localdomain podman[318060]: 2025-11-23 10:00:44.075731672 +0000 UTC m=+0.131373488 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller)
Nov 23 10:00:44 np0005532585.localdomain podman[318061]: 2025-11-23 10:00:44.092661114 +0000 UTC m=+0.145469403 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=)
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:00:44 np0005532585.localdomain podman[318060]: 2025-11-23 10:00:44.112650969 +0000 UTC m=+0.168292785 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:00:44 np0005532585.localdomain podman[318090]: 2025-11-23 10:00:44.183348547 +0000 UTC m=+0.121373550 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c2c0ddcbf3ea6dc0d7c80eec26c782e2f344a162226e3650f4c2b5cc4fa9ba34-merged.mount: Deactivated successfully.
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:00:44 np0005532585.localdomain podman[318090]: 2025-11-23 10:00:44.215219699 +0000 UTC m=+0.153244632 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:00:44 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:00:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:44.342 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:43Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9415c10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9415160>], id=07c04277-2dd6-4b10-b1b3-e46f9ab20c05, ip_allocation=immediate, mac_address=fa:16:3e:b4:4e:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:34Z, description=, dns_domain=, id=bcc66174-371f-4faf-83f1-5de56d4886ad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-343519683, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45780, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1360, status=ACTIVE, subnets=['213562cd-cd75-4e9d-9432-8141a3732f62'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:37Z, vlan_transparent=None, network_id=bcc66174-371f-4faf-83f1-5de56d4886ad, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1381, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:43Z on network bcc66174-371f-4faf-83f1-5de56d4886ad
Nov 23 10:00:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:44.543 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:44.545 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:44.548 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:00:44 np0005532585.localdomain podman[318139]: 2025-11-23 10:00:44.570272017 +0000 UTC m=+0.063937331 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:00:44 np0005532585.localdomain dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 1 addresses
Nov 23 10:00:44 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host
Nov 23 10:00:44 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts
Nov 23 10:00:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:44.868 263258 INFO neutron.agent.dhcp.agent [None req-d4a5137d-ea46-4eef-808e-261b19d9605e - - - - - -] DHCP configuration for ports {'07c04277-2dd6-4b10-b1b3-e46f9ab20c05'} is completed
Nov 23 10:00:45 np0005532585.localdomain ceph-mon[300199]: pgmap v189: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:46.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:46.421 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:47 np0005532585.localdomain ceph-mon[300199]: pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:48 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:48.924 263258 INFO neutron.agent.linux.ip_lib [None req-21080b15-717a-47ba-ab2a-5b553674fdd1 - - - - - -] Device tap01290de6-b9 cannot be used as it has no MAC address
Nov 23 10:00:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:48.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:48 np0005532585.localdomain kernel: device tap01290de6-b9 entered promiscuous mode
Nov 23 10:00:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:48.958 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892048.9601] manager: (tap01290de6-b9): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Nov 23 10:00:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:48Z|00240|binding|INFO|Claiming lport 01290de6-b98c-45d9-85a6-538e887d4c81 for this chassis.
Nov 23 10:00:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:48Z|00241|binding|INFO|01290de6-b98c-45d9-85a6-538e887d4c81: Claiming unknown
Nov 23 10:00:48 np0005532585.localdomain systemd-udevd[318169]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:00:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:48.987 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefd:15c6/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=01290de6-b98c-45d9-85a6-538e887d4c81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:48.989 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 01290de6-b98c-45d9-85a6-538e887d4c81 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:00:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:48.992 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:00:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:48.992 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:00:48 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:48.992 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2576aa57-14cf-4474-8e2f-981528ffe519]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:48Z|00242|binding|INFO|Setting lport 01290de6-b98c-45d9-85a6-538e887d4c81 ovn-installed in OVS
Nov 23 10:00:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:48Z|00243|binding|INFO|Setting lport 01290de6-b98c-45d9-85a6-538e887d4c81 up in Southbound
Nov 23 10:00:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:48.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:49 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:49 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:49 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:49 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:49 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:49 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap01290de6-b9: No such device
Nov 23 10:00:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:49.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:49.060 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:49 np0005532585.localdomain ceph-mon[300199]: pgmap v191: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:49.628 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:43Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8abd340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c1bca0>], id=07c04277-2dd6-4b10-b1b3-e46f9ab20c05, ip_allocation=immediate, mac_address=fa:16:3e:b4:4e:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:34Z, description=, dns_domain=, id=bcc66174-371f-4faf-83f1-5de56d4886ad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-343519683, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45780, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1360, status=ACTIVE, subnets=['213562cd-cd75-4e9d-9432-8141a3732f62'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:37Z, vlan_transparent=None, network_id=bcc66174-371f-4faf-83f1-5de56d4886ad, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1381, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:43Z on network bcc66174-371f-4faf-83f1-5de56d4886ad
Nov 23 10:00:49 np0005532585.localdomain podman[318254]: 
Nov 23 10:00:49 np0005532585.localdomain dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 1 addresses
Nov 23 10:00:49 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host
Nov 23 10:00:49 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts
Nov 23 10:00:49 np0005532585.localdomain podman[318268]: 2025-11-23 10:00:49.870755161 +0000 UTC m=+0.040159288 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:49 np0005532585.localdomain podman[318254]: 2025-11-23 10:00:49.916632574 +0000 UTC m=+0.131531343 container create 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:49 np0005532585.localdomain podman[318254]: 2025-11-23 10:00:49.821582986 +0000 UTC m=+0.036481805 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a.scope.
Nov 23 10:00:49 np0005532585.localdomain systemd[1]: tmp-crun.h2eQgA.mount: Deactivated successfully.
Nov 23 10:00:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb7fa617f1cf5bcb988fc404e487efa2ea75be483366f6c07ef0e60e4a6f49f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:49 np0005532585.localdomain podman[318254]: 2025-11-23 10:00:49.987766415 +0000 UTC m=+0.202665144 container init 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:00:49 np0005532585.localdomain podman[318254]: 2025-11-23 10:00:49.994003418 +0000 UTC m=+0.208902187 container start 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:49 np0005532585.localdomain dnsmasq[318298]: started, version 2.85 cachesize 150
Nov 23 10:00:49 np0005532585.localdomain dnsmasq[318298]: DNS service limited to local subnets
Nov 23 10:00:49 np0005532585.localdomain dnsmasq[318298]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:49 np0005532585.localdomain dnsmasq[318298]: warning: no upstream servers configured
Nov 23 10:00:49 np0005532585.localdomain dnsmasq[318298]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:50 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:50.194 263258 INFO neutron.agent.dhcp.agent [None req-2bc5c5c7-ab5b-44db-bb53-ffb7ce7a4219 - - - - - -] DHCP configuration for ports {'07c04277-2dd6-4b10-b1b3-e46f9ab20c05', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:50 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:50.311 2 INFO neutron.agent.securitygroups_rpc [None req-76f35429-371d-4a0a-a261-a7949dd94068 e59892284e454ae28c30542a06194f67 7d06d32932c14944b00061256a49a5ca - - default default] Security group member updated ['3d66d90b-639c-4111-b259-a5454103aaa3']
Nov 23 10:00:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:50.550 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:00:50 np0005532585.localdomain dnsmasq[318298]: exiting on receipt of SIGTERM
Nov 23 10:00:50 np0005532585.localdomain podman[318316]: 2025-11-23 10:00:50.714626697 +0000 UTC m=+0.059926298 container kill 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:50 np0005532585.localdomain systemd[1]: libpod-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a.scope: Deactivated successfully.
Nov 23 10:00:50 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:00:50Z|00244|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:00:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:50.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:50 np0005532585.localdomain podman[318329]: 2025-11-23 10:00:50.77350088 +0000 UTC m=+0.047168664 container died 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:00:50 np0005532585.localdomain podman[318329]: 2025-11-23 10:00:50.813506582 +0000 UTC m=+0.087174316 container cleanup 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:00:50 np0005532585.localdomain systemd[1]: libpod-conmon-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a.scope: Deactivated successfully.
Nov 23 10:00:50 np0005532585.localdomain podman[318331]: 2025-11-23 10:00:50.866378331 +0000 UTC m=+0.130634915 container remove 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:00:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-fb7fa617f1cf5bcb988fc404e487efa2ea75be483366f6c07ef0e60e4a6f49f6-merged.mount: Deactivated successfully.
Nov 23 10:00:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:51.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:51.422 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:51 np0005532585.localdomain ceph-mon[300199]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:52 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:52.353 2 INFO neutron.agent.securitygroups_rpc [None req-aaa2c11d-b613-412d-8872-49d855ed78d3 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:00:53 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:53.033 2 INFO neutron.agent.securitygroups_rpc [None req-6f916793-e69f-491a-861e-c8f5876d7582 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:00:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:53.090 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:00:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:53.092 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:00:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:53.095 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:00:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:53.095 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:00:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:00:53.097 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6bb57a-6faa-4340-9c3b-6b084e1b4860]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:00:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:53 np0005532585.localdomain ceph-mon[300199]: pgmap v193: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:53 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:53.691 2 INFO neutron.agent.securitygroups_rpc [None req-8312fbc9-ef14-45ec-8246-3e5b81a28890 e59892284e454ae28c30542a06194f67 7d06d32932c14944b00061256a49a5ca - - default default] Security group member updated ['3d66d90b-639c-4111-b259-a5454103aaa3']
Nov 23 10:00:54 np0005532585.localdomain podman[318409]: 
Nov 23 10:00:54 np0005532585.localdomain podman[318409]: 2025-11-23 10:00:54.061983483 +0000 UTC m=+0.090678145 container create e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: Started libpod-conmon-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d.scope.
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: tmp-crun.NB9xr6.mount: Deactivated successfully.
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:54 np0005532585.localdomain podman[318409]: 2025-11-23 10:00:54.021002451 +0000 UTC m=+0.049697113 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:54 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3db0c81f1e124d9d69f888333884ab63f725fc79d437184bc44f72fbf237162/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:54 np0005532585.localdomain podman[318409]: 2025-11-23 10:00:54.130147013 +0000 UTC m=+0.158841675 container init e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:00:54 np0005532585.localdomain podman[318409]: 2025-11-23 10:00:54.142883316 +0000 UTC m=+0.171577948 container start e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:00:54 np0005532585.localdomain dnsmasq[318438]: started, version 2.85 cachesize 150
Nov 23 10:00:54 np0005532585.localdomain dnsmasq[318438]: DNS service limited to local subnets
Nov 23 10:00:54 np0005532585.localdomain dnsmasq[318438]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:54 np0005532585.localdomain dnsmasq[318438]: warning: no upstream servers configured
Nov 23 10:00:54 np0005532585.localdomain dnsmasq-dhcp[318438]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:00:54 np0005532585.localdomain dnsmasq[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:54 np0005532585.localdomain dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:54 np0005532585.localdomain dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:54 np0005532585.localdomain podman[318425]: 2025-11-23 10:00:54.228441171 +0000 UTC m=+0.117267994 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:00:54 np0005532585.localdomain podman[318425]: 2025-11-23 10:00:54.245316581 +0000 UTC m=+0.134143424 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true)
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:00:54 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:54.295 263258 INFO neutron.agent.dhcp.agent [None req-60f7553b-97b6-4301-adf4-1115fb7be616 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:54 np0005532585.localdomain podman[318447]: 2025-11-23 10:00:54.338008836 +0000 UTC m=+0.068682506 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:00:54 np0005532585.localdomain podman[318447]: 2025-11-23 10:00:54.348607223 +0000 UTC m=+0.079280903 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:00:54 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:00:54 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:54.443 2 INFO neutron.agent.securitygroups_rpc [None req-0abb4626-8029-4616-a7f3-bc7ec334c676 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:54 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:54.497 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aebd90>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aebc40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aebca0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aebc70>], id=70e0899a-4b27-45ec-a4ef-84aceb7179a1, ip_allocation=immediate, mac_address=fa:16:3e:b6:d9:2a, name=tempest-NetworksTestDHCPv6-743822975, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['1cec0113-a0ae-49ad-b829-a2e2ac9244c4', '90eab0bc-4d60-4712-8f88-98aa394aa338'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:49Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1421, status=DOWN, tags=[], 
tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:54Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:00:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:54.525 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:54 np0005532585.localdomain dnsmasq[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:00:54 np0005532585.localdomain dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:54 np0005532585.localdomain podman[318486]: 2025-11-23 10:00:54.736957696 +0000 UTC m=+0.052250341 container kill e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:00:54 np0005532585.localdomain dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:55.207 263258 INFO neutron.agent.dhcp.agent [None req-85747f76-71b9-4c97-8364-49127a858e90 - - - - - -] DHCP configuration for ports {'70e0899a-4b27-45ec-a4ef-84aceb7179a1'} is completed
Nov 23 10:00:55 np0005532585.localdomain ceph-mon[300199]: pgmap v194: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:55 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:55.787 2 INFO neutron.agent.securitygroups_rpc [None req-f375f660-8a27-403d-b89f-e04d1f758be2 2cfd21f178604be289d8bb16b3b9c18f 0f8848490fb54a5cb41f1607121a115c - - default default] Security group member updated ['c9d46e70-8b37-41f1-b62d-e1679c8d4c9c']
Nov 23 10:00:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:56.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:00:56.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:00:56 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:00:56.616 2 INFO neutron.agent.securitygroups_rpc [None req-9e459c8e-445e-40f9-9db8-24291ed822a4 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:00:56 np0005532585.localdomain dnsmasq[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:56 np0005532585.localdomain dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:56 np0005532585.localdomain podman[318525]: 2025-11-23 10:00:56.867860819 +0000 UTC m=+0.058221404 container kill e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:00:56 np0005532585.localdomain dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:57 np0005532585.localdomain ceph-mon[300199]: pgmap v195: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:00:58 np0005532585.localdomain dnsmasq[318438]: exiting on receipt of SIGTERM
Nov 23 10:00:58 np0005532585.localdomain podman[318566]: 2025-11-23 10:00:58.409994846 +0000 UTC m=+0.050306721 container kill e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:58 np0005532585.localdomain systemd[1]: libpod-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d.scope: Deactivated successfully.
Nov 23 10:00:58 np0005532585.localdomain podman[318581]: 2025-11-23 10:00:58.479785216 +0000 UTC m=+0.053569421 container died e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:00:58 np0005532585.localdomain systemd[1]: tmp-crun.Nu0Zbk.mount: Deactivated successfully.
Nov 23 10:00:58 np0005532585.localdomain podman[318581]: 2025-11-23 10:00:58.518701075 +0000 UTC m=+0.092485250 container cleanup e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:00:58 np0005532585.localdomain systemd[1]: libpod-conmon-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d.scope: Deactivated successfully.
Nov 23 10:00:58 np0005532585.localdomain podman[318580]: 2025-11-23 10:00:58.544465288 +0000 UTC m=+0.116238172 container remove e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:00:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b3db0c81f1e124d9d69f888333884ab63f725fc79d437184bc44f72fbf237162-merged.mount: Deactivated successfully.
Nov 23 10:00:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:00:59 np0005532585.localdomain ceph-mon[300199]: pgmap v196: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:00:59 np0005532585.localdomain podman[318659]: 
Nov 23 10:00:59 np0005532585.localdomain podman[318659]: 2025-11-23 10:00:59.493560875 +0000 UTC m=+0.087031532 container create 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:59 np0005532585.localdomain systemd[1]: Started libpod-conmon-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3.scope.
Nov 23 10:00:59 np0005532585.localdomain systemd[1]: tmp-crun.608Xg4.mount: Deactivated successfully.
Nov 23 10:00:59 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:00:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65662f1a439f1f795895507cf085174a5b8f6eedbf8689b96d3d51fbe5a3798d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:00:59 np0005532585.localdomain podman[318659]: 2025-11-23 10:00:59.45675154 +0000 UTC m=+0.050222257 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:00:59 np0005532585.localdomain podman[318659]: 2025-11-23 10:00:59.563659214 +0000 UTC m=+0.157129871 container init 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:00:59 np0005532585.localdomain podman[318659]: 2025-11-23 10:00:59.572776545 +0000 UTC m=+0.166247202 container start 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:00:59 np0005532585.localdomain dnsmasq[318678]: started, version 2.85 cachesize 150
Nov 23 10:00:59 np0005532585.localdomain dnsmasq[318678]: DNS service limited to local subnets
Nov 23 10:00:59 np0005532585.localdomain dnsmasq[318678]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:00:59 np0005532585.localdomain dnsmasq[318678]: warning: no upstream servers configured
Nov 23 10:00:59 np0005532585.localdomain dnsmasq-dhcp[318678]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:00:59 np0005532585.localdomain dnsmasq[318678]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:00:59 np0005532585.localdomain dnsmasq-dhcp[318678]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:00:59 np0005532585.localdomain dnsmasq-dhcp[318678]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:00:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:00:59.828 263258 INFO neutron.agent.dhcp.agent [None req-a4eba1d5-a06b-4f60-8f7c-35388d336310 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:00:59 np0005532585.localdomain dnsmasq[318678]: exiting on receipt of SIGTERM
Nov 23 10:00:59 np0005532585.localdomain podman[318696]: 2025-11-23 10:00:59.921669933 +0000 UTC m=+0.042041966 container kill 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:00:59 np0005532585.localdomain systemd[1]: libpod-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3.scope: Deactivated successfully.
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:00:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:00:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:00:59 np0005532585.localdomain podman[318709]: 2025-11-23 10:00:59.995355303 +0000 UTC m=+0.060922898 container died 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:01:00 np0005532585.localdomain podman[318709]: 2025-11-23 10:01:00.029719752 +0000 UTC m=+0.095287367 container cleanup 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:00 np0005532585.localdomain systemd[1]: libpod-conmon-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3.scope: Deactivated successfully.
Nov 23 10:01:00 np0005532585.localdomain podman[318711]: 2025-11-23 10:01:00.07605326 +0000 UTC m=+0.135822676 container remove 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-65662f1a439f1f795895507cf085174a5b8f6eedbf8689b96d3d51fbe5a3798d-merged.mount: Deactivated successfully.
Nov 23 10:01:00 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3407167775' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3407167775' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:00 np0005532585.localdomain podman[318786]: 
Nov 23 10:01:00 np0005532585.localdomain podman[318786]: 2025-11-23 10:01:00.917788069 +0000 UTC m=+0.074264148 container create cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:00 np0005532585.localdomain systemd[1]: Started libpod-conmon-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3.scope.
Nov 23 10:01:00 np0005532585.localdomain podman[318786]: 2025-11-23 10:01:00.886944579 +0000 UTC m=+0.043420698 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:00 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:00 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bef31403b2f37e816ee2cb2ed542775de4312b960541de464db08bad6657575/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:00 np0005532585.localdomain podman[318786]: 2025-11-23 10:01:00.998925769 +0000 UTC m=+0.155401848 container init cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:01:01 np0005532585.localdomain podman[318786]: 2025-11-23 10:01:01.007227484 +0000 UTC m=+0.163703563 container start cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:01:01 np0005532585.localdomain dnsmasq[318805]: started, version 2.85 cachesize 150
Nov 23 10:01:01 np0005532585.localdomain dnsmasq[318805]: DNS service limited to local subnets
Nov 23 10:01:01 np0005532585.localdomain dnsmasq[318805]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:01 np0005532585.localdomain dnsmasq[318805]: warning: no upstream servers configured
Nov 23 10:01:01 np0005532585.localdomain dnsmasq-dhcp[318805]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:01:01 np0005532585.localdomain dnsmasq[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:01 np0005532585.localdomain dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:01 np0005532585.localdomain dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:01 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:01.237 263258 INFO neutron.agent.dhcp.agent [None req-df7d977e-a85c-48c3-9718-c4dc21c10d72 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:01.415 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:01.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:01 np0005532585.localdomain ceph-mon[300199]: pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:01:01 np0005532585.localdomain CROND[318807]: (root) CMD (run-parts /etc/cron.hourly)
Nov 23 10:01:01 np0005532585.localdomain run-parts[318810]: (/etc/cron.hourly) starting 0anacron
Nov 23 10:01:01 np0005532585.localdomain run-parts[318816]: (/etc/cron.hourly) finished 0anacron
Nov 23 10:01:01 np0005532585.localdomain CROND[318806]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 23 10:01:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:02.104 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:02.106 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:02.108 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:02.108 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:02.109 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e83def7a-d9c0-4ef0-9509-3adae6d28c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:02.525 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:02.877 2 INFO neutron.agent.securitygroups_rpc [None req-6be775ef-69f0-4d4b-8550-9fb2b9ad120e 2cfd21f178604be289d8bb16b3b9c18f 0f8848490fb54a5cb41f1607121a115c - - default default] Security group member updated ['c9d46e70-8b37-41f1-b62d-e1679c8d4c9c']
Nov 23 10:01:02 np0005532585.localdomain dnsmasq[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:02 np0005532585.localdomain dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:02 np0005532585.localdomain dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:02 np0005532585.localdomain podman[318832]: 2025-11-23 10:01:02.963216558 +0000 UTC m=+0.058194173 container kill cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:03Z|00245|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:03.106 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:03.294 263258 INFO neutron.agent.dhcp.agent [None req-bdd693fd-2955-49da-819e-4a190ec18cb9 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:03 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:03.414 2 INFO neutron.agent.securitygroups_rpc [None req-4c75c4e7-1ac8-40bf-8bb2-ec0e75e4208b a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:03 np0005532585.localdomain ceph-mon[300199]: pgmap v198: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:01:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:01:04 np0005532585.localdomain podman[318852]: 2025-11-23 10:01:04.031080534 +0000 UTC m=+0.085788633 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 23 10:01:04 np0005532585.localdomain podman[318852]: 2025-11-23 10:01:04.042357582 +0000 UTC m=+0.097065711 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 23 10:01:04 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:01:04 np0005532585.localdomain dnsmasq[318805]: exiting on receipt of SIGTERM
Nov 23 10:01:04 np0005532585.localdomain podman[318888]: 2025-11-23 10:01:04.317093945 +0000 UTC m=+0.067054746 container kill cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:04 np0005532585.localdomain systemd[1]: libpod-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3.scope: Deactivated successfully.
Nov 23 10:01:04 np0005532585.localdomain podman[318902]: 2025-11-23 10:01:04.383829771 +0000 UTC m=+0.055036396 container died cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:01:04 np0005532585.localdomain podman[318902]: 2025-11-23 10:01:04.420642005 +0000 UTC m=+0.091848630 container cleanup cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:04 np0005532585.localdomain systemd[1]: libpod-conmon-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3.scope: Deactivated successfully.
Nov 23 10:01:04 np0005532585.localdomain podman[318909]: 2025-11-23 10:01:04.461722331 +0000 UTC m=+0.121584266 container remove cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:01:04 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:01:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6bef31403b2f37e816ee2cb2ed542775de4312b960541de464db08bad6657575-merged.mount: Deactivated successfully.
Nov 23 10:01:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:05 np0005532585.localdomain podman[318933]: 2025-11-23 10:01:05.036991141 +0000 UTC m=+0.091770147 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:01:05 np0005532585.localdomain podman[318933]: 2025-11-23 10:01:05.045167753 +0000 UTC m=+0.099946779 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:01:05 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:01:05 np0005532585.localdomain ceph-mon[300199]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:01:05 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:05.717 263258 INFO neutron.agent.linux.ip_lib [None req-5d853321-94ca-4949-b3fd-1a7ae0b86b56 - - - - - -] Device tap3016fb40-93 cannot be used as it has no MAC address
Nov 23 10:01:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:05.740 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:05 np0005532585.localdomain kernel: device tap3016fb40-93 entered promiscuous mode
Nov 23 10:01:05 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892065.7478] manager: (tap3016fb40-93): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Nov 23 10:01:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:05.748 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:05Z|00246|binding|INFO|Claiming lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 for this chassis.
Nov 23 10:01:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:05Z|00247|binding|INFO|3016fb40-93ab-4df3-956c-74e722dc2fa2: Claiming unknown
Nov 23 10:01:05 np0005532585.localdomain systemd-udevd[319006]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:01:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:05.760 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de25a77c-b919-41f6-9a1b-fd3e354e84bf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=3016fb40-93ab-4df3-956c-74e722dc2fa2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:05.761 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 3016fb40-93ab-4df3-956c-74e722dc2fa2 in datapath d0e9752d-2178-4eb0-b091-dd4d434021e5 bound to our chassis
Nov 23 10:01:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:05.762 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d0e9752d-2178-4eb0-b091-dd4d434021e5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:01:05 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:05.762 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[50cc67d6-e375-41fd-bd90-5ff995fc45a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:05Z|00248|binding|INFO|Setting lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 ovn-installed in OVS
Nov 23 10:01:05 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:05Z|00249|binding|INFO|Setting lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 up in Southbound
Nov 23 10:01:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:05.773 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:05.808 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:05.833 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:05 np0005532585.localdomain podman[319027]: 
Nov 23 10:01:05 np0005532585.localdomain podman[319027]: 2025-11-23 10:01:05.930093364 +0000 UTC m=+0.071207864 container create 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:01:05 np0005532585.localdomain systemd[1]: Started libpod-conmon-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a.scope.
Nov 23 10:01:05 np0005532585.localdomain podman[319027]: 2025-11-23 10:01:05.888337218 +0000 UTC m=+0.029451728 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:05 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:05 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/899e9acb3d362d31418b6273b6c3450aeed57ce3d96336857afd4db90a4a068a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:06 np0005532585.localdomain podman[319027]: 2025-11-23 10:01:06.007739536 +0000 UTC m=+0.148854036 container init 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:01:06 np0005532585.localdomain podman[319027]: 2025-11-23 10:01:06.017834598 +0000 UTC m=+0.158949098 container start 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319052]: started, version 2.85 cachesize 150
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319052]: DNS service limited to local subnets
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319052]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319052]: warning: no upstream servers configured
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319052]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319052]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:06.337 263258 INFO neutron.agent.dhcp.agent [None req-05d11236-511b-490e-bf50-7df7d5b12e40 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:06.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:06.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:06 np0005532585.localdomain podman[319090]: 
Nov 23 10:01:06 np0005532585.localdomain podman[319090]: 2025-11-23 10:01:06.712202497 +0000 UTC m=+0.088660091 container create d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:06 np0005532585.localdomain systemd[1]: Started libpod-conmon-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186.scope.
Nov 23 10:01:06 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:06 np0005532585.localdomain podman[319090]: 2025-11-23 10:01:06.669837522 +0000 UTC m=+0.046295166 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:06 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b893594a3b7f21ead5de7308fbf557090d81368903c395e5edbb120a0248d337/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:06 np0005532585.localdomain podman[319090]: 2025-11-23 10:01:06.784548836 +0000 UTC m=+0.161006430 container init d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:01:06 np0005532585.localdomain podman[319090]: 2025-11-23 10:01:06.790882872 +0000 UTC m=+0.167340466 container start d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319108]: started, version 2.85 cachesize 150
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319108]: DNS service limited to local subnets
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319108]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319108]: warning: no upstream servers configured
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319108]: DHCP, static leases only on 10.103.0.0, lease time 1d
Nov 23 10:01:06 np0005532585.localdomain dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 0 addresses
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host
Nov 23 10:01:06 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts
Nov 23 10:01:06 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e113 e113: 6 total, 6 up, 6 in
Nov 23 10:01:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:06.987 263258 INFO neutron.agent.dhcp.agent [None req-f4c996c7-3eca-4f8d-bff8-201a20995022 - - - - - -] DHCP configuration for ports {'89de9cee-45b5-4855-ac12-45fb015ff1d7'} is completed
Nov 23 10:01:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:07.219 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.3 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:07.220 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:07.222 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:07.222 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:07.223 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcb4714-832d-4c3a-ad45-d41eec655a52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:07 np0005532585.localdomain ceph-mon[300199]: osdmap e113: 6 total, 6 up, 6 in
Nov 23 10:01:07 np0005532585.localdomain ceph-mon[300199]: pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:01:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e114 e114: 6 total, 6 up, 6 in
Nov 23 10:01:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:08.322 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:07Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bec6a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bec7f0>], id=b97eb3d5-f95d-413c-ad24-47f79a3d2882, ip_allocation=immediate, mac_address=fa:16:3e:27:e5:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:02Z, description=, dns_domain=, id=d0e9752d-2178-4eb0-b091-dd4d434021e5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-635928538, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1457, status=ACTIVE, subnets=['e0fb5c14-e60a-431f-a11a-0be1e976c275'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:04Z, vlan_transparent=None, network_id=d0e9752d-2178-4eb0-b091-dd4d434021e5, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1473, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:08Z on network d0e9752d-2178-4eb0-b091-dd4d434021e5
Nov 23 10:01:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:08.498 2 INFO neutron.agent.securitygroups_rpc [None req-d30670ed-c29e-4282-92c2-f353a53316ea 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:08 np0005532585.localdomain dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 1 addresses
Nov 23 10:01:08 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host
Nov 23 10:01:08 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts
Nov 23 10:01:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:08.545 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:07Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bc2640>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bc2550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c35ca0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bc2b50>], id=8e4502b6-105d-4e98-9e94-b93c8b85eb68, ip_allocation=immediate, mac_address=fa:16:3e:b1:eb:ac, name=tempest-NetworksTestDHCPv6-1058152510, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['e8e28b0a-fce4-45c6-b501-a05bd7ea67b9', 'fa0541e1-f00b-4a69-b039-1c3d2c2010ab'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:03Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1472, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:08Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:01:08 np0005532585.localdomain podman[319127]: 2025-11-23 10:01:08.545385541 +0000 UTC m=+0.068716999 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:01:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:08.829 263258 INFO neutron.agent.dhcp.agent [None req-dbbdeeb0-5ec2-4c1b-86c5-563917620282 - - - - - -] DHCP configuration for ports {'b97eb3d5-f95d-413c-ad24-47f79a3d2882'} is completed
Nov 23 10:01:08 np0005532585.localdomain ceph-mon[300199]: osdmap e114: 6 total, 6 up, 6 in
Nov 23 10:01:08 np0005532585.localdomain podman[319165]: 2025-11-23 10:01:08.905473703 +0000 UTC m=+0.060906897 container kill 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:01:08 np0005532585.localdomain dnsmasq[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:01:08 np0005532585.localdomain dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:08 np0005532585.localdomain dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:09.192 263258 INFO neutron.agent.dhcp.agent [None req-7a469220-c6c9-4613-bd8f-baccb0831258 - - - - - -] DHCP configuration for ports {'8e4502b6-105d-4e98-9e94-b93c8b85eb68'} is completed
Nov 23 10:01:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:09.197 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:01:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:09.300 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:01:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:09.301 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:01:09 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:09.415 2 INFO neutron.agent.securitygroups_rpc [None req-ee598349-ae04-45e4-9403-8b439fe516e0 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:09 np0005532585.localdomain dnsmasq[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:09 np0005532585.localdomain podman[319204]: 2025-11-23 10:01:09.666005512 +0000 UTC m=+0.062438574 container kill 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:09 np0005532585.localdomain dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:09 np0005532585.localdomain dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2910603634' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2910603634' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:09 np0005532585.localdomain ceph-mon[300199]: pgmap v203: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 767 B/s wr, 1 op/s
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.809 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.834 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.835 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7756857-11f3-43ee-8351-eff5c460aa7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.810414', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568c3494-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '0100ba19e0b8bac0bd93a13dd9c2b09cc7943e401296e6e23697a0a6386ef330'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.810414', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568c448e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '4976d2a99748263577d4c1ec9b799520629e699cda78994ed442b8c2c37d2b12'}]}, 'timestamp': '2025-11-23 10:01:10.836097', '_unique_id': '66d94295cec14f45845baf327fd9ee4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.838 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a098503-56ca-4ce3-9075-e1c347c995a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.838590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568cb2ca-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'ad286d0e2d90e07ec3d86b8f7c3b35c92282a0176cf6169a409adabcb2eb419d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.838590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568cbfc2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'b29669fc20955e9e0f315081a027829db3f2154dc407f3f7d045218107c44bea'}]}, 'timestamp': '2025-11-23 10:01:10.839227', '_unique_id': '876eb37fb1e9435f9e2e246a119605b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.840 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.840 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.840 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cc8ac29-297a-4f20-8ab2-254900ab75c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.840650', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568d02e8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'f3753301183d396104c3fe137c6fd0fd6820dda7d2378268c96d0ce0caf67b50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.840650', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568d0ea0-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'df937c906553f524a5d57fc63e696ccc043fc2528df1f9c8d65a315615561dff'}]}, 'timestamp': '2025-11-23 10:01:10.841241', '_unique_id': '9f00ca414a9f457496abb78ece338ee3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a9c2918-3749-4314-abd5-a39ca618d12a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.842653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568d5144-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'cad3dd91056d9c555f2589c920cc10399648382a0333ee44682532a9578b1c91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.842653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568d5b80-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '6393aa92b852f4efcad76ef4bc2067db5c7685c5b2a4912b1fa693eb8d24b8dc'}]}, 'timestamp': '2025-11-23 10:01:10.843211', '_unique_id': '3c0317c505b64e75891aa7ebf8eb93e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:10.845 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:07Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aeb160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aeb700>], id=b97eb3d5-f95d-413c-ad24-47f79a3d2882, ip_allocation=immediate, mac_address=fa:16:3e:27:e5:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:02Z, description=, dns_domain=, id=d0e9752d-2178-4eb0-b091-dd4d434021e5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-635928538, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1457, status=ACTIVE, subnets=['e0fb5c14-e60a-431f-a11a-0be1e976c275'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:04Z, vlan_transparent=None, network_id=d0e9752d-2178-4eb0-b091-dd4d434021e5, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1473, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:08Z on network d0e9752d-2178-4eb0-b091-dd4d434021e5
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8269b53-69ce-463c-929a-e1f09097f9a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.844584', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '568e4112-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '8b3f1c3e9f6d55f62bdf973f172a4331f76484aa8c557b5be3b6d3020cf5f1f2'}]}, 'timestamp': '2025-11-23 10:01:10.849133', '_unique_id': '17e77d7785c14ea2954011b740a306ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 16590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1694dab-39c3-40ef-b3bd-0ae80cb94051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16590000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:01:10.850726', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '56909d18-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.041592407, 'message_signature': '3b704ffdb342c0d9a37c6765a03c6c0c8bbf4b047a8020f7ab2f284301333fb0'}]}, 'timestamp': '2025-11-23 10:01:10.864615', '_unique_id': '4594194ee12f40d68e2894c3f5b40132'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.866 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ef142de-ebb5-40d6-8110-6ee29d459d75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.866523', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5690f632-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '49b6127de0c7ea6d878c2f6d7ecd1d8db801e77ab7b9e68d843fec04399dc017'}]}, 'timestamp': '2025-11-23 10:01:10.866907', '_unique_id': 'b3eb20f9e18544af83356c680a18396b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5536d7a4-3a9d-48dc-8f59-0c7518fe4758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.868600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5691468c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'ff6a9a8d92f80bdc634cd5cba86804b920f9462282f38d5b736287b97e250127'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.868600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56915190-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '6e952e36bd9495ac036e11066211521c742123c523c3c49af6a4f99a32a13d78'}]}, 'timestamp': '2025-11-23 10:01:10.869164', '_unique_id': '65f19c359f9941769ca553e0ff9ecb35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14a49a84-91e7-48e2-80e6-2964eb5cb7bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.870607', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56931fb6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '7d44dfb1d71927fc6e58a813a553500180c414b7df4c87a5aea6fc8489885c4b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.870607', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56932e20-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': 'ea73a86f4305fb1cf2c850f65c8f72681651da6156771524843d2739b0657612'}]}, 'timestamp': '2025-11-23 10:01:10.881384', '_unique_id': 'c01dc75df53f413c92050e9279561b30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f72bd289-2bca-4bfc-9ca8-e8765545ee45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.883300', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56938546-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '7755703b120c3a6542d5f2699e287b90fc521f22a628b61d510205e10635901c'}]}, 'timestamp': '2025-11-23 10:01:10.883651', '_unique_id': '6101097430774cd99ad5dc48e5b57acf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f62a3db9-b710-48a6-a6ca-161a1c578b54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.885228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5693d08c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '87a0490763d3b5c98a8331fe7e81ffb36dee439850287f91f004b0f003b82937'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.885228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5693dbcc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'e718af9a10ea165d6660524c7fae8e510b747a8f25e804aa02a41a308a6158aa'}]}, 'timestamp': '2025-11-23 10:01:10.885821', '_unique_id': '7989bfab31954f4996e7df50eeb31148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.887 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3ff272e-3dd0-4a85-b36d-2eab3b7176dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.887451', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56942730-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '2c07eb7c36754ad64283f70c3641a823b6565a793f48d34a5b00454e7de182d7'}]}, 'timestamp': '2025-11-23 10:01:10.887764', '_unique_id': '22d37ce43957405484a7338f1c22732b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47c1c677-9cb2-4511-a423-de5663925535', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.889235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56946cae-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '78e9e872415e267599f6704945d537c1e4175422b55544c71e4df435188d3370'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.889235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5694778a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '02df1829ff3ab380a03b537ea4c67a9db3f20c8c7d010cf0f86c88138cccf321'}]}, 'timestamp': '2025-11-23 10:01:10.889799', '_unique_id': '461aa7e7786b424995007153ae198e3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8c67f04-51d6-4f06-9080-f81ea9e40db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.891347', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5694bf24-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': 'a20ee216bfb3d1a798172e8de7ce1a965c00e9120d9ee58101ed1813c932b929'}]}, 'timestamp': '2025-11-23 10:01:10.891653', '_unique_id': 'b25229176a1846198d67971db53f79af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79ded583-9029-449d-acae-9dd59ba61969', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.893416', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56950fc4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': 'e940b4aae7d3b7653565c26d6770127fb4e073efb4b201d218d18d9d7bfafb95'}]}, 'timestamp': '2025-11-23 10:01:10.893717', '_unique_id': 'cd21ff49a035494ca77c1ab5cf3cffe5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90b7663d-6397-4f47-b425-94f57b0325fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:01:10.895206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '569555f6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.041592407, 'message_signature': '2c7d37ca84ec6d3ac96dc8caf9518e5c6219d6f566877a29ad8234eadff8020d'}]}, 'timestamp': '2025-11-23 10:01:10.895506', '_unique_id': '26101b24841a4a72a83bfee7be604bcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d40326e-e68d-4ed1-b22e-3ffe062a5531', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.896973', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56959b06-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '901acacef84c13b274bae5e75e2e58f65db5dd523563b25c118531ffb1ec2fde'}]}, 'timestamp': '2025-11-23 10:01:10.897283', '_unique_id': 'd7435fe0962d425a9bcdf249c24a9e51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.898 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7d1c39c-d27d-4b2b-ac49-bd6f5d3d0c3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.898688', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5695ddfa-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '664d7ef3ad580c211d0fd5c36396b74aece2eb8e8b96efe48627347ca3016c7e'}]}, 'timestamp': '2025-11-23 10:01:10.899064', '_unique_id': '174a0203678446e1b186070a46dc7fb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73ecd5d3-7d74-4a4b-ab33-0b38cf9109e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.900692', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56962c42-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '8a4cc52d3e4f9b184b2447417d70a1ca50f5370dfbd332ceb27ed1d128b2cb26'}]}, 'timestamp': '2025-11-23 10:01:10.901030', '_unique_id': 'f1b9d010a57d4f90bc02417f06cd10ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd402703-5ca7-4c95-94a1-8cac8406cbfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.902586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '569675da-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '2cbf3db59250c046d40651b33910ebb6575335875209c22d9b1e020c29150d9b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.902586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '569681ce-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '8c91e4c8f360893ee8907c27f3d8ab9787f76313a900f99234bc1e9adf70bbf9'}]}, 'timestamp': '2025-11-23 10:01:10.903170', '_unique_id': 'efedb6418f684400bc4753944eb56f31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain systemd-journald[48157]: Data hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Nov 23 10:01:10 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 10:01:10 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36e55255-07d8-48fb-99f2-7104d8978538', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.904693', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5696c83c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': 'a15212f16063793f4e80404a4b8a9000b34ecad8be163a159aaeca788a19b437'}]}, 'timestamp': '2025-11-23 10:01:10.905021', '_unique_id': '875957f53e8c4e4284b4d8bf1ff61955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:01:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:01:11 np0005532585.localdomain dnsmasq[319052]: exiting on receipt of SIGTERM
Nov 23 10:01:11 np0005532585.localdomain systemd[1]: tmp-crun.LEFP9Y.mount: Deactivated successfully.
Nov 23 10:01:11 np0005532585.localdomain systemd[1]: libpod-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a.scope: Deactivated successfully.
Nov 23 10:01:11 np0005532585.localdomain podman[319255]: 2025-11-23 10:01:11.057840058 +0000 UTC m=+0.069560364 container kill 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:11 np0005532585.localdomain podman[319262]: 2025-11-23 10:01:11.077952748 +0000 UTC m=+0.063320832 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:11 np0005532585.localdomain dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 1 addresses
Nov 23 10:01:11 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host
Nov 23 10:01:11 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts
Nov 23 10:01:11 np0005532585.localdomain podman[319281]: 2025-11-23 10:01:11.138276425 +0000 UTC m=+0.063554868 container died 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:11 np0005532585.localdomain podman[319281]: 2025-11-23 10:01:11.170175039 +0000 UTC m=+0.095453402 container cleanup 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:01:11 np0005532585.localdomain systemd[1]: libpod-conmon-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a.scope: Deactivated successfully.
Nov 23 10:01:11 np0005532585.localdomain podman[319283]: 2025-11-23 10:01:11.215139094 +0000 UTC m=+0.125982462 container remove 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:01:11 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:11.284 2 INFO neutron.agent.securitygroups_rpc [None req-4b64e528-1440-49a5-b870-0a0f4d60c275 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:11 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:01:11 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/182625698' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:11 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:01:11 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/182625698' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:11.353 263258 INFO neutron.agent.dhcp.agent [None req-842206c5-f2eb-4756-9d89-5375128d7158 - - - - - -] DHCP configuration for ports {'b97eb3d5-f95d-413c-ad24-47f79a3d2882'} is completed
Nov 23 10:01:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/182625698' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/182625698' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:11.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:11.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:01:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:01:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:01:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159333 "" "Go-http-client/1.1"
Nov 23 10:01:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:01:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20194 "" "Go-http-client/1.1"
Nov 23 10:01:12 np0005532585.localdomain podman[319369]: 
Nov 23 10:01:12 np0005532585.localdomain podman[319369]: 2025-11-23 10:01:12.01550805 +0000 UTC m=+0.144161553 container create 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837.scope.
Nov 23 10:01:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-899e9acb3d362d31418b6273b6c3450aeed57ce3d96336857afd4db90a4a068a-merged.mount: Deactivated successfully.
Nov 23 10:01:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:12 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48c328cc9cc6bdb79002724ef74b00a7bbe9050aa89c002fb9e7e8bd3aac1e06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:12 np0005532585.localdomain podman[319369]: 2025-11-23 10:01:11.977036474 +0000 UTC m=+0.105690047 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:12 np0005532585.localdomain podman[319369]: 2025-11-23 10:01:12.080054258 +0000 UTC m=+0.208707761 container init 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:01:12 np0005532585.localdomain podman[319369]: 2025-11-23 10:01:12.090111127 +0000 UTC m=+0.218764620 container start 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:12 np0005532585.localdomain dnsmasq[319387]: started, version 2.85 cachesize 150
Nov 23 10:01:12 np0005532585.localdomain dnsmasq[319387]: DNS service limited to local subnets
Nov 23 10:01:12 np0005532585.localdomain dnsmasq[319387]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:12 np0005532585.localdomain dnsmasq[319387]: warning: no upstream servers configured
Nov 23 10:01:12 np0005532585.localdomain dnsmasq-dhcp[319387]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:12 np0005532585.localdomain dnsmasq[319387]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:12 np0005532585.localdomain dnsmasq-dhcp[319387]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:12 np0005532585.localdomain dnsmasq-dhcp[319387]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:12.329 263258 INFO neutron.agent.dhcp.agent [None req-c0715c88-85f7-4f66-b68c-48c4d6f20ef2 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:12 np0005532585.localdomain ceph-mon[300199]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 767 B/s wr, 1 op/s
Nov 23 10:01:12 np0005532585.localdomain dnsmasq[319387]: exiting on receipt of SIGTERM
Nov 23 10:01:12 np0005532585.localdomain podman[319403]: 2025-11-23 10:01:12.435785367 +0000 UTC m=+0.061992352 container kill 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:12 np0005532585.localdomain systemd[1]: libpod-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837.scope: Deactivated successfully.
Nov 23 10:01:12 np0005532585.localdomain podman[319417]: 2025-11-23 10:01:12.506491914 +0000 UTC m=+0.055371437 container died 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 10:01:12 np0005532585.localdomain podman[319417]: 2025-11-23 10:01:12.543765702 +0000 UTC m=+0.092645195 container cleanup 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:01:12 np0005532585.localdomain systemd[1]: libpod-conmon-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837.scope: Deactivated successfully.
Nov 23 10:01:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 e115: 6 total, 6 up, 6 in
Nov 23 10:01:12 np0005532585.localdomain podman[319418]: 2025-11-23 10:01:12.580134483 +0000 UTC m=+0.121914086 container remove 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:12 np0005532585.localdomain kernel: device tap01290de6-b9 left promiscuous mode
Nov 23 10:01:12 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:12Z|00250|binding|INFO|Releasing lport 01290de6-b98c-45d9-85a6-538e887d4c81 from this chassis (sb_readonly=0)
Nov 23 10:01:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:12.643 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:12 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:12Z|00251|binding|INFO|Setting lport 01290de6-b98c-45d9-85a6-538e887d4c81 down in Southbound
Nov 23 10:01:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:12.656 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fefd:15c6/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=01290de6-b98c-45d9-85a6-538e887d4c81) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:12.658 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 01290de6-b98c-45d9-85a6-538e887d4c81 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:01:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:12.661 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:12 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:12.662 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e877071c-3e77-41f6-8bc9-1e30b3e2a1bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:12.665 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:13.026 263258 INFO neutron.agent.dhcp.agent [None req-0fe671d9-6c7a-4b47-96ce-ebf75861e63e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-48c328cc9cc6bdb79002724ef74b00a7bbe9050aa89c002fb9e7e8bd3aac1e06-merged.mount: Deactivated successfully.
Nov 23 10:01:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:13 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:01:13 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:13.106 2 INFO neutron.agent.securitygroups_rpc [None req-a6cbb5e4-51d9-4058-bf96-80e547b16a25 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:13.210 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:13 np0005532585.localdomain ceph-mon[300199]: osdmap e115: 6 total, 6 up, 6 in
Nov 23 10:01:13 np0005532585.localdomain ceph-mon[300199]: pgmap v206: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 467 B/s rd, 934 B/s wr, 1 op/s
Nov 23 10:01:13 np0005532585.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 10:01:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:13.823 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.3 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:13.826 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:13.829 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:13.830 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[144cc3ab-55b7-4daf-81cb-23f53dad8846]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.473 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.474 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.474 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:01:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:14.475 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:01:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:01:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:01:14 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:01:14 np0005532585.localdomain podman[319451]: 2025-11-23 10:01:14.952399551 +0000 UTC m=+0.066740937 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 10:01:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:14.955 263258 INFO neutron.agent.linux.ip_lib [None req-0bb28557-ccbc-4c5e-be5a-312a54185d11 - - - - - -] Device tap67c1b2ca-3e cannot be used as it has no MAC address
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.022 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain kernel: device tap67c1b2ca-3e entered promiscuous mode
Nov 23 10:01:15 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892075.0280] manager: (tap67c1b2ca-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Nov 23 10:01:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:15Z|00252|binding|INFO|Claiming lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a for this chassis.
Nov 23 10:01:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:15Z|00253|binding|INFO|67c1b2ca-3e93-4592-92bd-bb626f12e09a: Claiming unknown
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.027 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:15Z|00254|binding|INFO|Setting lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a ovn-installed in OVS
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.032 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain systemd-udevd[319494]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.035 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:15Z|00255|binding|INFO|Setting lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a up in Southbound
Nov 23 10:01:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:15.042 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=67c1b2ca-3e93-4592-92bd-bb626f12e09a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:15.044 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 67c1b2ca-3e93-4592-92bd-bb626f12e09a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:01:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:15.045 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:15.045 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:15.046 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[250861cb-1f84-403c-9966-063aa7b8d9ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain podman[319453]: 2025-11-23 10:01:15.058924393 +0000 UTC m=+0.169091170 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device
Nov 23 10:01:15 np0005532585.localdomain podman[319451]: 2025-11-23 10:01:15.075442802 +0000 UTC m=+0.189784198 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.083 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain podman[319453]: 2025-11-23 10:01:15.087796362 +0000 UTC m=+0.197963119 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:01:15 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:01:15 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:01:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:15.106 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:15 np0005532585.localdomain systemd[1]: tmp-crun.KWJPmW.mount: Deactivated successfully.
Nov 23 10:01:15 np0005532585.localdomain podman[319450]: 2025-11-23 10:01:15.180706404 +0000 UTC m=+0.294535044 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 23 10:01:15 np0005532585.localdomain podman[319450]: 2025-11-23 10:01:15.213288128 +0000 UTC m=+0.327116728 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 10:01:15 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:01:15 np0005532585.localdomain ceph-mon[300199]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 1.1 KiB/s wr, 21 op/s
Nov 23 10:01:15 np0005532585.localdomain podman[319588]: 
Nov 23 10:01:15 np0005532585.localdomain podman[319588]: 2025-11-23 10:01:15.927064926 +0000 UTC m=+0.089579200 container create cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:01:15 np0005532585.localdomain systemd[1]: Started libpod-conmon-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387.scope.
Nov 23 10:01:15 np0005532585.localdomain podman[319588]: 2025-11-23 10:01:15.886036242 +0000 UTC m=+0.048550516 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:15 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:15 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6504fdec3be8c610cf72edfd595f3b7128e3fb811123dbbe86b75a09399d63a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:16 np0005532585.localdomain podman[319588]: 2025-11-23 10:01:16.005433271 +0000 UTC m=+0.167947535 container init cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:01:16 np0005532585.localdomain podman[319588]: 2025-11-23 10:01:16.015103678 +0000 UTC m=+0.177617942 container start cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:01:16 np0005532585.localdomain dnsmasq[319607]: started, version 2.85 cachesize 150
Nov 23 10:01:16 np0005532585.localdomain dnsmasq[319607]: DNS service limited to local subnets
Nov 23 10:01:16 np0005532585.localdomain dnsmasq[319607]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:16 np0005532585.localdomain dnsmasq[319607]: warning: no upstream servers configured
Nov 23 10:01:16 np0005532585.localdomain dnsmasq-dhcp[319607]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:16 np0005532585.localdomain dnsmasq[319607]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:16 np0005532585.localdomain dnsmasq-dhcp[319607]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:16 np0005532585.localdomain dnsmasq-dhcp[319607]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:16.226 263258 INFO neutron.agent.dhcp.agent [None req-b633e6aa-3662-43dc-b39e-ecdb0bbf128a - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:16.344 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:16.345 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:16.349 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:16.349 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:16.350 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[373f7654-0d98-41dd-a74b-fcb96c865d9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.395 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.419 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.419 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.421 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.421 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.421 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:01:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:16.455 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:16 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:16.647 2 INFO neutron.agent.securitygroups_rpc [None req-49a4dc10-ae4e-41b1-8135-2f2053e37dc6 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:16 np0005532585.localdomain dnsmasq[319607]: exiting on receipt of SIGTERM
Nov 23 10:01:16 np0005532585.localdomain podman[319625]: 2025-11-23 10:01:16.685796319 +0000 UTC m=+0.058682538 container kill cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:01:16 np0005532585.localdomain systemd[1]: libpod-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387.scope: Deactivated successfully.
Nov 23 10:01:16 np0005532585.localdomain podman[319640]: 2025-11-23 10:01:16.75592007 +0000 UTC m=+0.060733893 container died cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:16 np0005532585.localdomain podman[319640]: 2025-11-23 10:01:16.791177715 +0000 UTC m=+0.095991468 container cleanup cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:16 np0005532585.localdomain systemd[1]: libpod-conmon-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387.scope: Deactivated successfully.
Nov 23 10:01:16 np0005532585.localdomain podman[319647]: 2025-11-23 10:01:16.841734312 +0000 UTC m=+0.130099918 container remove cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:01:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d6504fdec3be8c610cf72edfd595f3b7128e3fb811123dbbe86b75a09399d63a-merged.mount: Deactivated successfully.
Nov 23 10:01:16 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:17 np0005532585.localdomain ceph-mon[300199]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 965 B/s wr, 18 op/s
Nov 23 10:01:17 np0005532585.localdomain dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 0 addresses
Nov 23 10:01:17 np0005532585.localdomain podman[319712]: 2025-11-23 10:01:17.567311474 +0000 UTC m=+0.061269668 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:01:17 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host
Nov 23 10:01:17 np0005532585.localdomain dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts
Nov 23 10:01:17 np0005532585.localdomain podman[319752]: 
Nov 23 10:01:17 np0005532585.localdomain podman[319752]: 2025-11-23 10:01:17.774011672 +0000 UTC m=+0.093967975 container create 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:01:17 np0005532585.localdomain systemd[1]: Started libpod-conmon-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18.scope.
Nov 23 10:01:17 np0005532585.localdomain podman[319752]: 2025-11-23 10:01:17.729717867 +0000 UTC m=+0.049674230 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:17 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:17 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b617da3d100e467919636d843ab26dcfbf14a001e0fc52c63ea40186eb9f6dfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:17 np0005532585.localdomain podman[319752]: 2025-11-23 10:01:17.843235025 +0000 UTC m=+0.163191328 container init 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:17 np0005532585.localdomain podman[319752]: 2025-11-23 10:01:17.852229002 +0000 UTC m=+0.172185315 container start 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:17 np0005532585.localdomain dnsmasq[319773]: started, version 2.85 cachesize 150
Nov 23 10:01:17 np0005532585.localdomain dnsmasq[319773]: DNS service limited to local subnets
Nov 23 10:01:17 np0005532585.localdomain dnsmasq[319773]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:17 np0005532585.localdomain dnsmasq[319773]: warning: no upstream servers configured
Nov 23 10:01:17 np0005532585.localdomain dnsmasq-dhcp[319773]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:17 np0005532585.localdomain dnsmasq[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:17 np0005532585.localdomain dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:17 np0005532585.localdomain dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:17 np0005532585.localdomain systemd[1]: tmp-crun.EQ8Snn.mount: Deactivated successfully.
Nov 23 10:01:18 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:18.011 263258 INFO neutron.agent.dhcp.agent [None req-e06b79e9-b2de-4d34-b138-d1fabb92b56f - - - - - -] DHCP configuration for ports {'67c1b2ca-3e93-4592-92bd-bb626f12e09a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:18.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:18.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:01:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:18.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:01:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:18.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:01:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:18.234 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:01:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:18.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:01:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:18 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:18.573 2 INFO neutron.agent.securitygroups_rpc [None req-599670a7-0640-44a2-ad54-406dd4624d40 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:19 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:19.159 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:17Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd2b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdddf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cddd90>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd100>], id=dbfc41c4-f19b-4c9e-a49b-4b0a906e0b61, ip_allocation=immediate, mac_address=fa:16:3e:ee:36:51, name=tempest-NetworksTestDHCPv6-557607304, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['31b40284-1383-49a1-83f1-1def70e46b7c', '689a83b7-98fd-403b-84c3-bcdcd384cd1f'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:14Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1526, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:18Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:01:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1119170908' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/274354533' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/525196349' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:19 np0005532585.localdomain kernel: device tap3016fb40-93 left promiscuous mode
Nov 23 10:01:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:19Z|00256|binding|INFO|Releasing lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 from this chassis (sb_readonly=0)
Nov 23 10:01:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:19Z|00257|binding|INFO|Setting lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 down in Southbound
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.219 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de25a77c-b919-41f6-9a1b-fd3e354e84bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=3016fb40-93ab-4df3-956c-74e722dc2fa2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.221 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 3016fb40-93ab-4df3-956c-74e722dc2fa2 in datapath d0e9752d-2178-4eb0-b091-dd4d434021e5 unbound from our chassis
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.224 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d0e9752d-2178-4eb0-b091-dd4d434021e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.225 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c27c55d7-e81c-4d35-9805-596848d4b74c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.230 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:19 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:19.269 2 INFO neutron.agent.securitygroups_rpc [None req-7242a629-d88b-4313-9e67-39aa189122ef fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:19 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:19.298 263258 INFO neutron.agent.linux.ip_lib [None req-96ca0c12-1709-4815-824e-eabac43a2935 - - - - - -] Device tap9a92ea95-74 cannot be used as it has no MAC address
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.321 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:19 np0005532585.localdomain sshd[319816]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:01:19 np0005532585.localdomain kernel: device tap9a92ea95-74 entered promiscuous mode
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:19 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892079.3284] manager: (tap9a92ea95-74): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Nov 23 10:01:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:19Z|00258|binding|INFO|Claiming lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 for this chassis.
Nov 23 10:01:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:19Z|00259|binding|INFO|9a92ea95-742f-47d6-b9a5-b24454278ac2: Claiming unknown
Nov 23 10:01:19 np0005532585.localdomain systemd-udevd[319819]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.346 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a180dbb035ce42ac9ec3178829ba27ed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fbea3ea-671c-4b0c-8df6-a11c66c76ac2, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=9a92ea95-742f-47d6-b9a5-b24454278ac2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.347 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9a92ea95-742f-47d6-b9a5-b24454278ac2 in datapath a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee bound to our chassis
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.349 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:01:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:19.349 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4372becd-27b0-4c99-8fd5-fe6dd4aaf531]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.356 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:19Z|00260|binding|INFO|Setting lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 ovn-installed in OVS
Nov 23 10:01:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:19Z|00261|binding|INFO|Setting lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 up in Southbound
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9a92ea95-74: No such device
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.441 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:19 np0005532585.localdomain systemd[1]: tmp-crun.KrFQML.mount: Deactivated successfully.
Nov 23 10:01:19 np0005532585.localdomain podman[319827]: 2025-11-23 10:01:19.460959269 +0000 UTC m=+0.097838095 container kill 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:19 np0005532585.localdomain dnsmasq[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:01:19 np0005532585.localdomain dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:19 np0005532585.localdomain dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:01:19 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/437465050' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.513 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.278s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.584 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.585 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:01:19 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:19.728 263258 INFO neutron.agent.dhcp.agent [None req-484ed451-1cab-473f-801c-5e2b0b9ad6c2 - - - - - -] DHCP configuration for ports {'dbfc41c4-f19b-4c9e-a49b-4b0a906e0b61'} is completed
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.797 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.799 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11199MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.800 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.800 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.894 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.895 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.895 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:01:19 np0005532585.localdomain sshd[319816]: Received disconnect from 107.172.15.139 port 47386:11: Bye Bye [preauth]
Nov 23 10:01:19 np0005532585.localdomain sshd[319816]: Disconnected from authenticating user root 107.172.15.139 port 47386 [preauth]
Nov 23 10:01:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:19.937 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:01:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1791639122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:20 np0005532585.localdomain ceph-mon[300199]: pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 307 B/s wr, 16 op/s
Nov 23 10:01:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/437465050' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:20 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:20.239 2 INFO neutron.agent.securitygroups_rpc [None req-55e03d23-65b9-4d8a-853a-a5da3e2be7a3 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:20 np0005532585.localdomain podman[319940]: 
Nov 23 10:01:20 np0005532585.localdomain podman[319940]: 2025-11-23 10:01:20.254767193 +0000 UTC m=+0.082527824 container create 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:01:20 np0005532585.localdomain systemd[1]: Started libpod-conmon-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04.scope.
Nov 23 10:01:20 np0005532585.localdomain podman[319940]: 2025-11-23 10:01:20.210882351 +0000 UTC m=+0.038643022 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:20 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:20 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82de1d62aafaefd90894da078f8a1a469742bc8b5c3eb4a9608ec437320d7a2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:20 np0005532585.localdomain podman[319940]: 2025-11-23 10:01:20.33192916 +0000 UTC m=+0.159689801 container init 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 10:01:20 np0005532585.localdomain podman[319940]: 2025-11-23 10:01:20.352099511 +0000 UTC m=+0.179860142 container start 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319974]: started, version 2.85 cachesize 150
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319974]: DNS service limited to local subnets
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319974]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319974]: warning: no upstream servers configured
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319974]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 0 addresses
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:20 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:01:20 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3857005358' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:20.420 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.421 263258 INFO neutron.agent.dhcp.agent [None req-96ca0c12-1709-4815-824e-eabac43a2935 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:18Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c7bb80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c7b430>], id=dcca1c20-9ecb-4374-ad1b-f605ac775db9, ip_allocation=immediate, mac_address=fa:16:3e:fc:db:47, name=tempest-AllowedAddressPairIpV6TestJSON-769159927, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1528, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:18Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee
Nov 23 10:01:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:20.434 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:01:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:20.439 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:01:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:20.456 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:01:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:20.458 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:01:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:20.458 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:01:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.473 263258 INFO neutron.agent.dhcp.agent [None req-2377f124-9d87-41f7-9796-6138ac709a7b - - - - - -] DHCP configuration for ports {'c2a69d57-5768-4e92-a190-af21210eb643'} is completed
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319108]: exiting on receipt of SIGTERM
Nov 23 10:01:20 np0005532585.localdomain podman[319992]: 2025-11-23 10:01:20.477970908 +0000 UTC m=+0.037611279 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:20 np0005532585.localdomain systemd[1]: libpod-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186.scope: Deactivated successfully.
Nov 23 10:01:20 np0005532585.localdomain podman[320020]: 2025-11-23 10:01:20.549337037 +0000 UTC m=+0.055390237 container died d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:01:20 np0005532585.localdomain podman[320020]: 2025-11-23 10:01:20.580150987 +0000 UTC m=+0.086204157 container cleanup d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:01:20 np0005532585.localdomain systemd[1]: libpod-conmon-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186.scope: Deactivated successfully.
Nov 23 10:01:20 np0005532585.localdomain podman[320021]: 2025-11-23 10:01:20.621708717 +0000 UTC m=+0.126407055 container remove d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:20 np0005532585.localdomain podman[320054]: 2025-11-23 10:01:20.647749219 +0000 UTC m=+0.099152485 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:20 np0005532585.localdomain dnsmasq[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:20 np0005532585.localdomain dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:20 np0005532585.localdomain podman[320015]: 2025-11-23 10:01:20.661619876 +0000 UTC m=+0.172254598 container kill 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:01:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.849 263258 INFO neutron.agent.dhcp.agent [None req-4770a38f-60f3-41a2-9dff-e09784566601 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.915 263258 INFO neutron.agent.dhcp.agent [None req-ac4d95ed-da65-419c-993b-bab8231fff13 - - - - - -] DHCP configuration for ports {'dcca1c20-9ecb-4374-ad1b-f605ac775db9'} is completed
Nov 23 10:01:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.961 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:21 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3857005358' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:01:21 np0005532585.localdomain systemd[1]: tmp-crun.yICxm0.mount: Deactivated successfully.
Nov 23 10:01:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b893594a3b7f21ead5de7308fbf557090d81368903c395e5edbb120a0248d337-merged.mount: Deactivated successfully.
Nov 23 10:01:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:21 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dd0e9752d\x2d2178\x2d4eb0\x2db091\x2ddd4d434021e5.mount: Deactivated successfully.
Nov 23 10:01:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:21.458 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:21.461 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:21.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:21.462 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:21.886 2 INFO neutron.agent.securitygroups_rpc [None req-da0cfd7f-9c49-4dab-bd78-effdeef69255 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:21.993 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac55e0>], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:20Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac55b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ac5430>], id=be42f629-f772-4cb8-bcb6-19f1e43b36b3, ip_allocation=immediate, mac_address=fa:16:3e:7c:74:ec, name=tempest-AllowedAddressPairIpV6TestJSON-1721931869, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1536, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:20Z on network 
a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee
Nov 23 10:01:22 np0005532585.localdomain dnsmasq[319773]: exiting on receipt of SIGTERM
Nov 23 10:01:22 np0005532585.localdomain podman[320111]: 2025-11-23 10:01:22.055821935 +0000 UTC m=+0.064164878 container kill 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:01:22 np0005532585.localdomain systemd[1]: libpod-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18.scope: Deactivated successfully.
Nov 23 10:01:22 np0005532585.localdomain podman[320135]: 2025-11-23 10:01:22.131981011 +0000 UTC m=+0.064919671 container died 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:22 np0005532585.localdomain podman[320135]: 2025-11-23 10:01:22.167093672 +0000 UTC m=+0.100032312 container cleanup 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:01:22 np0005532585.localdomain systemd[1]: libpod-conmon-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18.scope: Deactivated successfully.
Nov 23 10:01:22 np0005532585.localdomain ceph-mon[300199]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 307 B/s wr, 16 op/s
Nov 23 10:01:22 np0005532585.localdomain podman[320142]: 2025-11-23 10:01:22.226188343 +0000 UTC m=+0.148457275 container remove 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:22 np0005532585.localdomain systemd[1]: tmp-crun.jqERwL.mount: Deactivated successfully.
Nov 23 10:01:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b617da3d100e467919636d843ab26dcfbf14a001e0fc52c63ea40186eb9f6dfb-merged.mount: Deactivated successfully.
Nov 23 10:01:22 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:22.268 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:22 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 10:01:22 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:22 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:22 np0005532585.localdomain podman[320171]: 2025-11-23 10:01:22.317832546 +0000 UTC m=+0.061916048 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:22.589 263258 INFO neutron.agent.dhcp.agent [None req-28a6886c-072c-4559-95cd-95b2843586c3 - - - - - -] DHCP configuration for ports {'be42f629-f772-4cb8-bcb6-19f1e43b36b3'} is completed
Nov 23 10:01:23 np0005532585.localdomain podman[320238]: 
Nov 23 10:01:23 np0005532585.localdomain podman[320238]: 2025-11-23 10:01:23.112209248 +0000 UTC m=+0.108938707 container create 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:23 np0005532585.localdomain systemd[1]: Started libpod-conmon-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d.scope.
Nov 23 10:01:23 np0005532585.localdomain podman[320238]: 2025-11-23 10:01:23.057092279 +0000 UTC m=+0.053821818 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:23 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:23 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f0a8345e1c8f1f0588811cc6ec76af6b50a76c7923797df817a5dc7f62282f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:23 np0005532585.localdomain podman[320238]: 2025-11-23 10:01:23.197155634 +0000 UTC m=+0.193885103 container init 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:01:23 np0005532585.localdomain podman[320238]: 2025-11-23 10:01:23.20613823 +0000 UTC m=+0.202867689 container start 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:01:23 np0005532585.localdomain dnsmasq[320256]: started, version 2.85 cachesize 150
Nov 23 10:01:23 np0005532585.localdomain dnsmasq[320256]: DNS service limited to local subnets
Nov 23 10:01:23 np0005532585.localdomain dnsmasq[320256]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:23 np0005532585.localdomain dnsmasq[320256]: warning: no upstream servers configured
Nov 23 10:01:23 np0005532585.localdomain dnsmasq[320256]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:23 np0005532585.localdomain sshd[320257]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:23.462 263258 INFO neutron.agent.dhcp.agent [None req-bf3bb32d-907a-4bf1-b4ff-6b4ebb008875 - - - - - -] DHCP configuration for ports {'67c1b2ca-3e93-4592-92bd-bb626f12e09a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:23Z|00262|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:23.561 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 283 B/s wr, 14 op/s
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.626002) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083626032, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2550, "num_deletes": 259, "total_data_size": 3401241, "memory_usage": 3459920, "flush_reason": "Manual Compaction"}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083637805, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2197040, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21220, "largest_seqno": 23765, "table_properties": {"data_size": 2187848, "index_size": 5697, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20399, "raw_average_key_size": 21, "raw_value_size": 2168831, "raw_average_value_size": 2254, "num_data_blocks": 249, "num_entries": 962, "num_filter_entries": 962, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891914, "oldest_key_time": 1763891914, "file_creation_time": 1763892083, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 11852 microseconds, and 5590 cpu microseconds.
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.637850) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2197040 bytes OK
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.637873) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.640426) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.640448) EVENT_LOG_v1 {"time_micros": 1763892083640442, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.640468) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3389840, prev total WAL file size 3389840, number of live WAL files 2.
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.641249) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2145KB)], [36(16MB)]
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083641285, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19079701, "oldest_snapshot_seqno": -1}
Nov 23 10:01:23 np0005532585.localdomain systemd[1]: tmp-crun.QIEk7H.mount: Deactivated successfully.
Nov 23 10:01:23 np0005532585.localdomain podman[320275]: 2025-11-23 10:01:23.658864228 +0000 UTC m=+0.059280307 container kill 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:23 np0005532585.localdomain dnsmasq[320256]: exiting on receipt of SIGTERM
Nov 23 10:01:23 np0005532585.localdomain systemd[1]: libpod-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d.scope: Deactivated successfully.
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12593 keys, 16078574 bytes, temperature: kUnknown
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083714733, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 16078574, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16006852, "index_size": 39173, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 338609, "raw_average_key_size": 26, "raw_value_size": 15792162, "raw_average_value_size": 1254, "num_data_blocks": 1477, "num_entries": 12593, "num_filter_entries": 12593, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892083, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.715250) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 16078574 bytes
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.717151) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 258.7 rd, 218.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 16.1 +0.0 blob) out(15.3 +0.0 blob), read-write-amplify(16.0) write-amplify(7.3) OK, records in: 13126, records dropped: 533 output_compression: NoCompression
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.717180) EVENT_LOG_v1 {"time_micros": 1763892083717168, "job": 20, "event": "compaction_finished", "compaction_time_micros": 73762, "compaction_time_cpu_micros": 43622, "output_level": 6, "num_output_files": 1, "total_output_size": 16078574, "num_input_records": 13126, "num_output_records": 12593, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083718094, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083720584, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.641189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:23 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:23 np0005532585.localdomain podman[320289]: 2025-11-23 10:01:23.728072769 +0000 UTC m=+0.053447317 container died 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:01:23 np0005532585.localdomain podman[320289]: 2025-11-23 10:01:23.759619332 +0000 UTC m=+0.084993840 container cleanup 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:01:23 np0005532585.localdomain systemd[1]: libpod-conmon-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d.scope: Deactivated successfully.
Nov 23 10:01:23 np0005532585.localdomain podman[320291]: 2025-11-23 10:01:23.794239378 +0000 UTC m=+0.115174749 container remove 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:01:23 np0005532585.localdomain sshd[320257]: Invalid user postgres from 207.154.194.2 port 46774
Nov 23 10:01:23 np0005532585.localdomain sshd[320257]: Received disconnect from 207.154.194.2 port 46774:11: Bye Bye [preauth]
Nov 23 10:01:23 np0005532585.localdomain sshd[320257]: Disconnected from invalid user postgres 207.154.194.2 port 46774 [preauth]
Nov 23 10:01:24 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:24.130 2 INFO neutron.agent.securitygroups_rpc [None req-cef856d0-bc0f-421d-b6e3-a61c28d38f99 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5f0a8345e1c8f1f0588811cc6ec76af6b50a76c7923797df817a5dc7f62282f2-merged.mount: Deactivated successfully.
Nov 23 10:01:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:24.336 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:24 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses
Nov 23 10:01:24 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:24 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:24 np0005532585.localdomain podman[320334]: 2025-11-23 10:01:24.377012961 +0000 UTC m=+0.061127074 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 10:01:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:01:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:01:24 np0005532585.localdomain podman[320347]: 2025-11-23 10:01:24.497233584 +0000 UTC m=+0.092496580 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:01:24 np0005532585.localdomain podman[320347]: 2025-11-23 10:01:24.539351322 +0000 UTC m=+0.134614308 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:24 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:01:24 np0005532585.localdomain podman[320348]: 2025-11-23 10:01:24.560810323 +0000 UTC m=+0.153964684 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:01:24 np0005532585.localdomain podman[320348]: 2025-11-23 10:01:24.568629134 +0000 UTC m=+0.161783505 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:01:24 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:01:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:25.113 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:25.114 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:25.118 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:25.118 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:25.119 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[842da903-5e54-4a7a-9e4d-ca6032751f25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:25.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:01:25 np0005532585.localdomain ceph-mon[300199]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 341 B/s wr, 20 op/s
Nov 23 10:01:25 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:25.880 2 INFO neutron.agent.securitygroups_rpc [None req-2cdcfc76-bcd0-4cc8-b710-4dffd4e83ae1 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:26.002 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:24Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6f40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b33f70>], id=44238cca-3aef-4386-9b52-e2dbbf93973c, ip_allocation=immediate, mac_address=fa:16:3e:47:a8:3d, name=tempest-AllowedAddressPairIpV6TestJSON-851427365, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1551, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:25Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee
Nov 23 10:01:26 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 10:01:26 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:26 np0005532585.localdomain podman[320439]: 2025-11-23 10:01:26.198578485 +0000 UTC m=+0.055479369 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:26 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:26.427 263258 INFO neutron.agent.dhcp.agent [None req-18f24ecb-a96d-474b-b9ea-376300dd66ae - - - - - -] DHCP configuration for ports {'44238cca-3aef-4386-9b52-e2dbbf93973c'} is completed
Nov 23 10:01:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:26.490 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:26 np0005532585.localdomain podman[320485]: 
Nov 23 10:01:26 np0005532585.localdomain podman[320485]: 2025-11-23 10:01:26.680186251 +0000 UTC m=+0.077590460 container create cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:26 np0005532585.localdomain systemd[1]: Started libpod-conmon-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced.scope.
Nov 23 10:01:26 np0005532585.localdomain systemd[1]: tmp-crun.3xnVU3.mount: Deactivated successfully.
Nov 23 10:01:26 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:26 np0005532585.localdomain podman[320485]: 2025-11-23 10:01:26.635299948 +0000 UTC m=+0.032704207 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:26 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/501fc31257ee79b0f2a66e32bf5adfac954ccbcbdac3591e5a80dde2af71d298/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:26 np0005532585.localdomain podman[320485]: 2025-11-23 10:01:26.748060923 +0000 UTC m=+0.145465132 container init cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:26 np0005532585.localdomain podman[320485]: 2025-11-23 10:01:26.75706877 +0000 UTC m=+0.154472979 container start cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:01:26 np0005532585.localdomain dnsmasq[320503]: started, version 2.85 cachesize 150
Nov 23 10:01:26 np0005532585.localdomain dnsmasq[320503]: DNS service limited to local subnets
Nov 23 10:01:26 np0005532585.localdomain dnsmasq[320503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:26 np0005532585.localdomain dnsmasq[320503]: warning: no upstream servers configured
Nov 23 10:01:26 np0005532585.localdomain dnsmasq-dhcp[320503]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:26 np0005532585.localdomain dnsmasq[320503]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:26 np0005532585.localdomain dnsmasq-dhcp[320503]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:26 np0005532585.localdomain dnsmasq-dhcp[320503]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:26Z|00263|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:26.930 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:27.059 263258 INFO neutron.agent.dhcp.agent [None req-3f8f8920-baf4-42cc-ada8-a3603bb1b297 - - - - - -] DHCP configuration for ports {'67c1b2ca-3e93-4592-92bd-bb626f12e09a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:27 np0005532585.localdomain ceph-mon[300199]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 85 B/s wr, 6 op/s
Nov 23 10:01:28 np0005532585.localdomain podman[320520]: 2025-11-23 10:01:28.256010566 +0000 UTC m=+0.060011750 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:28 np0005532585.localdomain dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 0 addresses
Nov 23 10:01:28 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host
Nov 23 10:01:28 np0005532585.localdomain dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts
Nov 23 10:01:28 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:28.343 2 INFO neutron.agent.securitygroups_rpc [None req-ac3982b0-5538-471d-bc5b-e6d4442ddedd fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:28.482 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:28 np0005532585.localdomain kernel: device tapb74e35ad-94 left promiscuous mode
Nov 23 10:01:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:28Z|00264|binding|INFO|Releasing lport b74e35ad-94a2-4d4d-af80-3b2024099e6d from this chassis (sb_readonly=0)
Nov 23 10:01:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:28Z|00265|binding|INFO|Setting lport b74e35ad-94a2-4d4d-af80-3b2024099e6d down in Southbound
Nov 23 10:01:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:28.498 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:28.501 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a45cfb38-a270-4929-a93b-8d89273d60d1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b74e35ad-94a2-4d4d-af80-3b2024099e6d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:28.502 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b74e35ad-94a2-4d4d-af80-3b2024099e6d in datapath bcc66174-371f-4faf-83f1-5de56d4886ad unbound from our chassis
Nov 23 10:01:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:28.504 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcc66174-371f-4faf-83f1-5de56d4886ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:28.504 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[de1459e9-a989-477d-bcc2-704bacad5eb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:28 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses
Nov 23 10:01:28 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:28 np0005532585.localdomain podman[320558]: 2025-11-23 10:01:28.559536506 +0000 UTC m=+0.038474337 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:01:28 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:29 np0005532585.localdomain ceph-mon[300199]: pgmap v214: 177 pgs: 177 active+clean; 192 MiB data, 798 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 37 op/s
Nov 23 10:01:29 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:29.655 2 INFO neutron.agent.securitygroups_rpc [None req-efb808b5-6542-451b-803e-d2906345bde2 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:29.691 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:28Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa7040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa7f70>], id=1715b7fd-d248-410f-8cd9-5fcf625fd736, ip_allocation=immediate, mac_address=fa:16:3e:f8:04:63, name=tempest-AllowedAddressPairIpV6TestJSON-790328342, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1554, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:29Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee
Nov 23 10:01:29 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 10:01:29 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:29 np0005532585.localdomain podman[320610]: 2025-11-23 10:01:29.890359992 +0000 UTC m=+0.060041480 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:01:29 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:29 np0005532585.localdomain podman[320623]: 2025-11-23 10:01:29.948414571 +0000 UTC m=+0.065855170 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:01:29 np0005532585.localdomain dnsmasq[317960]: exiting on receipt of SIGTERM
Nov 23 10:01:29 np0005532585.localdomain systemd[1]: libpod-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736.scope: Deactivated successfully.
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:01:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:01:30 np0005532585.localdomain podman[320643]: 2025-11-23 10:01:30.036016039 +0000 UTC m=+0.071029828 container died c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:01:30 np0005532585.localdomain podman[320643]: 2025-11-23 10:01:30.066435087 +0000 UTC m=+0.101448786 container cleanup c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:01:30 np0005532585.localdomain systemd[1]: libpod-conmon-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736.scope: Deactivated successfully.
Nov 23 10:01:30 np0005532585.localdomain podman[320645]: 2025-11-23 10:01:30.113204358 +0000 UTC m=+0.139539720 container remove c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:30.179 263258 INFO neutron.agent.dhcp.agent [None req-2fbb10bd-f0f6-4cc9-8bea-d1d6b8f50c07 - - - - - -] DHCP configuration for ports {'1715b7fd-d248-410f-8cd9-5fcf625fd736'} is completed
Nov 23 10:01:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:30.623 263258 INFO neutron.agent.dhcp.agent [None req-605c187b-5ced-4280-a2c8-755324b78819 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c21502b2347b802ca27076444fd406a39385c92b799f42a0977ebddef8e75100-merged.mount: Deactivated successfully.
Nov 23 10:01:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:30 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dbcc66174\x2d371f\x2d4faf\x2d83f1\x2d5de56d4886ad.mount: Deactivated successfully.
Nov 23 10:01:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:30.971 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1011787999' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1011787999' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:31.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:31.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:32 np0005532585.localdomain ceph-mon[300199]: pgmap v215: 177 pgs: 177 active+clean; 192 MiB data, 798 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 37 op/s
Nov 23 10:01:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:32.908 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:33 np0005532585.localdomain ceph-mon[300199]: pgmap v216: 177 pgs: 177 active+clean; 192 MiB data, 798 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 37 op/s
Nov 23 10:01:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:33.734 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:33.736 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:33.739 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:33.739 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:33.740 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[89a9b992-5bc5-41f7-b90e-56d291534c60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:34Z|00266|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:34.278 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:34 np0005532585.localdomain podman[320690]: 2025-11-23 10:01:34.782033814 +0000 UTC m=+0.043110439 container kill cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:01:34 np0005532585.localdomain dnsmasq[320503]: exiting on receipt of SIGTERM
Nov 23 10:01:34 np0005532585.localdomain systemd[1]: tmp-crun.qXnmFL.mount: Deactivated successfully.
Nov 23 10:01:34 np0005532585.localdomain systemd[1]: libpod-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced.scope: Deactivated successfully.
Nov 23 10:01:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:01:34 np0005532585.localdomain podman[320703]: 2025-11-23 10:01:34.827289058 +0000 UTC m=+0.035966699 container died cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:01:34 np0005532585.localdomain systemd[1]: tmp-crun.PVJ37A.mount: Deactivated successfully.
Nov 23 10:01:34 np0005532585.localdomain podman[320703]: 2025-11-23 10:01:34.8624354 +0000 UTC m=+0.071113021 container cleanup cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:01:34 np0005532585.localdomain systemd[1]: libpod-conmon-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced.scope: Deactivated successfully.
Nov 23 10:01:34 np0005532585.localdomain podman[320707]: 2025-11-23 10:01:34.930021722 +0000 UTC m=+0.127275641 container remove cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:34 np0005532585.localdomain podman[320711]: 2025-11-23 10:01:34.905174907 +0000 UTC m=+0.093397128 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 10:01:34 np0005532585.localdomain podman[320711]: 2025-11-23 10:01:34.986470551 +0000 UTC m=+0.174692702 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 0 addresses
Nov 23 10:01:35 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host
Nov 23 10:01:35 np0005532585.localdomain podman[320774]: 2025-11-23 10:01:35.07698113 +0000 UTC m=+0.043996417 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:01:35 np0005532585.localdomain dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts
Nov 23 10:01:35 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:35.211 2 INFO neutron.agent.securitygroups_rpc [None req-9e0f6898-e450-4b10-a9b6-2526799f670d fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:35Z|00267|binding|INFO|Releasing lport 10e1f965-5681-42dc-916c-83e697ea474c from this chassis (sb_readonly=0)
Nov 23 10:01:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:35Z|00268|binding|INFO|Setting lport 10e1f965-5681-42dc-916c-83e697ea474c down in Southbound
Nov 23 10:01:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:35.283 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:35 np0005532585.localdomain kernel: device tap10e1f965-56 left promiscuous mode
Nov 23 10:01:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:35.293 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10754024-8e92-4669-8b3d-be0210470d0a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=10e1f965-5681-42dc-916c-83e697ea474c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:35.296 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 10e1f965-5681-42dc-916c-83e697ea474c in datapath 30192eb7-6210-4b4d-956f-dbc64d7c0b7c unbound from our chassis
Nov 23 10:01:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:35.299 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:35.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:35.300 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3abb7f62-d340-40f7-a26f-46b621d0b5a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:35 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses
Nov 23 10:01:35 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:35 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:35 np0005532585.localdomain podman[320835]: 2025-11-23 10:01:35.493421088 +0000 UTC m=+0.054091997 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:01:35 np0005532585.localdomain ceph-mon[300199]: pgmap v217: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 57 op/s
Nov 23 10:01:35 np0005532585.localdomain dnsmasq[317068]: exiting on receipt of SIGTERM
Nov 23 10:01:35 np0005532585.localdomain podman[320873]: 2025-11-23 10:01:35.668793771 +0000 UTC m=+0.076254400 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: libpod-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f.scope: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:01:35 np0005532585.localdomain podman[320891]: 2025-11-23 10:01:35.714535789 +0000 UTC m=+0.038467965 container died b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-501fc31257ee79b0f2a66e32bf5adfac954ccbcbdac3591e5a80dde2af71d298-merged.mount: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0701b612469235a6b7433f4e52ad0b8dcaf36306964dddada708c650d1295ed6-merged.mount: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain podman[320899]: 2025-11-23 10:01:35.78402071 +0000 UTC m=+0.090939602 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:01:35 np0005532585.localdomain podman[320891]: 2025-11-23 10:01:35.798169026 +0000 UTC m=+0.122101172 container cleanup b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: libpod-conmon-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f.scope: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain podman[320898]: 2025-11-23 10:01:35.82849339 +0000 UTC m=+0.132941457 container remove b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:01:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:35.855 263258 INFO neutron.agent.dhcp.agent [None req-348dfbab-34a2-4d1b-bc09-7758c516221a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:35.855 263258 INFO neutron.agent.dhcp.agent [None req-348dfbab-34a2-4d1b-bc09-7758c516221a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d30192eb7\x2d6210\x2d4b4d\x2d956f\x2ddbc64d7c0b7c.mount: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain podman[320899]: 2025-11-23 10:01:35.913755796 +0000 UTC m=+0.220674668 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:01:35 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:01:35 np0005532585.localdomain podman[320968]: 
Nov 23 10:01:35 np0005532585.localdomain podman[320968]: 2025-11-23 10:01:35.974551169 +0000 UTC m=+0.079536290 container create dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29.scope.
Nov 23 10:01:36 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:36Z|00269|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ee8801ca6f5ed344b19c84012e21d39d0de6ed956074fb53f69f665de5d851f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:36 np0005532585.localdomain podman[320968]: 2025-11-23 10:01:35.930995348 +0000 UTC m=+0.035980549 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:36 np0005532585.localdomain podman[320968]: 2025-11-23 10:01:36.041055638 +0000 UTC m=+0.146040799 container init dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:36.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:36 np0005532585.localdomain podman[320968]: 2025-11-23 10:01:36.052046947 +0000 UTC m=+0.157032108 container start dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:01:36 np0005532585.localdomain dnsmasq[320986]: started, version 2.85 cachesize 150
Nov 23 10:01:36 np0005532585.localdomain dnsmasq[320986]: DNS service limited to local subnets
Nov 23 10:01:36 np0005532585.localdomain dnsmasq[320986]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:36 np0005532585.localdomain dnsmasq[320986]: warning: no upstream servers configured
Nov 23 10:01:36 np0005532585.localdomain dnsmasq-dhcp[320986]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:36 np0005532585.localdomain dnsmasq-dhcp[320986]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:01:36 np0005532585.localdomain dnsmasq[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:36 np0005532585.localdomain dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:36 np0005532585.localdomain dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:36.225 263258 INFO neutron.agent.dhcp.agent [None req-f5f3ae7d-34d9-45de-a6b8-c1d05bd5e6ed - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', '67c1b2ca-3e93-4592-92bd-bb626f12e09a'} is completed
Nov 23 10:01:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:36.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:37 np0005532585.localdomain sudo[320987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:01:37 np0005532585.localdomain sudo[320987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:01:37 np0005532585.localdomain sudo[320987]: pam_unix(sudo:session): session closed for user root
Nov 23 10:01:37 np0005532585.localdomain sudo[321005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:01:37 np0005532585.localdomain sudo[321005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:01:37 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:37.252 2 INFO neutron.agent.securitygroups_rpc [None req-0d1bae38-c3b0-4ed4-ba2a-9b0a25fae4ab 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:37.379 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:35Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7ff10>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7f970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b5e5e0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b5e250>], id=3ba41a4f-9ca4-4e43-9003-2afe15de07b2, ip_allocation=immediate, mac_address=fa:16:3e:24:37:ae, name=tempest-NetworksTestDHCPv6-1057999249, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['029dda56-7cc4-415f-aff5-e1cbc56c7934', 'c6230737-2289-4ccb-b4a3-ea3c53474e91'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:26Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1567, status=DOWN, tags=[], 
tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:36Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:01:37 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:37.500 2 INFO neutron.agent.securitygroups_rpc [None req-b60c8ac0-a755-4981-a84e-d95a726ca718 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:37 np0005532585.localdomain ceph-mon[300199]: pgmap v218: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.8 MiB/s wr, 50 op/s
Nov 23 10:01:37 np0005532585.localdomain systemd[1]: tmp-crun.4CSBfO.mount: Deactivated successfully.
Nov 23 10:01:37 np0005532585.localdomain dnsmasq[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:01:37 np0005532585.localdomain dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:37 np0005532585.localdomain dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:37 np0005532585.localdomain podman[321058]: 2025-11-23 10:01:37.63565101 +0000 UTC m=+0.048468654 container kill dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:37 np0005532585.localdomain sudo[321005]: pam_unix(sudo:session): session closed for user root
Nov 23 10:01:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:37.899 263258 INFO neutron.agent.dhcp.agent [None req-d63b3733-4f6c-4711-a9e5-a7ae0e7a3cf0 - - - - - -] DHCP configuration for ports {'3ba41a4f-9ca4-4e43-9003-2afe15de07b2'} is completed
Nov 23 10:01:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:38.016 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:36Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cc3f40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c04cd0>], id=27d54986-817e-4c51-af4c-54db06d3c104, ip_allocation=immediate, mac_address=fa:16:3e:bf:82:90, name=tempest-AllowedAddressPairIpV6TestJSON-526636575, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1569, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:36Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee
Nov 23 10:01:38 np0005532585.localdomain sudo[321112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:01:38 np0005532585.localdomain sudo[321112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:01:38 np0005532585.localdomain sudo[321112]: pam_unix(sudo:session): session closed for user root
Nov 23 10:01:38 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 10:01:38 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:38 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:38 np0005532585.localdomain podman[321131]: 2025-11-23 10:01:38.226856652 +0000 UTC m=+0.074602370 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:01:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:38.471 263258 INFO neutron.agent.dhcp.agent [None req-fd797fa5-df38-4f00-b386-6dbfc563214b - - - - - -] DHCP configuration for ports {'27d54986-817e-4c51-af4c-54db06d3c104'} is completed
Nov 23 10:01:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:01:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:01:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:01:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:01:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:01:38 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:38.895 2 INFO neutron.agent.securitygroups_rpc [None req-e4114a4e-94f5-47df-bc1b-6dc35d238db6 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:39 np0005532585.localdomain systemd[1]: tmp-crun.myRyxo.mount: Deactivated successfully.
Nov 23 10:01:39 np0005532585.localdomain dnsmasq[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:39 np0005532585.localdomain dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:39 np0005532585.localdomain dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:39 np0005532585.localdomain podman[321169]: 2025-11-23 10:01:39.168553001 +0000 UTC m=+0.069864413 container kill dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:01:39 np0005532585.localdomain ceph-mon[300199]: pgmap v219: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.8 MiB/s wr, 50 op/s
Nov 23 10:01:39 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:39.785 2 INFO neutron.agent.securitygroups_rpc [None req-3c7abe8e-704d-455a-8480-13cb405babe1 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']
Nov 23 10:01:39 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:39.828 2 INFO neutron.agent.securitygroups_rpc [None req-3839d769-9cff-43fb-b0be-05026e050e30 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:39.848 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:39 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:39.887 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:38Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa7040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa71f0>], id=956e2e20-c773-4868-a462-57a23da288a3, ip_allocation=immediate, mac_address=fa:16:3e:b0:67:39, name=tempest-AllowedAddressPairIpV6TestJSON-651501806, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1573, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:39Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee
Nov 23 10:01:40 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:40.002 2 INFO neutron.agent.securitygroups_rpc [None req-3c7abe8e-704d-455a-8480-13cb405babe1 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']
Nov 23 10:01:40 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 3 addresses
Nov 23 10:01:40 np0005532585.localdomain podman[321210]: 2025-11-23 10:01:40.077238594 +0000 UTC m=+0.058831833 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:40 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:40 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:40 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:40.500 263258 INFO neutron.agent.dhcp.agent [None req-a1130b39-ef34-4073-b9dc-16486cf26a18 - - - - - -] DHCP configuration for ports {'956e2e20-c773-4868-a462-57a23da288a3'} is completed
Nov 23 10:01:40 np0005532585.localdomain systemd[1]: tmp-crun.Gp5u93.mount: Deactivated successfully.
Nov 23 10:01:40 np0005532585.localdomain dnsmasq[320986]: exiting on receipt of SIGTERM
Nov 23 10:01:40 np0005532585.localdomain podman[321249]: 2025-11-23 10:01:40.686726109 +0000 UTC m=+0.060196285 container kill dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:01:40 np0005532585.localdomain systemd[1]: libpod-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29.scope: Deactivated successfully.
Nov 23 10:01:40 np0005532585.localdomain podman[321265]: 2025-11-23 10:01:40.741515117 +0000 UTC m=+0.039169227 container died dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:01:40 np0005532585.localdomain systemd[1]: tmp-crun.dykTOO.mount: Deactivated successfully.
Nov 23 10:01:40 np0005532585.localdomain podman[321265]: 2025-11-23 10:01:40.78740163 +0000 UTC m=+0.085055700 container remove dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:01:40 np0005532585.localdomain systemd[1]: libpod-conmon-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29.scope: Deactivated successfully.
Nov 23 10:01:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:41.016 2 INFO neutron.agent.securitygroups_rpc [None req-7b7ae10d-5331-41d7-96b9-7aafc7180edd 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']
Nov 23 10:01:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7ee8801ca6f5ed344b19c84012e21d39d0de6ed956074fb53f69f665de5d851f-merged.mount: Deactivated successfully.
Nov 23 10:01:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:41.358 2 INFO neutron.agent.securitygroups_rpc [None req-9181f901-83e9-4198-8a73-d33dcd9ef0fc 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']
Nov 23 10:01:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:41.402 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:41 np0005532585.localdomain ceph-mon[300199]: pgmap v220: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 767 B/s wr, 19 op/s
Nov 23 10:01:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:41.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:41 np0005532585.localdomain podman[321340]: 
Nov 23 10:01:41 np0005532585.localdomain podman[321340]: 2025-11-23 10:01:41.595463324 +0000 UTC m=+0.077072366 container create 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:41 np0005532585.localdomain systemd[1]: Started libpod-conmon-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a.scope.
Nov 23 10:01:41 np0005532585.localdomain systemd[1]: tmp-crun.ouU3Gc.mount: Deactivated successfully.
Nov 23 10:01:41 np0005532585.localdomain podman[321340]: 2025-11-23 10:01:41.555713058 +0000 UTC m=+0.037322170 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:41 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:41 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4917278dbbdfc63e7fdece56c2d38985a74ee1b2db2810c2d7a7b81ee259f1d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:41 np0005532585.localdomain podman[321340]: 2025-11-23 10:01:41.677092798 +0000 UTC m=+0.158701850 container init 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:01:41 np0005532585.localdomain podman[321340]: 2025-11-23 10:01:41.685485027 +0000 UTC m=+0.167094069 container start 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:41 np0005532585.localdomain dnsmasq[321359]: started, version 2.85 cachesize 150
Nov 23 10:01:41 np0005532585.localdomain dnsmasq[321359]: DNS service limited to local subnets
Nov 23 10:01:41 np0005532585.localdomain dnsmasq[321359]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:41 np0005532585.localdomain dnsmasq[321359]: warning: no upstream servers configured
Nov 23 10:01:41 np0005532585.localdomain dnsmasq-dhcp[321359]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:01:41 np0005532585.localdomain dnsmasq[321359]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:41 np0005532585.localdomain dnsmasq-dhcp[321359]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:41 np0005532585.localdomain dnsmasq-dhcp[321359]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:01:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:01:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:41.909 2 INFO neutron.agent.securitygroups_rpc [None req-7527e00c-fa2d-4f7e-82d1-1aa51478130d fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:01:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157496 "" "Go-http-client/1.1"
Nov 23 10:01:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:41.927 263258 INFO neutron.agent.dhcp.agent [None req-e173d92d-386d-40cf-8b2f-abcc2575f9b0 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', '67c1b2ca-3e93-4592-92bd-bb626f12e09a'} is completed
Nov 23 10:01:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:01:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19712 "" "Go-http-client/1.1"
Nov 23 10:01:42 np0005532585.localdomain dnsmasq[321359]: exiting on receipt of SIGTERM
Nov 23 10:01:42 np0005532585.localdomain systemd[1]: libpod-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a.scope: Deactivated successfully.
Nov 23 10:01:42 np0005532585.localdomain podman[321377]: 2025-11-23 10:01:42.047489468 +0000 UTC m=+0.065380014 container kill 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:01:42 np0005532585.localdomain podman[321414]: 2025-11-23 10:01:42.126740099 +0000 UTC m=+0.051733564 container died 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:42 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 10:01:42 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:42 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:42 np0005532585.localdomain podman[321416]: 2025-11-23 10:01:42.147768578 +0000 UTC m=+0.057525264 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:42 np0005532585.localdomain podman[321414]: 2025-11-23 10:01:42.22574756 +0000 UTC m=+0.150740975 container remove 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:01:42 np0005532585.localdomain systemd[1]: libpod-conmon-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a.scope: Deactivated successfully.
Nov 23 10:01:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:42.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:42Z|00270|binding|INFO|Releasing lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a from this chassis (sb_readonly=0)
Nov 23 10:01:42 np0005532585.localdomain kernel: device tap67c1b2ca-3e left promiscuous mode
Nov 23 10:01:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:42Z|00271|binding|INFO|Setting lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a down in Southbound
Nov 23 10:01:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:42.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:42.269 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe16:28e8/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=67c1b2ca-3e93-4592-92bd-bb626f12e09a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:42.271 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 67c1b2ca-3e93-4592-92bd-bb626f12e09a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:01:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:42.271 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:42.274 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:42 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:42.275 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[acef254e-4dae-463e-b4f5-4b9f85295b32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:42.484 263258 INFO neutron.agent.dhcp.agent [None req-e403bbe2-eaca-4a73-b431-214dd897fd09 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:42 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:42.739 2 INFO neutron.agent.securitygroups_rpc [None req-02caec08-4fe5-4eae-8f59-0f41df22086a fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:42 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses
Nov 23 10:01:42 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:42 np0005532585.localdomain podman[321474]: 2025-11-23 10:01:42.939333371 +0000 UTC m=+0.061207566 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:42 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4917278dbbdfc63e7fdece56c2d38985a74ee1b2db2810c2d7a7b81ee259f1d1-merged.mount: Deactivated successfully.
Nov 23 10:01:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:43 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:01:43 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:43.308 2 INFO neutron.agent.securitygroups_rpc [None req-7ad0c56d-3950-498c-8f5a-35533112ee18 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:43 np0005532585.localdomain ceph-mon[300199]: pgmap v221: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 767 B/s wr, 19 op/s
Nov 23 10:01:43 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:43.662 2 INFO neutron.agent.securitygroups_rpc [None req-6a3f8f10-e09b-4d25-8141-9c205b1a054c fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']
Nov 23 10:01:43 np0005532585.localdomain podman[321512]: 2025-11-23 10:01:43.889091379 +0000 UTC m=+0.065644183 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:01:43 np0005532585.localdomain dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 0 addresses
Nov 23 10:01:43 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 10:01:43 np0005532585.localdomain dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.231 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.233 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.234 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.235 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[568dbc5f-5e02-4ed4-bd9a-5570fa630ec5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:44 np0005532585.localdomain dnsmasq[319974]: exiting on receipt of SIGTERM
Nov 23 10:01:44 np0005532585.localdomain podman[321552]: 2025-11-23 10:01:44.344081106 +0000 UTC m=+0.044941956 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:01:44 np0005532585.localdomain systemd[1]: libpod-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04.scope: Deactivated successfully.
Nov 23 10:01:44 np0005532585.localdomain podman[321567]: 2025-11-23 10:01:44.407636863 +0000 UTC m=+0.050370553 container died 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:44 np0005532585.localdomain podman[321567]: 2025-11-23 10:01:44.439450014 +0000 UTC m=+0.082183654 container cleanup 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:44 np0005532585.localdomain systemd[1]: libpod-conmon-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04.scope: Deactivated successfully.
Nov 23 10:01:44 np0005532585.localdomain podman[321568]: 2025-11-23 10:01:44.491798996 +0000 UTC m=+0.127480068 container remove 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:01:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:44Z|00272|binding|INFO|Releasing lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 from this chassis (sb_readonly=0)
Nov 23 10:01:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:44.503 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:44Z|00273|binding|INFO|Setting lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 down in Southbound
Nov 23 10:01:44 np0005532585.localdomain kernel: device tap9a92ea95-74 left promiscuous mode
Nov 23 10:01:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:44.520 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.523 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a180dbb035ce42ac9ec3178829ba27ed', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fbea3ea-671c-4b0c-8df6-a11c66c76ac2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=9a92ea95-742f-47d6-b9a5-b24454278ac2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.525 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9a92ea95-742f-47d6-b9a5-b24454278ac2 in datapath a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee unbound from our chassis
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.527 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:01:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:44.527 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2cbf5229-e1a9-43b4-beaf-a6fea3b16e4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:44.592 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2622836700' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2622836700' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.186 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.188 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.232 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:01:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:45.276 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:45 np0005532585.localdomain podman[321595]: 2025-11-23 10:01:45.318637228 +0000 UTC m=+0.073638660 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Nov 23 10:01:45 np0005532585.localdomain podman[321595]: 2025-11-23 10:01:45.327370327 +0000 UTC m=+0.082371819 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Nov 23 10:01:45 np0005532585.localdomain podman[321596]: 2025-11-23 10:01:45.338223822 +0000 UTC m=+0.084907188 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:01:45 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:45.341 2 INFO neutron.agent.securitygroups_rpc [None req-13ffd066-3677-4907-b0b4-8f11a42c5f7c a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-82de1d62aafaefd90894da078f8a1a469742bc8b5c3eb4a9608ec437320d7a2a-merged.mount: Deactivated successfully.
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2da401e9c6\x2d07fc\x2d4d23\x2d8b8b\x2dff6246c4e8ee.mount: Deactivated successfully.
Nov 23 10:01:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:45.396 263258 INFO neutron.agent.linux.ip_lib [None req-fec28012-90eb-45f4-8f7f-93722b94c6dd - - - - - -] Device tapbea321e5-56 cannot be used as it has no MAC address
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: tmp-crun.WVgMAK.mount: Deactivated successfully.
Nov 23 10:01:45 np0005532585.localdomain podman[321594]: 2025-11-23 10:01:45.401102728 +0000 UTC m=+0.156039288 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain podman[321596]: 2025-11-23 10:01:45.423608891 +0000 UTC m=+0.170292267 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 23 10:01:45 np0005532585.localdomain kernel: device tapbea321e5-56 entered promiscuous mode
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892105.4271] manager: (tapbea321e5-56): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Nov 23 10:01:45 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:45Z|00274|binding|INFO|Claiming lport bea321e5-5628-434f-911b-989bbbc4badb for this chassis.
Nov 23 10:01:45 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:45Z|00275|binding|INFO|bea321e5-5628-434f-911b-989bbbc4badb: Claiming unknown
Nov 23 10:01:45 np0005532585.localdomain systemd-udevd[321661]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:01:45 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:45Z|00276|binding|INFO|Setting lport bea321e5-5628-434f-911b-989bbbc4badb ovn-installed in OVS
Nov 23 10:01:45 np0005532585.localdomain podman[321594]: 2025-11-23 10:01:45.438785019 +0000 UTC m=+0.193721589 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:01:45 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:45Z|00277|binding|INFO|Setting lport bea321e5-5628-434f-911b-989bbbc4badb up in Southbound
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.446 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe03:fd57/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=bea321e5-5628-434f-911b-989bbbc4badb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.452 160439 INFO neutron.agent.ovn.metadata.agent [-] Port bea321e5-5628-434f-911b-989bbbc4badb in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.457 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port b747372d-b0df-4741-9567-1269323d1ac6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.458 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:45 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:45.458 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8a1b7b-656b-45b5-8d2c-4443aa4b06b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapbea321e5-56: No such device
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:45.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:45 np0005532585.localdomain ceph-mon[300199]: pgmap v222: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 1.3 KiB/s wr, 34 op/s
Nov 23 10:01:45 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:45.921 2 INFO neutron.agent.securitygroups_rpc [None req-b9b41fb6-e2e7-4179-8f08-7b97a3dfa0c7 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:46.189 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:01:46 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:46Z|00278|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:46.264 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:46.294 2 INFO neutron.agent.securitygroups_rpc [None req-f7529e1b-3bf3-41ad-a49e-e39cf58ffefa ca36e3c530cd4996add76add048683eb 461e34582027490ebd34279a384a57b1 - - default default] Security group rule updated ['ce47e028-f950-480c-a113-98c15c008254']
Nov 23 10:01:46 np0005532585.localdomain podman[321733]: 
Nov 23 10:01:46 np0005532585.localdomain podman[321733]: 2025-11-23 10:01:46.329389054 +0000 UTC m=+0.082006487 container create 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:01:46 np0005532585.localdomain systemd[1]: Started libpod-conmon-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d.scope.
Nov 23 10:01:46 np0005532585.localdomain podman[321733]: 2025-11-23 10:01:46.292641222 +0000 UTC m=+0.045258625 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:46 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:46 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/169696af7c6c322034c0f6d82a43613a1e0ee2554520eba53c27cf09c69abb80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:46 np0005532585.localdomain podman[321733]: 2025-11-23 10:01:46.4094359 +0000 UTC m=+0.162053323 container init 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:01:46 np0005532585.localdomain podman[321733]: 2025-11-23 10:01:46.41886171 +0000 UTC m=+0.171479103 container start 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:01:46 np0005532585.localdomain dnsmasq[321750]: started, version 2.85 cachesize 150
Nov 23 10:01:46 np0005532585.localdomain dnsmasq[321750]: DNS service limited to local subnets
Nov 23 10:01:46 np0005532585.localdomain dnsmasq[321750]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:46 np0005532585.localdomain dnsmasq[321750]: warning: no upstream servers configured
Nov 23 10:01:46 np0005532585.localdomain dnsmasq[321750]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:46.470 263258 INFO neutron.agent.dhcp.agent [None req-fec28012-90eb-45f4-8f7f-93722b94c6dd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:44Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad3460>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad35e0>], id=ef912ada-86b5-4d28-bae3-e33192c6b57f, ip_allocation=immediate, mac_address=fa:16:3e:6e:a9:1d, name=tempest-NetworksTestDHCPv6-1699018063, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['6377934c-bad8-4c3b-bfeb-7ba8e9254091'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:43Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1592, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:45Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:01:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:46.541 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:46.633 263258 INFO neutron.agent.dhcp.agent [None req-ea2f1b92-8901-42f8-bc68-c42b0ef2f9e9 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:46 np0005532585.localdomain dnsmasq[321750]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:01:46 np0005532585.localdomain podman[321769]: 2025-11-23 10:01:46.657628887 +0000 UTC m=+0.068102780 container kill 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:46.994 263258 INFO neutron.agent.dhcp.agent [None req-d0563eee-ea94-474f-a1b4-e5316e44bb3d - - - - - -] DHCP configuration for ports {'ef912ada-86b5-4d28-bae3-e33192c6b57f'} is completed
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3232634724' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3232634724' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:01:47 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:47.256 2 INFO neutron.agent.securitygroups_rpc [None req-cfd83c6e-245c-4505-a8d2-3c8b7de44cbd 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:47 np0005532585.localdomain systemd[1]: tmp-crun.CIe3oh.mount: Deactivated successfully.
Nov 23 10:01:47 np0005532585.localdomain dnsmasq[321750]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:47 np0005532585.localdomain podman[321805]: 2025-11-23 10:01:47.485980534 +0000 UTC m=+0.061640290 container kill 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.611366) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107611403, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 538, "num_deletes": 257, "total_data_size": 416165, "memory_usage": 427544, "flush_reason": "Manual Compaction"}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107616550, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 271753, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23770, "largest_seqno": 24303, "table_properties": {"data_size": 269139, "index_size": 661, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6592, "raw_average_key_size": 18, "raw_value_size": 263608, "raw_average_value_size": 740, "num_data_blocks": 30, "num_entries": 356, "num_filter_entries": 356, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892085, "oldest_key_time": 1763892085, "file_creation_time": 1763892107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 5240 microseconds, and 2045 cpu microseconds.
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.616601) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 271753 bytes OK
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.616626) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.618879) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.618902) EVENT_LOG_v1 {"time_micros": 1763892107618895, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.618956) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 412982, prev total WAL file size 413306, number of live WAL files 2.
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.619693) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303234' seq:72057594037927935, type:22 .. '6C6F676D0034323737' seq:0, type:0; will stop at (end)
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(265KB)], [39(15MB)]
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107619747, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 16350327, "oldest_snapshot_seqno": -1}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12419 keys, 16030469 bytes, temperature: kUnknown
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107714272, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 16030469, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15959705, "index_size": 38599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31109, "raw_key_size": 335963, "raw_average_key_size": 27, "raw_value_size": 15747950, "raw_average_value_size": 1268, "num_data_blocks": 1448, "num_entries": 12419, "num_filter_entries": 12419, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.714613) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 16030469 bytes
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.716383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.8 rd, 169.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 15.3 +0.0 blob) out(15.3 +0.0 blob), read-write-amplify(119.2) write-amplify(59.0) OK, records in: 12949, records dropped: 530 output_compression: NoCompression
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.716411) EVENT_LOG_v1 {"time_micros": 1763892107716399, "job": 22, "event": "compaction_finished", "compaction_time_micros": 94631, "compaction_time_cpu_micros": 44019, "output_level": 6, "num_output_files": 1, "total_output_size": 16030469, "num_input_records": 12949, "num_output_records": 12419, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107716586, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107718967, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.619572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719087) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719090) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:01:48 np0005532585.localdomain ceph-mon[300199]: pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 23 10:01:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:48 np0005532585.localdomain podman[321843]: 2025-11-23 10:01:48.950283703 +0000 UTC m=+0.060916028 container kill 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:01:48 np0005532585.localdomain dnsmasq[321750]: exiting on receipt of SIGTERM
Nov 23 10:01:48 np0005532585.localdomain systemd[1]: libpod-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d.scope: Deactivated successfully.
Nov 23 10:01:49 np0005532585.localdomain podman[321857]: 2025-11-23 10:01:49.020269609 +0000 UTC m=+0.050687893 container died 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:01:49 np0005532585.localdomain podman[321857]: 2025-11-23 10:01:49.051601254 +0000 UTC m=+0.082019458 container cleanup 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:49 np0005532585.localdomain systemd[1]: libpod-conmon-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d.scope: Deactivated successfully.
Nov 23 10:01:49 np0005532585.localdomain podman[321858]: 2025-11-23 10:01:49.093400122 +0000 UTC m=+0.123203397 container remove 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:01:49 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:49Z|00279|binding|INFO|Releasing lport bea321e5-5628-434f-911b-989bbbc4badb from this chassis (sb_readonly=0)
Nov 23 10:01:49 np0005532585.localdomain kernel: device tapbea321e5-56 left promiscuous mode
Nov 23 10:01:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:49.104 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:49 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:49Z|00280|binding|INFO|Setting lport bea321e5-5628-434f-911b-989bbbc4badb down in Southbound
Nov 23 10:01:49 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:49.117 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe03:fd57/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=bea321e5-5628-434f-911b-989bbbc4badb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:49 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:49.119 160439 INFO neutron.agent.ovn.metadata.agent [-] Port bea321e5-5628-434f-911b-989bbbc4badb in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:01:49 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:49.122 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:49.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:49 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:49.122 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[66ac9ca1-8648-4532-82f4-e9afe9a6f359]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:49.386 263258 INFO neutron.agent.dhcp.agent [None req-65a49057-2d3e-4771-bb73-17afb6891ac6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:49 np0005532585.localdomain ceph-mon[300199]: pgmap v224: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Nov 23 10:01:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-169696af7c6c322034c0f6d82a43613a1e0ee2554520eba53c27cf09c69abb80-merged.mount: Deactivated successfully.
Nov 23 10:01:49 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:49 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:01:51 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:51.013 2 INFO neutron.agent.securitygroups_rpc [None req-fcaf6d85-3067-425b-90fc-65fb17c22c5c 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:51.433 263258 INFO neutron.agent.linux.ip_lib [None req-019f317e-bd0f-4094-9bd5-c52c775285c2 - - - - - -] Device tap9e1df90a-d3 cannot be used as it has no MAC address
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.503 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain kernel: device tap9e1df90a-d3 entered promiscuous mode
Nov 23 10:01:51 np0005532585.localdomain systemd-udevd[321894]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:01:51 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892111.5143] manager: (tap9e1df90a-d3): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:51Z|00281|binding|INFO|Claiming lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 for this chassis.
Nov 23 10:01:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:51Z|00282|binding|INFO|9e1df90a-d3d3-4908-848e-a0d5f6a57103: Claiming unknown
Nov 23 10:01:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:51Z|00283|binding|INFO|Setting lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 ovn-installed in OVS
Nov 23 10:01:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:51Z|00284|binding|INFO|Setting lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 up in Southbound
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.526 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:51.530 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe08:dc73/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=9e1df90a-d3d3-4908-848e-a0d5f6a57103) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:51.533 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9e1df90a-d3d3-4908-848e-a0d5f6a57103 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:01:51 np0005532585.localdomain ceph-mon[300199]: pgmap v225: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Nov 23 10:01:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:51.539 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port a4cbf391-da96-4b48-84ca-e56fadda95bf IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:51.540 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:51.541 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a01d95-5693-456e-877d-87e15919af79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.632 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:51Z|00285|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:01:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:51.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:52 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:52.185 2 INFO neutron.agent.securitygroups_rpc [None req-b422b6dc-3b22-4323-bd62-6ed72320b39a 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:52 np0005532585.localdomain podman[321964]: 2025-11-23 10:01:52.4527918 +0000 UTC m=+0.097515376 container create 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:01:52 np0005532585.localdomain systemd[1]: Started libpod-conmon-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a.scope.
Nov 23 10:01:52 np0005532585.localdomain systemd[1]: tmp-crun.P22Rh1.mount: Deactivated successfully.
Nov 23 10:01:52 np0005532585.localdomain podman[321964]: 2025-11-23 10:01:52.407806634 +0000 UTC m=+0.052530230 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:52 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:52 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d063cd5b145b767409251f60588cb8b0fd1438cff8445bf44f62f5e0c43fc386/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:52 np0005532585.localdomain podman[321964]: 2025-11-23 10:01:52.519832125 +0000 UTC m=+0.164555701 container init 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:01:52 np0005532585.localdomain podman[321964]: 2025-11-23 10:01:52.529531174 +0000 UTC m=+0.174254750 container start 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:01:52 np0005532585.localdomain dnsmasq[321982]: started, version 2.85 cachesize 150
Nov 23 10:01:52 np0005532585.localdomain dnsmasq[321982]: DNS service limited to local subnets
Nov 23 10:01:52 np0005532585.localdomain dnsmasq[321982]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:52 np0005532585.localdomain dnsmasq[321982]: warning: no upstream servers configured
Nov 23 10:01:52 np0005532585.localdomain dnsmasq-dhcp[321982]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:01:52 np0005532585.localdomain dnsmasq[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:52 np0005532585.localdomain dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:52 np0005532585.localdomain dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:52.586 263258 INFO neutron.agent.dhcp.agent [None req-019f317e-bd0f-4094-9bd5-c52c775285c2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:50Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bc2550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bc2490>], id=1d695bde-d397-4fcd-9710-c54ea9a58d0a, ip_allocation=immediate, mac_address=fa:16:3e:60:5c:31, name=tempest-NetworksTestDHCPv6-1597934444, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['042ef781-533a-42ed-9c75-26c4d12a8424'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:49Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1624, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:50Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:01:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:52.728 263258 INFO neutron.agent.dhcp.agent [None req-902c09cc-c236-4810-b741-d773e14881ae - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:52 np0005532585.localdomain dnsmasq[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:01:52 np0005532585.localdomain dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:01:52 np0005532585.localdomain dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:01:52 np0005532585.localdomain podman[321999]: 2025-11-23 10:01:52.767476644 +0000 UTC m=+0.057700479 container kill 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:01:52 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:52.955 263258 INFO neutron.agent.dhcp.agent [None req-ecf7f6d3-88b3-429b-8e12-1e120000a66d - - - - - -] DHCP configuration for ports {'1d695bde-d397-4fcd-9710-c54ea9a58d0a'} is completed
Nov 23 10:01:53 np0005532585.localdomain dnsmasq[321982]: exiting on receipt of SIGTERM
Nov 23 10:01:53 np0005532585.localdomain podman[322037]: 2025-11-23 10:01:53.159769869 +0000 UTC m=+0.060355521 container kill 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:53 np0005532585.localdomain systemd[1]: libpod-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a.scope: Deactivated successfully.
Nov 23 10:01:53 np0005532585.localdomain podman[322049]: 2025-11-23 10:01:53.232734857 +0000 UTC m=+0.060538447 container died 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:01:53 np0005532585.localdomain podman[322049]: 2025-11-23 10:01:53.266536808 +0000 UTC m=+0.094340358 container cleanup 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:01:53 np0005532585.localdomain systemd[1]: libpod-conmon-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a.scope: Deactivated successfully.
Nov 23 10:01:53 np0005532585.localdomain podman[322056]: 2025-11-23 10:01:53.309302605 +0000 UTC m=+0.128023785 container remove 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:53Z|00286|binding|INFO|Releasing lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 from this chassis (sb_readonly=0)
Nov 23 10:01:53 np0005532585.localdomain kernel: device tap9e1df90a-d3 left promiscuous mode
Nov 23 10:01:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:53Z|00287|binding|INFO|Setting lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 down in Southbound
Nov 23 10:01:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:53.353 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:53.361 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe08:dc73/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=9e1df90a-d3d3-4908-848e-a0d5f6a57103) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:53.363 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9e1df90a-d3d3-4908-848e-a0d5f6a57103 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:01:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:53.366 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:53.367 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe581da-f6a7-4902-a066-fbff417315f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:53.379 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d063cd5b145b767409251f60588cb8b0fd1438cff8445bf44f62f5e0c43fc386-merged.mount: Deactivated successfully.
Nov 23 10:01:53 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:53 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:53.542 2 INFO neutron.agent.securitygroups_rpc [None req-76b8a8df-4b24-4290-be38-6011d037c5af a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:53 np0005532585.localdomain ceph-mon[300199]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Nov 23 10:01:53 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:53.854 263258 INFO neutron.agent.dhcp.agent [None req-15fe0042-9576-47da-9ccb-4f21713ef5bd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:53 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:53.855 263258 INFO neutron.agent.dhcp.agent [None req-15fe0042-9576-47da-9ccb-4f21713ef5bd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:53 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:01:54 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:54.016 2 INFO neutron.agent.securitygroups_rpc [None req-988e37e0-0049-4c8c-be99-30016558f502 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']
Nov 23 10:01:54 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:54.204 263258 INFO neutron.agent.linux.ip_lib [None req-b456a874-9c5d-4705-83ba-28876b48e72b - - - - - -] Device tap911f7f8a-f8 cannot be used as it has no MAC address
Nov 23 10:01:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:54.227 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:54 np0005532585.localdomain kernel: device tap911f7f8a-f8 entered promiscuous mode
Nov 23 10:01:54 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892114.2341] manager: (tap911f7f8a-f8): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Nov 23 10:01:54 np0005532585.localdomain systemd-udevd[321896]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:01:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:54Z|00288|binding|INFO|Claiming lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b for this chassis.
Nov 23 10:01:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:54.236 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:54Z|00289|binding|INFO|911f7f8a-f8a0-4ea9-8b79-546ce102d99b: Claiming unknown
Nov 23 10:01:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:54.248 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-758e8da2-ad0b-4400-add6-179377986387', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-758e8da2-ad0b-4400-add6-179377986387', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb9791d358174b77957d83c427c41282', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e94e65-0414-426a-b507-976914cc35fa, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=911f7f8a-f8a0-4ea9-8b79-546ce102d99b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:54.249 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 911f7f8a-f8a0-4ea9-8b79-546ce102d99b in datapath 758e8da2-ad0b-4400-add6-179377986387 bound to our chassis
Nov 23 10:01:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:54.250 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 758e8da2-ad0b-4400-add6-179377986387 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:01:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:54.250 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a7916074-87ef-4afc-ba53-d68658697c0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:54Z|00290|binding|INFO|Setting lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b ovn-installed in OVS
Nov 23 10:01:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:54Z|00291|binding|INFO|Setting lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b up in Southbound
Nov 23 10:01:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:54.276 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:54.304 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:54.331 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:01:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:01:55 np0005532585.localdomain systemd[1]: tmp-crun.maFKg3.mount: Deactivated successfully.
Nov 23 10:01:55 np0005532585.localdomain podman[322119]: 2025-11-23 10:01:55.03512571 +0000 UTC m=+0.088790167 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:01:55 np0005532585.localdomain podman[322119]: 2025-11-23 10:01:55.073319767 +0000 UTC m=+0.126984244 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:01:55 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:01:55 np0005532585.localdomain podman[322118]: 2025-11-23 10:01:55.083707746 +0000 UTC m=+0.136365121 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 23 10:01:55 np0005532585.localdomain podman[322118]: 2025-11-23 10:01:55.219254642 +0000 UTC m=+0.271912027 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 10:01:55 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:01:55 np0005532585.localdomain podman[322182]: 2025-11-23 10:01:55.24613467 +0000 UTC m=+0.085479064 container create 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:55 np0005532585.localdomain systemd[1]: Started libpod-conmon-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137.scope.
Nov 23 10:01:55 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:55 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53a9620dc369a622449f987892c4bdf312e7e2ee540f9f927180e16e47bb5267/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:55 np0005532585.localdomain podman[322182]: 2025-11-23 10:01:55.300521906 +0000 UTC m=+0.139866270 container init 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:55 np0005532585.localdomain podman[322182]: 2025-11-23 10:01:55.209558943 +0000 UTC m=+0.048903337 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:55 np0005532585.localdomain podman[322182]: 2025-11-23 10:01:55.309057229 +0000 UTC m=+0.148401593 container start 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:55 np0005532585.localdomain dnsmasq[322200]: started, version 2.85 cachesize 150
Nov 23 10:01:55 np0005532585.localdomain dnsmasq[322200]: DNS service limited to local subnets
Nov 23 10:01:55 np0005532585.localdomain dnsmasq[322200]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:55 np0005532585.localdomain dnsmasq[322200]: warning: no upstream servers configured
Nov 23 10:01:55 np0005532585.localdomain dnsmasq-dhcp[322200]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:01:55 np0005532585.localdomain dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 0 addresses
Nov 23 10:01:55 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host
Nov 23 10:01:55 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts
Nov 23 10:01:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:55.418 263258 INFO neutron.agent.dhcp.agent [None req-a5a0a7e3-a14c-423e-af9d-86d86a12402d - - - - - -] DHCP configuration for ports {'7f83e24c-3fc0-4b80-9977-bf2ab2093c87'} is completed
Nov 23 10:01:55 np0005532585.localdomain ceph-mon[300199]: pgmap v227: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Nov 23 10:01:56 np0005532585.localdomain systemd[1]: tmp-crun.868tbO.mount: Deactivated successfully.
Nov 23 10:01:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:56.444 263258 INFO neutron.agent.linux.ip_lib [None req-78c8dcf5-78ff-474b-93c3-aea8a513c1f7 - - - - - -] Device tap1a9e922d-b6 cannot be used as it has no MAC address
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain kernel: device tap1a9e922d-b6 entered promiscuous mode
Nov 23 10:01:56 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892116.5228] manager: (tap1a9e922d-b6): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.525 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:56Z|00292|binding|INFO|Claiming lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 for this chassis.
Nov 23 10:01:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:56Z|00293|binding|INFO|1a9e922d-b6d2-47d7-9477-97162b14a8c2: Claiming unknown
Nov 23 10:01:56 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:56.532 2 INFO neutron.agent.securitygroups_rpc [None req-dca3126a-73f1-4b2f-a026-193f4196c9b1 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:56Z|00294|binding|INFO|Setting lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 ovn-installed in OVS
Nov 23 10:01:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:56Z|00295|binding|INFO|Setting lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 up in Southbound
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:56.536 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec5:9d71/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1a9e922d-b6d2-47d7-9477-97162b14a8c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:56.540 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1a9e922d-b6d2-47d7-9477-97162b14a8c2 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:01:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:56.545 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7bde6d51-54df-41c2-ad71-83a30ea62aca IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:01:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:56.545 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:56.546 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9d2913-af52-4d51-8465-38927f6a7f3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.547 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.561 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.609 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:56.646 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:56 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:56.966 2 INFO neutron.agent.securitygroups_rpc [None req-7d98cf13-9412-4d6e-887c-62597fa6d091 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']
Nov 23 10:01:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.041 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9410b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b500d0>], id=adfaba6e-566a-4b11-ba70-5bd47e5b5e70, ip_allocation=immediate, mac_address=fa:16:3e:2b:d5:66, name=tempest-ExtraDHCPOptionsTestJSON-1593894496, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:51Z, description=, dns_domain=, id=758e8da2-ad0b-4400-add6-179377986387, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-572811970, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44870, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1630, status=ACTIVE, subnets=['f0dbf111-ccce-4dbd-b3f1-dd923a995597'], tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:52Z, vlan_transparent=None, network_id=758e8da2-ad0b-4400-add6-179377986387, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7'], standard_attr_id=1665, status=DOWN, tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:56Z on network 758e8da2-ad0b-4400-add6-179377986387
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 1 addresses
Nov 23 10:01:57 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host
Nov 23 10:01:57 np0005532585.localdomain podman[322259]: 2025-11-23 10:01:57.305497739 +0000 UTC m=+0.059979798 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:01:57 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts
Nov 23 10:01:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.310 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:57 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:57.420 2 INFO neutron.agent.securitygroups_rpc [None req-bcaf3ac0-9e11-402c-a86f-10a7b58b202d 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:01:57 np0005532585.localdomain ceph-mon[300199]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 23 10:01:57 np0005532585.localdomain podman[322302]: 2025-11-23 10:01:57.538731655 +0000 UTC m=+0.090260023 container create 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:01:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a.scope.
Nov 23 10:01:57 np0005532585.localdomain systemd[1]: tmp-crun.nv6jM4.mount: Deactivated successfully.
Nov 23 10:01:57 np0005532585.localdomain podman[322302]: 2025-11-23 10:01:57.493962625 +0000 UTC m=+0.045491053 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:01:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:01:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d87b3fb9068fe77c6a0a8dd82cc1218221b11144b046412f21ec7c3919a34f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:01:57 np0005532585.localdomain podman[322302]: 2025-11-23 10:01:57.623569998 +0000 UTC m=+0.175098356 container init 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:01:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.629 263258 INFO neutron.agent.dhcp.agent [None req-ad127182-d28a-48d1-a1ca-789913d793a5 - - - - - -] DHCP configuration for ports {'adfaba6e-566a-4b11-ba70-5bd47e5b5e70'} is completed
Nov 23 10:01:57 np0005532585.localdomain podman[322302]: 2025-11-23 10:01:57.632613846 +0000 UTC m=+0.184142204 container start 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322322]: started, version 2.85 cachesize 150
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322322]: DNS service limited to local subnets
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322322]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322322]: warning: no upstream servers configured
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322322]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.694 263258 INFO neutron.agent.dhcp.agent [None req-78c8dcf5-78ff-474b-93c3-aea8a513c1f7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b33ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b338b0>], id=6e16e3f9-6a29-4b10-8e83-0542549b123f, ip_allocation=immediate, mac_address=fa:16:3e:d4:2c:f5, name=tempest-NetworksTestDHCPv6-105209193, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['7c1d7f06-fd27-4e33-82bf-f56ea0e372a0'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:53Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1659, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:56Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:01:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.765 263258 INFO neutron.agent.dhcp.agent [None req-b2d156fd-497c-4709-9e61-56285a3424ba - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:01:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:01:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 2279 writes, 24K keys, 2279 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s
                                                           Cumulative WAL: 2279 writes, 2279 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2279 writes, 24K keys, 2279 commit groups, 1.0 writes per commit group, ingest: 41.37 MB, 0.07 MB/s
                                                           Interval WAL: 2279 writes, 2279 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    194.8      0.16              0.08        11    0.015       0      0       0.0       0.0
                                                             L6      1/0   15.29 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.2    227.2    206.7      0.81              0.45        10    0.081    124K   5047       0.0       0.0
                                                            Sum      1/0   15.29 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   6.2    188.9    204.7      0.97              0.52        21    0.046    124K   5047       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   6.2    189.5    205.3      0.97              0.52        20    0.048    124K   5047       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   0.0    227.2    206.7      0.81              0.45        10    0.081    124K   5047       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    198.1      0.16              0.08        10    0.016       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.031, interval 0.031
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.19 GB write, 0.33 MB/s write, 0.18 GB read, 0.31 MB/s read, 1.0 seconds
                                                           Interval compaction: 0.19 GB write, 0.33 MB/s write, 0.18 GB read, 0.31 MB/s read, 1.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5651615b9350#2 capacity: 308.00 MB usage: 17.93 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000113 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(889,17.12 MB,5.5569%) FilterBlock(21,369.11 KB,0.117032%) IndexBlock(21,470.17 KB,0.149075%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 23 10:01:57 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:57.855 2 INFO neutron.agent.securitygroups_rpc [None req-640caa93-9dd6-4d73-895a-6c48cb53e831 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']
Nov 23 10:01:57 np0005532585.localdomain dnsmasq[322322]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:01:57 np0005532585.localdomain podman[322341]: 2025-11-23 10:01:57.883540126 +0000 UTC m=+0.059253296 container kill 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:01:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.887 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:57Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a54490>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a54790>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a543a0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a544c0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a54580>], id=006191dd-6d3f-4d8a-bc53-137fd3dda03f, ip_allocation=immediate, mac_address=fa:16:3e:41:23:16, name=tempest-ExtraDHCPOptionsTestJSON-716825327, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:51Z, description=, dns_domain=, id=758e8da2-ad0b-4400-add6-179377986387, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-572811970, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44870, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1630, status=ACTIVE, subnets=['f0dbf111-ccce-4dbd-b3f1-dd923a995597'], tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:52Z, vlan_transparent=None, network_id=758e8da2-ad0b-4400-add6-179377986387, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7'], standard_attr_id=1666, 
status=DOWN, tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:57Z on network 758e8da2-ad0b-4400-add6-179377986387
Nov 23 10:01:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:58.091 263258 INFO neutron.agent.dhcp.agent [None req-cdd49f53-a920-4b5a-8e68-3506f1547c4c - - - - - -] DHCP configuration for ports {'6e16e3f9-6a29-4b10-8e83-0542549b123f'} is completed
Nov 23 10:01:58 np0005532585.localdomain dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 2 addresses
Nov 23 10:01:58 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host
Nov 23 10:01:58 np0005532585.localdomain podman[322381]: 2025-11-23 10:01:58.194383822 +0000 UTC m=+0.105737528 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:01:58 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts
Nov 23 10:01:58 np0005532585.localdomain dnsmasq[322322]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:01:58 np0005532585.localdomain podman[322403]: 2025-11-23 10:01:58.22741949 +0000 UTC m=+0.064154677 container kill 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:01:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:01:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:58.461 263258 INFO neutron.agent.dhcp.agent [None req-8465bbf6-1c54-4086-a7f4-b51f82a7d2b5 - - - - - -] DHCP configuration for ports {'006191dd-6d3f-4d8a-bc53-137fd3dda03f'} is completed
Nov 23 10:01:58 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:58.481 2 INFO neutron.agent.securitygroups_rpc [None req-00590570-c613-4100-977f-94a0fcdda7a2 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']
Nov 23 10:01:58 np0005532585.localdomain dnsmasq[322322]: exiting on receipt of SIGTERM
Nov 23 10:01:58 np0005532585.localdomain podman[322459]: 2025-11-23 10:01:58.674530003 +0000 UTC m=+0.066241131 container kill 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:01:58 np0005532585.localdomain systemd[1]: tmp-crun.T48bJu.mount: Deactivated successfully.
Nov 23 10:01:58 np0005532585.localdomain systemd[1]: libpod-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a.scope: Deactivated successfully.
Nov 23 10:01:58 np0005532585.localdomain dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 1 addresses
Nov 23 10:01:58 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host
Nov 23 10:01:58 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts
Nov 23 10:01:58 np0005532585.localdomain podman[322477]: 2025-11-23 10:01:58.747110739 +0000 UTC m=+0.088033763 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:01:58 np0005532585.localdomain podman[322493]: 2025-11-23 10:01:58.791334441 +0000 UTC m=+0.093949055 container died 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:01:58 np0005532585.localdomain podman[322493]: 2025-11-23 10:01:58.83773482 +0000 UTC m=+0.140349404 container remove 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:01:58 np0005532585.localdomain systemd[1]: libpod-conmon-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a.scope: Deactivated successfully.
Nov 23 10:01:58 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:58Z|00296|binding|INFO|Releasing lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 from this chassis (sb_readonly=0)
Nov 23 10:01:58 np0005532585.localdomain kernel: device tap1a9e922d-b6 left promiscuous mode
Nov 23 10:01:58 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:01:58Z|00297|binding|INFO|Setting lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 down in Southbound
Nov 23 10:01:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:58.855 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:58.865 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec5:9d71/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1a9e922d-b6d2-47d7-9477-97162b14a8c2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:01:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:58.867 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1a9e922d-b6d2-47d7-9477-97162b14a8c2 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:01:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:58.869 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:01:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:01:58.870 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7c232b5f-dddb-41bd-b27c-c48ec392ff14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:01:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:01:58.873 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:01:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:58.909 263258 INFO neutron.agent.dhcp.agent [None req-a4c642f5-7cf9-48d1-abe1-ff098b5c01d8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad9a90>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8baeb80>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bae130>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c9410c40>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bae5b0>], id=adfaba6e-566a-4b11-ba70-5bd47e5b5e70, ip_allocation=immediate, mac_address=fa:16:3e:2b:d5:66, name=tempest-new-port-name-306650585, network_id=758e8da2-ad0b-4400-add6-179377986387, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7'], standard_attr_id=1665, status=DOWN, tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:58Z on network 758e8da2-ad0b-4400-add6-179377986387
Nov 23 10:01:59 np0005532585.localdomain dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 1 addresses
Nov 23 10:01:59 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host
Nov 23 10:01:59 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts
Nov 23 10:01:59 np0005532585.localdomain podman[322545]: 2025-11-23 10:01:59.141401165 +0000 UTC m=+0.063951711 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:01:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:59.195 263258 INFO neutron.agent.dhcp.agent [None req-62da2b23-e1a6-4f95-9c02-432067fbf0e6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:01:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0d87b3fb9068fe77c6a0a8dd82cc1218221b11144b046412f21ec7c3919a34f0-merged.mount: Deactivated successfully.
Nov 23 10:01:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:01:59 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:01:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:01:59.455 263258 INFO neutron.agent.dhcp.agent [None req-fac1d73c-c18e-48aa-b6ed-40ff0f813cd0 - - - - - -] DHCP configuration for ports {'adfaba6e-566a-4b11-ba70-5bd47e5b5e70'} is completed
Nov 23 10:01:59 np0005532585.localdomain ceph-mon[300199]: pgmap v229: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 23 10:01:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:01:59.661 2 INFO neutron.agent.securitygroups_rpc [None req-a72c44cb-60bc-4c8a-a77d-8d03ce0529ac 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']
Nov 23 10:01:59 np0005532585.localdomain systemd[1]: tmp-crun.p65Fgr.mount: Deactivated successfully.
Nov 23 10:01:59 np0005532585.localdomain dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 0 addresses
Nov 23 10:01:59 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host
Nov 23 10:01:59 np0005532585.localdomain dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts
Nov 23 10:01:59 np0005532585.localdomain podman[322583]: 2025-11-23 10:01:59.89373561 +0000 UTC m=+0.053906931 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:01:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:01:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:01:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:01:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:02:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:00 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:02:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:02:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:02:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:02:00 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:00.285 2 INFO neutron.agent.securitygroups_rpc [None req-b6b7fea1-91f0-4eed-a7d1-b3a653a6eb31 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:00 np0005532585.localdomain dnsmasq[322200]: exiting on receipt of SIGTERM
Nov 23 10:02:00 np0005532585.localdomain podman[322622]: 2025-11-23 10:02:00.399425309 +0000 UTC m=+0.061011641 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:02:00 np0005532585.localdomain systemd[1]: libpod-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137.scope: Deactivated successfully.
Nov 23 10:02:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:00Z|00298|binding|INFO|Removing iface tap911f7f8a-f8 ovn-installed in OVS
Nov 23 10:02:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:00Z|00299|binding|INFO|Removing lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b ovn-installed in OVS
Nov 23 10:02:00 np0005532585.localdomain podman[322634]: 2025-11-23 10:02:00.469156907 +0000 UTC m=+0.059138143 container died 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.469 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.470 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 22fb3b05-2f99-4740-ad3a-43fe5cf32d1d with type ""
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.471 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-758e8da2-ad0b-4400-add6-179377986387', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-758e8da2-ad0b-4400-add6-179377986387', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb9791d358174b77957d83c427c41282', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e94e65-0414-426a-b507-976914cc35fa, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=911f7f8a-f8a0-4ea9-8b79-546ce102d99b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.473 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 911f7f8a-f8a0-4ea9-8b79-546ce102d99b in datapath 758e8da2-ad0b-4400-add6-179377986387 unbound from our chassis
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.475 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 758e8da2-ad0b-4400-add6-179377986387, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.476 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.478 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[84a65e8e-6cdf-4451-b6b5-4bb988572f82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:00 np0005532585.localdomain systemd[1]: tmp-crun.QBLesf.mount: Deactivated successfully.
Nov 23 10:02:00 np0005532585.localdomain podman[322634]: 2025-11-23 10:02:00.502008289 +0000 UTC m=+0.091989495 container cleanup 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:02:00 np0005532585.localdomain systemd[1]: libpod-conmon-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137.scope: Deactivated successfully.
Nov 23 10:02:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3910451066' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3910451066' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:00.531 263258 INFO neutron.agent.linux.ip_lib [None req-fb62524a-61ab-4b4c-9f08-b7ad33c8aa00 - - - - - -] Device tap1eae7604-c5 cannot be used as it has no MAC address
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.556 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain kernel: device tap1eae7604-c5 entered promiscuous mode
Nov 23 10:02:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:00Z|00300|binding|INFO|Claiming lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 for this chassis.
Nov 23 10:02:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:00Z|00301|binding|INFO|1eae7604-c5d1-42c0-ba22-91f336e21eb6: Claiming unknown
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.564 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892120.5653] manager: (tap1eae7604-c5): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Nov 23 10:02:00 np0005532585.localdomain systemd-udevd[322668]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:00Z|00302|binding|INFO|Setting lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 ovn-installed in OVS
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.573 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:00Z|00303|binding|INFO|Setting lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 up in Southbound
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.577 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.578 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec2:891e/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1eae7604-c5d1-42c0-ba22-91f336e21eb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.580 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1eae7604-c5d1-42c0-ba22-91f336e21eb6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.583 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port e7b38be6-fb06-4882-a743-601ba140a474 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.583 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:00 np0005532585.localdomain podman[322641]: 2025-11-23 10:02:00.584150189 +0000 UTC m=+0.158512024 container remove 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:02:00 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:00.584 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[bb19d044-46ea-4426-a191-a4da3d41ca4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:00 np0005532585.localdomain kernel: device tap911f7f8a-f8 left promiscuous mode
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.599 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:00.629 263258 INFO neutron.agent.dhcp.agent [None req-6dbabc39-1466-416b-bc6e-eb67358223e2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.634 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:00.661 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:00.798 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:02:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 7241 writes, 30K keys, 7241 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 7241 writes, 1653 syncs, 4.38 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 2297 writes, 8467 keys, 2297 commit groups, 1.0 writes per commit group, ingest: 8.52 MB, 0.01 MB/s
                                                          Interval WAL: 2297 writes, 988 syncs, 2.32 writes per sync, written: 0.01 GB, 0.01 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:02:01 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:01.029 2 INFO neutron.agent.securitygroups_rpc [None req-632d2016-3c1b-4f91-9e80-c737b7a909f7 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:01 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:01Z|00304|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:01.102 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-53a9620dc369a622449f987892c4bdf312e7e2ee540f9f927180e16e47bb5267-merged.mount: Deactivated successfully.
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d758e8da2\x2dad0b\x2d4400\x2dadd6\x2d179377986387.mount: Deactivated successfully.
Nov 23 10:02:01 np0005532585.localdomain podman[322722]: 
Nov 23 10:02:01 np0005532585.localdomain podman[322722]: 2025-11-23 10:02:01.478413887 +0000 UTC m=+0.064174748 container create 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: Started libpod-conmon-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3.scope.
Nov 23 10:02:01 np0005532585.localdomain ceph-mon[300199]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:01 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9417ceb2b435d04eebc44b707f3c79893d5ae8ab0419cd784f36ee62d1553931/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:01 np0005532585.localdomain podman[322722]: 2025-11-23 10:02:01.447495275 +0000 UTC m=+0.033256156 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:01 np0005532585.localdomain podman[322722]: 2025-11-23 10:02:01.551144678 +0000 UTC m=+0.136905569 container init 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:01.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:01 np0005532585.localdomain podman[322722]: 2025-11-23 10:02:01.557761451 +0000 UTC m=+0.143522312 container start 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:02:01 np0005532585.localdomain dnsmasq[322740]: started, version 2.85 cachesize 150
Nov 23 10:02:01 np0005532585.localdomain dnsmasq[322740]: DNS service limited to local subnets
Nov 23 10:02:01 np0005532585.localdomain dnsmasq[322740]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:01 np0005532585.localdomain dnsmasq[322740]: warning: no upstream servers configured
Nov 23 10:02:01 np0005532585.localdomain dnsmasq-dhcp[322740]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:01.565 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:01 np0005532585.localdomain dnsmasq[322740]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:01 np0005532585.localdomain dnsmasq-dhcp[322740]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:01 np0005532585.localdomain dnsmasq-dhcp[322740]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:01 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:01.721 263258 INFO neutron.agent.dhcp.agent [None req-32946da0-bb78-43b4-82d5-69cf34036c1b - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:01 np0005532585.localdomain dnsmasq[322740]: exiting on receipt of SIGTERM
Nov 23 10:02:01 np0005532585.localdomain podman[322758]: 2025-11-23 10:02:01.886794908 +0000 UTC m=+0.058341838 container kill 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: libpod-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3.scope: Deactivated successfully.
Nov 23 10:02:01 np0005532585.localdomain podman[322770]: 2025-11-23 10:02:01.952813291 +0000 UTC m=+0.053988714 container died 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:02:01 np0005532585.localdomain podman[322770]: 2025-11-23 10:02:01.98814877 +0000 UTC m=+0.089324123 container cleanup 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:01 np0005532585.localdomain systemd[1]: libpod-conmon-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3.scope: Deactivated successfully.
Nov 23 10:02:02 np0005532585.localdomain podman[322777]: 2025-11-23 10:02:02.037068457 +0000 UTC m=+0.124048083 container remove 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:02Z|00305|binding|INFO|Releasing lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 from this chassis (sb_readonly=0)
Nov 23 10:02:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:02.049 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:02Z|00306|binding|INFO|Setting lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 down in Southbound
Nov 23 10:02:02 np0005532585.localdomain kernel: device tap1eae7604-c5 left promiscuous mode
Nov 23 10:02:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:02.059 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec2:891e/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1eae7604-c5d1-42c0-ba22-91f336e21eb6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:02.061 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1eae7604-c5d1-42c0-ba22-91f336e21eb6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:02:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:02.063 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:02.064 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a5903e21-0c30-4c53-9df8-161e319d9660]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:02.066 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:02.380 2 INFO neutron.agent.securitygroups_rpc [None req-daacf7a5-9b59-487b-876d-20ffffe4895d 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9417ceb2b435d04eebc44b707f3c79893d5ae8ab0419cd784f36ee62d1553931-merged.mount: Deactivated successfully.
Nov 23 10:02:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:02 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:02:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:02.813 2 INFO neutron.agent.securitygroups_rpc [None req-7b96091d-4d68-47d5-ad40-bc5e46f787b5 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:02.970 263258 INFO neutron.agent.linux.ip_lib [None req-cc73e387-1f57-4444-b655-bd413813bfc9 - - - - - -] Device tap62791e54-5e cannot be used as it has no MAC address
Nov 23 10:02:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:02.996 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain kernel: device tap62791e54-5e entered promiscuous mode
Nov 23 10:02:03 np0005532585.localdomain systemd-udevd[322670]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:03.004 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892123.0048] manager: (tap62791e54-5e): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Nov 23 10:02:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:03Z|00307|binding|INFO|Claiming lport 62791e54-5e64-445a-ab2c-3e33a17041f0 for this chassis.
Nov 23 10:02:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:03Z|00308|binding|INFO|62791e54-5e64-445a-ab2c-3e33a17041f0: Claiming unknown
Nov 23 10:02:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:03Z|00309|binding|INFO|Setting lport 62791e54-5e64-445a-ab2c-3e33a17041f0 ovn-installed in OVS
Nov 23 10:02:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:03.022 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:03Z|00310|binding|INFO|Setting lport 62791e54-5e64-445a-ab2c-3e33a17041f0 up in Southbound
Nov 23 10:02:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:03.024 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=62791e54-5e64-445a-ab2c-3e33a17041f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:03.026 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 62791e54-5e64-445a-ab2c-3e33a17041f0 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:02:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:03.028 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port b6aa7198-8a06-47c3-9895-f076ba88da04 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:02:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:03.028 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:03.028 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6009671c-6d11-4103-bbb7-c79269669e21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:03.034 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:03.042 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:03.079 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:03.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3963313061' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3963313061' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:03 np0005532585.localdomain ceph-mon[300199]: pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail
Nov 23 10:02:03 np0005532585.localdomain podman[322863]: 
Nov 23 10:02:03 np0005532585.localdomain podman[322863]: 2025-11-23 10:02:03.963509992 +0000 UTC m=+0.092856202 container create ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: Started libpod-conmon-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb.scope.
Nov 23 10:02:04 np0005532585.localdomain podman[322863]: 2025-11-23 10:02:03.919503696 +0000 UTC m=+0.048849936 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: tmp-crun.C0okSU.mount: Deactivated successfully.
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:04 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4da620b16801967997c7241de13bd854d2a48763c81d260fa8a5c934b2c99f79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:04 np0005532585.localdomain podman[322863]: 2025-11-23 10:02:04.045548589 +0000 UTC m=+0.174894799 container init ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:04 np0005532585.localdomain podman[322863]: 2025-11-23 10:02:04.054489805 +0000 UTC m=+0.183836025 container start ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:02:04 np0005532585.localdomain dnsmasq[322881]: started, version 2.85 cachesize 150
Nov 23 10:02:04 np0005532585.localdomain dnsmasq[322881]: DNS service limited to local subnets
Nov 23 10:02:04 np0005532585.localdomain dnsmasq[322881]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:04 np0005532585.localdomain dnsmasq[322881]: warning: no upstream servers configured
Nov 23 10:02:04 np0005532585.localdomain dnsmasq-dhcp[322881]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:04 np0005532585.localdomain dnsmasq[322881]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses
Nov 23 10:02:04 np0005532585.localdomain dnsmasq-dhcp[322881]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:04 np0005532585.localdomain dnsmasq-dhcp[322881]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.216 263258 INFO neutron.agent.dhcp.agent [None req-4a694251-3129-465d-b182-42a3f4e8c1c3 - - - - - -] DHCP configuration for ports {'aea50331-ae74-4951-a33a-a93d73ae2d3e', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:04 np0005532585.localdomain dnsmasq[322881]: exiting on receipt of SIGTERM
Nov 23 10:02:04 np0005532585.localdomain podman[322897]: 2025-11-23 10:02:04.36011192 +0000 UTC m=+0.058820773 container kill ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: libpod-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb.scope: Deactivated successfully.
Nov 23 10:02:04 np0005532585.localdomain podman[322911]: 2025-11-23 10:02:04.432590743 +0000 UTC m=+0.060455464 container died ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:04 np0005532585.localdomain podman[322911]: 2025-11-23 10:02:04.463642289 +0000 UTC m=+0.091506950 container cleanup ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: libpod-conmon-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb.scope: Deactivated successfully.
Nov 23 10:02:04 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:04.479 2 INFO neutron.agent.securitygroups_rpc [None req-69ba403f-ba50-44e1-b8d0-5abce2396fa5 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:04 np0005532585.localdomain podman[322913]: 2025-11-23 10:02:04.508694387 +0000 UTC m=+0.127566061 container remove ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:02:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:04Z|00311|binding|INFO|Releasing lport 62791e54-5e64-445a-ab2c-3e33a17041f0 from this chassis (sb_readonly=0)
Nov 23 10:02:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:04.522 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:04 np0005532585.localdomain kernel: device tap62791e54-5e left promiscuous mode
Nov 23 10:02:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:04Z|00312|binding|INFO|Setting lport 62791e54-5e64-445a-ab2c-3e33a17041f0 down in Southbound
Nov 23 10:02:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:04.535 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=62791e54-5e64-445a-ab2c-3e33a17041f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:04.537 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 62791e54-5e64-445a-ab2c-3e33a17041f0 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:02:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:04.539 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:04.539 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6cab7de9-3e49-46f1-b977-769b76d3a2fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:04.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:02:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2789 syncs, 3.74 writes per sync, written: 0.04 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4513 writes, 15K keys, 4513 commit groups, 1.0 writes per commit group, ingest: 16.92 MB, 0.03 MB/s
                                                          Interval WAL: 4513 writes, 1925 syncs, 2.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:02:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.790 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.791 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.791 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.792 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4da620b16801967997c7241de13bd854d2a48763c81d260fa8a5c934b2c99f79-merged.mount: Deactivated successfully.
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:04 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:02:05 np0005532585.localdomain ceph-mon[300199]: pgmap v232: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 938 B/s wr, 15 op/s
Nov 23 10:02:05 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:02:05 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3821055355' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:05 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:02:05 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3821055355' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:02:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:02:06 np0005532585.localdomain podman[322941]: 2025-11-23 10:02:06.031466096 +0000 UTC m=+0.086614350 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 10:02:06 np0005532585.localdomain podman[322941]: 2025-11-23 10:02:06.040707421 +0000 UTC m=+0.095855645 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:02:06 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:02:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:06.078 263258 INFO neutron.agent.linux.ip_lib [None req-3816ae8e-7ce5-49fc-ab68-7ae269013f24 - - - - - -] Device tapc97f425d-0e cannot be used as it has no MAC address
Nov 23 10:02:06 np0005532585.localdomain podman[322942]: 2025-11-23 10:02:06.082622782 +0000 UTC m=+0.133619487 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.105 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain kernel: device tapc97f425d-0e entered promiscuous mode
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892126.1124] manager: (tapc97f425d-0e): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Nov 23 10:02:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:06Z|00313|binding|INFO|Claiming lport c97f425d-0e2f-4212-85de-246b74cc4178 for this chassis.
Nov 23 10:02:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:06Z|00314|binding|INFO|c97f425d-0e2f-4212-85de-246b74cc4178: Claiming unknown
Nov 23 10:02:06 np0005532585.localdomain podman[322942]: 2025-11-23 10:02:06.114523895 +0000 UTC m=+0.165520650 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:02:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:06Z|00315|binding|INFO|Setting lport c97f425d-0e2f-4212-85de-246b74cc4178 ovn-installed in OVS
Nov 23 10:02:06 np0005532585.localdomain systemd-udevd[322991]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.118 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:06Z|00316|binding|INFO|Setting lport c97f425d-0e2f-4212-85de-246b74cc4178 up in Southbound
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:06.151 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7a:311b/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=c97f425d-0e2f-4212-85de-246b74cc4178) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:06.153 160439 INFO neutron.agent.ovn.metadata.agent [-] Port c97f425d-0e2f-4212-85de-246b74cc4178 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:02:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:06.155 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port c19594a7-8cd2-4205-b5d1-3fd9bc8de3f5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:02:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:06.155 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:06.156 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f97d43b0-89e8-4b53-9d8a-68c198c4aaa7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapc97f425d-0e: No such device
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.182 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:06.464 2 INFO neutron.agent.securitygroups_rpc [None req-027c2ba7-b3e9-44a3-b67a-0b2bdc3ad43a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3821055355' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3821055355' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:06.567 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:06 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:06.597 2 INFO neutron.agent.securitygroups_rpc [None req-027c2ba7-b3e9-44a3-b67a-0b2bdc3ad43a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:06.996 263258 INFO neutron.agent.linux.ip_lib [None req-00365568-31c1-4a8e-b7e0-06fa1d25d2f7 - - - - - -] Device tap4888f66e-2a cannot be used as it has no MAC address
Nov 23 10:02:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:07.023 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:07 np0005532585.localdomain kernel: device tap4888f66e-2a entered promiscuous mode
Nov 23 10:02:07 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892127.0293] manager: (tap4888f66e-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Nov 23 10:02:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:07.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:07Z|00317|binding|INFO|Claiming lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 for this chassis.
Nov 23 10:02:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:07Z|00318|binding|INFO|4888f66e-2a7b-4114-aa4a-94f38d09c793: Claiming unknown
Nov 23 10:02:07 np0005532585.localdomain systemd-udevd[322994]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:07.047 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cf4fe09-32ab-44a4-bc92-e7a5f2b64203, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=4888f66e-2a7b-4114-aa4a-94f38d09c793) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:07.049 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4888f66e-2a7b-4114-aa4a-94f38d09c793 in datapath e392c0bc-bd43-40a4-a7d7-6e0130e48060 bound to our chassis
Nov 23 10:02:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:07.051 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e392c0bc-bd43-40a4-a7d7-6e0130e48060 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:07.052 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[519e9eef-161c-46d9-94bd-50ab07aba296]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain podman[323067]: 
Nov 23 10:02:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:07Z|00319|binding|INFO|Setting lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 ovn-installed in OVS
Nov 23 10:02:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:07Z|00320|binding|INFO|Setting lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 up in Southbound
Nov 23 10:02:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:07.063 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain podman[323067]: 2025-11-23 10:02:07.0700686 +0000 UTC m=+0.093702947 container create 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:07Z|00321|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:07.100 2 INFO neutron.agent.securitygroups_rpc [None req-ed1fae3b-c78a-4940-9647-1237b664bff1 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:07 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4888f66e-2a: No such device
Nov 23 10:02:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:07.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:07 np0005532585.localdomain systemd[1]: Started libpod-conmon-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e.scope.
Nov 23 10:02:07 np0005532585.localdomain podman[323067]: 2025-11-23 10:02:07.025028233 +0000 UTC m=+0.048662610 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:07.124 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:07.139 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:07 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fe118de034c9dcbe06dbd482ab6b32f3c606b4624d6aae6333387bec542e04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:07 np0005532585.localdomain podman[323067]: 2025-11-23 10:02:07.172258499 +0000 UTC m=+0.195892836 container init 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:07 np0005532585.localdomain podman[323067]: 2025-11-23 10:02:07.180180502 +0000 UTC m=+0.203814839 container start 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:02:07 np0005532585.localdomain dnsmasq[323115]: started, version 2.85 cachesize 150
Nov 23 10:02:07 np0005532585.localdomain dnsmasq[323115]: DNS service limited to local subnets
Nov 23 10:02:07 np0005532585.localdomain dnsmasq[323115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:07 np0005532585.localdomain dnsmasq[323115]: warning: no upstream servers configured
Nov 23 10:02:07 np0005532585.localdomain dnsmasq[323115]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:07.254 2 INFO neutron.agent.securitygroups_rpc [None req-57d318cc-de08-4b5e-a1a2-97e0506818c3 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:07.324 263258 INFO neutron.agent.dhcp.agent [None req-ee5cd8f9-998a-4438-9d3c-0b83bc860687 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:07 np0005532585.localdomain dnsmasq[323115]: exiting on receipt of SIGTERM
Nov 23 10:02:07 np0005532585.localdomain podman[323146]: 2025-11-23 10:02:07.528685748 +0000 UTC m=+0.065519460 container kill 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 10:02:07 np0005532585.localdomain systemd[1]: libpod-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e.scope: Deactivated successfully.
Nov 23 10:02:07 np0005532585.localdomain ceph-mon[300199]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 938 B/s wr, 15 op/s
Nov 23 10:02:07 np0005532585.localdomain podman[323165]: 2025-11-23 10:02:07.601085259 +0000 UTC m=+0.054751098 container died 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:02:07 np0005532585.localdomain podman[323165]: 2025-11-23 10:02:07.627344107 +0000 UTC m=+0.081009916 container cleanup 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:02:07 np0005532585.localdomain systemd[1]: libpod-conmon-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e.scope: Deactivated successfully.
Nov 23 10:02:07 np0005532585.localdomain podman[323167]: 2025-11-23 10:02:07.678242585 +0000 UTC m=+0.125998942 container remove 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:07 np0005532585.localdomain podman[323217]: 
Nov 23 10:02:07 np0005532585.localdomain podman[323217]: 2025-11-23 10:02:07.963553364 +0000 UTC m=+0.088283791 container create 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 23 10:02:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:07.982 2 INFO neutron.agent.securitygroups_rpc [None req-fccb08e0-0b04-4e9e-9f46-388ae08a5315 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:07 np0005532585.localdomain systemd[1]: Started libpod-conmon-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d.scope.
Nov 23 10:02:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.004 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:08 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:08 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85794c23db926913628cec6db6b73789ec462daef8098b4cddf986aad56e49c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:08 np0005532585.localdomain podman[323217]: 2025-11-23 10:02:08.021282483 +0000 UTC m=+0.146012910 container init 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:02:08 np0005532585.localdomain podman[323217]: 2025-11-23 10:02:07.924067358 +0000 UTC m=+0.048797835 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:08 np0005532585.localdomain podman[323217]: 2025-11-23 10:02:08.031505198 +0000 UTC m=+0.156235635 container start 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:08 np0005532585.localdomain dnsmasq[323235]: started, version 2.85 cachesize 150
Nov 23 10:02:08 np0005532585.localdomain dnsmasq[323235]: DNS service limited to local subnets
Nov 23 10:02:08 np0005532585.localdomain dnsmasq[323235]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:08 np0005532585.localdomain dnsmasq[323235]: warning: no upstream servers configured
Nov 23 10:02:08 np0005532585.localdomain dnsmasq-dhcp[323235]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Nov 23 10:02:08 np0005532585.localdomain dnsmasq[323235]: read /var/lib/neutron/dhcp/e392c0bc-bd43-40a4-a7d7-6e0130e48060/addn_hosts - 0 addresses
Nov 23 10:02:08 np0005532585.localdomain dnsmasq-dhcp[323235]: read /var/lib/neutron/dhcp/e392c0bc-bd43-40a4-a7d7-6e0130e48060/host
Nov 23 10:02:08 np0005532585.localdomain dnsmasq-dhcp[323235]: read /var/lib/neutron/dhcp/e392c0bc-bd43-40a4-a7d7-6e0130e48060/opts
Nov 23 10:02:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.039 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a3fe118de034c9dcbe06dbd482ab6b32f3c606b4624d6aae6333387bec542e04-merged.mount: Deactivated successfully.
Nov 23 10:02:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.172 263258 INFO neutron.agent.dhcp.agent [None req-d624a739-ef82-4e54-98ae-a94d64679190 - - - - - -] DHCP configuration for ports {'475fd203-4bae-4a13-96a3-3a4ff6625465'} is completed
Nov 23 10:02:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:08.227 2 INFO neutron.agent.securitygroups_rpc [None req-ccd00d64-5365-45b4-a98a-49c5095f8557 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e116 e116: 6 total, 6 up, 6 in
Nov 23 10:02:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.753 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:08 np0005532585.localdomain podman[323287]: 
Nov 23 10:02:09 np0005532585.localdomain podman[323287]: 2025-11-23 10:02:09.006819882 +0000 UTC m=+0.086369820 container create 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:02:09 np0005532585.localdomain systemd[1]: Started libpod-conmon-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa.scope.
Nov 23 10:02:09 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:09 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6787532a03028bbb9352430e4ee923335f00773afda3bc4b41f6a4b5bdbf779c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:09 np0005532585.localdomain podman[323287]: 2025-11-23 10:02:09.061565259 +0000 UTC m=+0.141115117 container init 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:02:09 np0005532585.localdomain podman[323287]: 2025-11-23 10:02:08.964033335 +0000 UTC m=+0.043583213 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:09 np0005532585.localdomain podman[323287]: 2025-11-23 10:02:09.06840547 +0000 UTC m=+0.147955368 container start 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: started, version 2.85 cachesize 150
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: DNS service limited to local subnets
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: warning: no upstream servers configured
Nov 23 10:02:09 np0005532585.localdomain dnsmasq-dhcp[323305]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:02:09 np0005532585.localdomain dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:09 np0005532585.localdomain dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:09.128 263258 INFO neutron.agent.dhcp.agent [None req-80643ad2-522f-449b-989f-ac461e6ca619 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:06Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a394f0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a39a90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a39c40>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a39460>], id=829fd08c-2cab-4a3b-8d7c-a1d746a46be5, ip_allocation=immediate, mac_address=fa:16:3e:30:b5:e7, name=tempest-NetworksTestDHCPv6-1529993794, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['117abf22-f342-429e-a859-0a947d5758f7', 'd72bfe7d-5cae-4bc9-8351-3414fff06dc1'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:06Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], 
standard_attr_id=1746, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:07Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:02:09 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:09.136 2 INFO neutron.agent.securitygroups_rpc [None req-4a4cd2de-ce5b-469c-97bb-90c17373d140 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']
Nov 23 10:02:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:09.292 263258 INFO neutron.agent.dhcp.agent [None req-ebc69321-ccee-4eeb-974c-63c06a80ad1d - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', '829fd08c-2cab-4a3b-8d7c-a1d746a46be5', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:09.301 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:02:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:09.302 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:02:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:02:09 np0005532585.localdomain podman[323323]: 2025-11-23 10:02:09.309875359 +0000 UTC m=+0.066080097 container kill 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:02:09 np0005532585.localdomain dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:09 np0005532585.localdomain dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:09.517 263258 INFO neutron.agent.dhcp.agent [None req-d599d47a-4620-41bc-b0a4-07672c4f0d6b - - - - - -] DHCP configuration for ports {'829fd08c-2cab-4a3b-8d7c-a1d746a46be5'} is completed
Nov 23 10:02:09 np0005532585.localdomain ceph-mon[300199]: osdmap e116: 6 total, 6 up, 6 in
Nov 23 10:02:09 np0005532585.localdomain ceph-mon[300199]: pgmap v235: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 3.0 KiB/s wr, 49 op/s
Nov 23 10:02:09 np0005532585.localdomain dnsmasq[323305]: exiting on receipt of SIGTERM
Nov 23 10:02:09 np0005532585.localdomain podman[323360]: 2025-11-23 10:02:09.722372796 +0000 UTC m=+0.065141768 container kill 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:02:09 np0005532585.localdomain systemd[1]: libpod-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa.scope: Deactivated successfully.
Nov 23 10:02:09 np0005532585.localdomain podman[323375]: 2025-11-23 10:02:09.804728632 +0000 UTC m=+0.060686650 container died 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:09 np0005532585.localdomain podman[323375]: 2025-11-23 10:02:09.847727168 +0000 UTC m=+0.103685156 container remove 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:02:09 np0005532585.localdomain systemd[1]: libpod-conmon-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa.scope: Deactivated successfully.
Nov 23 10:02:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6787532a03028bbb9352430e4ee923335f00773afda3bc4b41f6a4b5bdbf779c-merged.mount: Deactivated successfully.
Nov 23 10:02:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:10.545 263258 INFO neutron.agent.dhcp.agent [None req-16bec703-84f3-4c4a-82cf-01da7f4b94a4 - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:11.229 263258 INFO neutron.agent.linux.ip_lib [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Device tapa448321b-77 cannot be used as it has no MAC address
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain kernel: device tapa448321b-77 entered promiscuous mode
Nov 23 10:02:11 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892131.2921] manager: (tapa448321b-77): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Nov 23 10:02:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:11Z|00322|binding|INFO|Claiming lport a448321b-7787-4303-b318-0f7b37915029 for this chassis.
Nov 23 10:02:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:11Z|00323|binding|INFO|a448321b-7787-4303-b318-0f7b37915029: Claiming unknown
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain systemd-udevd[323410]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:11.305 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=083064ad-1b8e-4a49-952e-9175ecca48be, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a448321b-7787-4303-b318-0f7b37915029) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:11.307 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a448321b-7787-4303-b318-0f7b37915029 in datapath 4267129b-8796-478e-a8a2-d9eb57ec8730 bound to our chassis
Nov 23 10:02:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:11.309 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4267129b-8796-478e-a8a2-d9eb57ec8730 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:11.310 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9e60e5a6-0e2d-427f-b351-a754ecdcc38e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:11Z|00324|binding|INFO|Setting lport a448321b-7787-4303-b318-0f7b37915029 ovn-installed in OVS
Nov 23 10:02:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:11Z|00325|binding|INFO|Setting lport a448321b-7787-4303-b318-0f7b37915029 up in Southbound
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa448321b-77: No such device
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.375 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.404 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:11.569 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:11 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:11.779 2 INFO neutron.agent.securitygroups_rpc [None req-714530e0-ede7-43a6-b513-9acc4fe9127a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:02:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:02:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:02:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 10:02:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:02:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1"
Nov 23 10:02:12 np0005532585.localdomain podman[323509]: 
Nov 23 10:02:12 np0005532585.localdomain podman[323509]: 2025-11-23 10:02:12.304329474 +0000 UTC m=+0.091553101 container create b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:02:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a.scope.
Nov 23 10:02:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:12.358 2 INFO neutron.agent.securitygroups_rpc [None req-c5ba658c-cffc-41f4-af3e-933fb394a1b2 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']
Nov 23 10:02:12 np0005532585.localdomain podman[323509]: 2025-11-23 10:02:12.260331309 +0000 UTC m=+0.047554956 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:12 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0c0334387ce4a5254ea7688c110c9fde45e19d5a61b9cda522ce0194ebbeff3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:12 np0005532585.localdomain podman[323509]: 2025-11-23 10:02:12.470165773 +0000 UTC m=+0.257389400 container init b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:02:12 np0005532585.localdomain podman[323509]: 2025-11-23 10:02:12.478468859 +0000 UTC m=+0.265692476 container start b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323543]: started, version 2.85 cachesize 150
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323543]: DNS service limited to local subnets
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323543]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323543]: warning: no upstream servers configured
Nov 23 10:02:12 np0005532585.localdomain dnsmasq-dhcp[323543]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323543]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/addn_hosts - 0 addresses
Nov 23 10:02:12 np0005532585.localdomain dnsmasq-dhcp[323543]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/host
Nov 23 10:02:12 np0005532585.localdomain dnsmasq-dhcp[323543]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/opts
Nov 23 10:02:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:12.540 2 INFO neutron.agent.securitygroups_rpc [None req-c5ba658c-cffc-41f4-af3e-933fb394a1b2 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.542 263258 INFO neutron.agent.dhcp.agent [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:10Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a6edc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c9410c10>], id=36e753ee-ee5c-438f-999a-778d9317f27a, ip_allocation=immediate, mac_address=fa:16:3e:ed:a8:f0, name=tempest-PortsIpV6TestJSON-1794941810, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:09Z, description=, dns_domain=, id=4267129b-8796-478e-a8a2-d9eb57ec8730, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-375175599, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51530, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1763, status=ACTIVE, subnets=['9657efcc-5829-4691-aba6-3cbf999666ad'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:10Z, vlan_transparent=None, network_id=4267129b-8796-478e-a8a2-d9eb57ec8730, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d7ead8f7-80d5-4103-ab91-19b87956485a'], standard_attr_id=1779, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:11Z on network 4267129b-8796-478e-a8a2-d9eb57ec8730
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323543]: exiting on receipt of SIGTERM
Nov 23 10:02:12 np0005532585.localdomain systemd[1]: libpod-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a.scope: Deactivated successfully.
Nov 23 10:02:12 np0005532585.localdomain podman[323550]: 2025-11-23 10:02:12.588639782 +0000 UTC m=+0.080034157 container died b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.588 263258 ERROR neutron.agent.linux.utils [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Exit code: 1; Cmd: ['/etc/neutron/kill_scripts/dnsmasq-kill', 'HUP', 323543]; Stdin: ; Stdout: Sun Nov 23 10:02:12 AM UTC 2025 No such PID: 323543
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: ; Stderr: Cannot open network namespace: No such file or directory
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Unable to reload_allocations dhcp for 4267129b-8796-478e-a8a2-d9eb57ec8730.: neutron_lib.exceptions.ProcessExecutionError: Exit code: 1; Cmd: ['/etc/neutron/kill_scripts/dnsmasq-kill', 'HUP', 323543]; Stdin: ; Stdout: Sun Nov 23 10:02:12 AM UTC 2025 No such PID: 323543
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: ; Stderr: Cannot open network namespace: No such file or directory
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 671, in reload_allocations
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     self._spawn_or_reload_process(reload_with_HUP=True)
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 603, in _spawn_or_reload_process
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     pm.enable(reload_cfg=reload_with_HUP, ensure_active=True)
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 108, in enable
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     self.reload_cfg()
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 117, in reload_cfg
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     self.disable('HUP', delete_pid_file=False)
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 132, in disable
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     utils.execute(cmd, addl_env=self.cmd_addl_env,
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py", line 156, in execute
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent     raise exceptions.ProcessExecutionError(msg,
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent neutron_lib.exceptions.ProcessExecutionError: Exit code: 1; Cmd: ['/etc/neutron/kill_scripts/dnsmasq-kill', 'HUP', 323543]; Stdin: ; Stdout: Sun Nov 23 10:02:12 AM UTC 2025 No such PID: 323543
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent ; Stderr: Cannot open network namespace: No such file or directory
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent 
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent 
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.606 263258 INFO neutron.agent.dhcp.agent [None req-3d493ef3-82b2-4169-8cde-5402b06203db - - - - - -] DHCP configuration for ports {'1b960541-5cae-4b07-9a4c-5f405e43807e'} is completed
Nov 23 10:02:12 np0005532585.localdomain podman[323550]: 2025-11-23 10:02:12.618288126 +0000 UTC m=+0.109682431 container cleanup b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:02:12 np0005532585.localdomain podman[323566]: 2025-11-23 10:02:12.661322431 +0000 UTC m=+0.066481899 container cleanup b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:02:12 np0005532585.localdomain systemd[1]: libpod-conmon-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a.scope: Deactivated successfully.
Nov 23 10:02:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:12.695 2 INFO neutron.agent.securitygroups_rpc [None req-972382fb-338a-4217-8905-e67aa90103fe 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:12 np0005532585.localdomain podman[323578]: 2025-11-23 10:02:12.71125032 +0000 UTC m=+0.079023366 container remove b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.716 263258 INFO neutron.agent.dhcp.agent [None req-b7d11b5f-b09d-4d01-875f-b7e040b9179f - - - - - -] DHCP configuration for ports {'36e753ee-ee5c-438f-999a-778d9317f27a'} is completed
Nov 23 10:02:12 np0005532585.localdomain podman[323598]: 
Nov 23 10:02:12 np0005532585.localdomain podman[323598]: 2025-11-23 10:02:12.819204165 +0000 UTC m=+0.088748195 container create e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:02:12 np0005532585.localdomain systemd[1]: Started libpod-conmon-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339.scope.
Nov 23 10:02:12 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:12 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53352129b85bc25f3136ae709db7dcaa40905d3bf7355e014a7309f881e89f02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:12 np0005532585.localdomain podman[323598]: 2025-11-23 10:02:12.778305835 +0000 UTC m=+0.047849905 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:12 np0005532585.localdomain podman[323598]: 2025-11-23 10:02:12.881332278 +0000 UTC m=+0.150876308 container init e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:02:12 np0005532585.localdomain podman[323598]: 2025-11-23 10:02:12.891968166 +0000 UTC m=+0.161512186 container start e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323616]: started, version 2.85 cachesize 150
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323616]: DNS service limited to local subnets
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323616]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323616]: warning: no upstream servers configured
Nov 23 10:02:12 np0005532585.localdomain dnsmasq-dhcp[323616]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:12 np0005532585.localdomain dnsmasq[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:12 np0005532585.localdomain dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:12 np0005532585.localdomain dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.956 263258 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Nov 23 10:02:12 np0005532585.localdomain ceph-mon[300199]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 3.0 KiB/s wr, 49 op/s
Nov 23 10:02:13 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:13.206 2 INFO neutron.agent.securitygroups_rpc [None req-e36b687d-978f-4757-a141-a3de7329fae8 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']
Nov 23 10:02:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.295 263258 INFO neutron.agent.dhcp.agent [None req-fdec6d1d-4c7e-426c-ac37-18163735a0c1 - - - - - -] All active networks have been fetched through RPC.
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.296 263258 INFO neutron.agent.dhcp.agent [-] Starting network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.297 263258 INFO neutron.agent.dhcp.agent [-] Finished network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.297 263258 INFO neutron.agent.dhcp.agent [-] Starting network 4267129b-8796-478e-a8a2-d9eb57ec8730 dhcp configuration
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.301 263258 INFO neutron.agent.dhcp.agent [None req-ed93737c-d55e-4174-a55d-940593f9459b - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d0c0334387ce4a5254ea7688c110c9fde45e19d5a61b9cda522ce0194ebbeff3-merged.mount: Deactivated successfully.
Nov 23 10:02:13 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.409 263258 INFO neutron.agent.dhcp.agent [None req-77a37596-627f-4059-9200-c97046b46160 - - - - - -] Finished network 4267129b-8796-478e-a8a2-d9eb57ec8730 dhcp configuration
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.409 263258 INFO neutron.agent.dhcp.agent [None req-fdec6d1d-4c7e-426c-ac37-18163735a0c1 - - - - - -] Synchronizing state complete
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.545 263258 INFO neutron.agent.dhcp.agent [None req-a300361f-abab-4ba8-85d4-d92bfc23a2a5 - - - - - -] DHCP configuration for ports {'1b960541-5cae-4b07-9a4c-5f405e43807e', '36e753ee-ee5c-438f-999a-778d9317f27a'} is completed
Nov 23 10:02:13 np0005532585.localdomain dnsmasq[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:13 np0005532585.localdomain dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:13 np0005532585.localdomain dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:13 np0005532585.localdomain podman[323637]: 2025-11-23 10:02:13.585410689 +0000 UTC m=+0.032854494 container kill e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:02:13 np0005532585.localdomain podman[323680]: 
Nov 23 10:02:13 np0005532585.localdomain podman[323680]: 2025-11-23 10:02:13.859245014 +0000 UTC m=+0.073161275 container create 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:13 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:13.865 2 INFO neutron.agent.securitygroups_rpc [None req-072dc881-3667-4cc2-b265-d09f039c6880 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']
Nov 23 10:02:13 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:13.892 2 INFO neutron.agent.securitygroups_rpc [None req-a4d2f133-97c5-4bde-ae86-9705c747c91a 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:13 np0005532585.localdomain podman[323680]: 2025-11-23 10:02:13.816198498 +0000 UTC m=+0.030114779 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:13 np0005532585.localdomain systemd[1]: Started libpod-conmon-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae.scope.
Nov 23 10:02:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.921 263258 INFO neutron.agent.dhcp.agent [None req-7e1eb57e-69a5-4977-b4bb-223fad13a781 - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:13 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:13 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d911125f48d63f87fc5e495f0553a4d7d8756cf37aca9beb20580dcfaf22988/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:13Z|00326|binding|INFO|Removing iface tapa448321b-77 ovn-installed in OVS
Nov 23 10:02:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:13Z|00327|binding|INFO|Removing lport a448321b-7787-4303-b318-0f7b37915029 ovn-installed in OVS
Nov 23 10:02:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:13.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:13.973 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 53a85ccd-a35f-4a4b-800b-d124b2117401 with type ""
Nov 23 10:02:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:13.975 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=083064ad-1b8e-4a49-952e-9175ecca48be, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a448321b-7787-4303-b318-0f7b37915029) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:13.976 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a448321b-7787-4303-b318-0f7b37915029 in datapath 4267129b-8796-478e-a8a2-d9eb57ec8730 unbound from our chassis
Nov 23 10:02:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:13.978 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:13.978 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4267129b-8796-478e-a8a2-d9eb57ec8730 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:13.979 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddc62a3-fac1-4eb8-ae36-0290cdc36b70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:13 np0005532585.localdomain ceph-mon[300199]: pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 772 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 3.0 KiB/s wr, 49 op/s
Nov 23 10:02:14 np0005532585.localdomain podman[323680]: 2025-11-23 10:02:14.001718262 +0000 UTC m=+0.215634543 container init 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:02:14 np0005532585.localdomain podman[323680]: 2025-11-23 10:02:14.0145972 +0000 UTC m=+0.228513511 container start 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323727]: started, version 2.85 cachesize 150
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323727]: DNS service limited to local subnets
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323727]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323727]: warning: no upstream servers configured
Nov 23 10:02:14 np0005532585.localdomain dnsmasq-dhcp[323727]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323727]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/addn_hosts - 0 addresses
Nov 23 10:02:14 np0005532585.localdomain dnsmasq-dhcp[323727]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/host
Nov 23 10:02:14 np0005532585.localdomain dnsmasq-dhcp[323727]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/opts
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323616]: exiting on receipt of SIGTERM
Nov 23 10:02:14 np0005532585.localdomain podman[323714]: 2025-11-23 10:02:14.075132735 +0000 UTC m=+0.091157990 container kill e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: libpod-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339.scope: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain podman[323733]: 2025-11-23 10:02:14.141344104 +0000 UTC m=+0.050809366 container died e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:14 np0005532585.localdomain podman[323733]: 2025-11-23 10:02:14.191591342 +0000 UTC m=+0.101056554 container remove e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: libpod-conmon-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339.scope: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-53352129b85bc25f3136ae709db7dcaa40905d3bf7355e014a7309f881e89f02-merged.mount: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain dnsmasq[323727]: exiting on receipt of SIGTERM
Nov 23 10:02:14 np0005532585.localdomain podman[323780]: 2025-11-23 10:02:14.482958577 +0000 UTC m=+0.069789871 container kill 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: libpod-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae.scope: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain podman[323794]: 2025-11-23 10:02:14.559682692 +0000 UTC m=+0.054951665 container died 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain podman[323794]: 2025-11-23 10:02:14.664340425 +0000 UTC m=+0.159609418 container remove 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:14 np0005532585.localdomain systemd[1]: libpod-conmon-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae.scope: Deactivated successfully.
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.678 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:14 np0005532585.localdomain kernel: device tapa448321b-77 left promiscuous mode
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.690 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.691 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.691 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.692 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:14.741 263258 INFO neutron.agent.dhcp.agent [None req-1d83807c-834f-4f0c-93d7-c7972c1e7f84 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:14.742 263258 INFO neutron.agent.dhcp.agent [None req-1d83807c-834f-4f0c-93d7-c7972c1e7f84 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:14.822 2 INFO neutron.agent.securitygroups_rpc [None req-5062901d-b55d-4422-a232-c0dc20b0538f 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']
Nov 23 10:02:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:14.841 2 INFO neutron.agent.securitygroups_rpc [None req-e9c4a6cc-b7de-4f6d-9d17-231261e88eb0 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:14Z|00328|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:14.902 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3d911125f48d63f87fc5e495f0553a4d7d8756cf37aca9beb20580dcfaf22988-merged.mount: Deactivated successfully.
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d4267129b\x2d8796\x2d478e\x2da8a2\x2dd9eb57ec8730.mount: Deactivated successfully.
Nov 23 10:02:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:15.393 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:02:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:15.416 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:02:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:15.417 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:02:15 np0005532585.localdomain ceph-mon[300199]: pgmap v238: 177 pgs: 177 active+clean; 185 MiB data, 860 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 4.0 MiB/s wr, 47 op/s
Nov 23 10:02:15 np0005532585.localdomain podman[323869]: 
Nov 23 10:02:15 np0005532585.localdomain podman[323869]: 2025-11-23 10:02:15.776235488 +0000 UTC m=+0.085711482 container create 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: Started libpod-conmon-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2.scope.
Nov 23 10:02:15 np0005532585.localdomain podman[323869]: 2025-11-23 10:02:15.727026682 +0000 UTC m=+0.036502686 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:15 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a260f749b17063c012086c3799d50fbbbd0ca0af59a66791b567b9b74ab4b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:15 np0005532585.localdomain podman[323869]: 2025-11-23 10:02:15.858415939 +0000 UTC m=+0.167891913 container init 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:15 np0005532585.localdomain dnsmasq[323920]: started, version 2.85 cachesize 150
Nov 23 10:02:15 np0005532585.localdomain dnsmasq[323920]: DNS service limited to local subnets
Nov 23 10:02:15 np0005532585.localdomain dnsmasq[323920]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:15 np0005532585.localdomain dnsmasq[323920]: warning: no upstream servers configured
Nov 23 10:02:15 np0005532585.localdomain dnsmasq-dhcp[323920]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:02:15 np0005532585.localdomain dnsmasq-dhcp[323920]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:15 np0005532585.localdomain dnsmasq[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:15 np0005532585.localdomain dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:15 np0005532585.localdomain dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:15 np0005532585.localdomain podman[323885]: 2025-11-23 10:02:15.888691152 +0000 UTC m=+0.061344461 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:02:15 np0005532585.localdomain podman[323885]: 2025-11-23 10:02:15.900223887 +0000 UTC m=+0.072877216 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:02:15 np0005532585.localdomain podman[323884]: 2025-11-23 10:02:15.948300968 +0000 UTC m=+0.123863836 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:02:15 np0005532585.localdomain podman[323884]: 2025-11-23 10:02:15.977926421 +0000 UTC m=+0.153489319 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 23 10:02:15 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:02:15 np0005532585.localdomain podman[323883]: 2025-11-23 10:02:15.991663854 +0000 UTC m=+0.172063862 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:16 np0005532585.localdomain podman[323869]: 2025-11-23 10:02:16.018173091 +0000 UTC m=+0.327649095 container start 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:16 np0005532585.localdomain podman[323883]: 2025-11-23 10:02:16.037434464 +0000 UTC m=+0.217834502 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:16 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:02:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.083 263258 INFO neutron.agent.dhcp.agent [None req-a42b5eb9-623e-47d5-ba45-0dfad10fa636 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:13Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f9d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1fca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1fac0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f9a0>], id=aeaf5260-7927-48ee-b05a-2716bc3b0e31, ip_allocation=immediate, mac_address=fa:16:3e:90:84:2b, name=tempest-NetworksTestDHCPv6-1598434446, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['bb1da575-b028-485d-9f9a-eeac7df574a4', 'bf360b77-a47e-40bd-a4e7-951f05df64e6'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:13Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], 
standard_attr_id=1805, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:13Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:02:16 np0005532585.localdomain podman[323967]: 2025-11-23 10:02:16.271137074 +0000 UTC m=+0.047215586 container kill 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:02:16 np0005532585.localdomain dnsmasq[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:02:16 np0005532585.localdomain dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:16 np0005532585.localdomain dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.391 263258 INFO neutron.agent.dhcp.agent [None req-5e056cbf-ae92-4a97-94fc-0533aae81090 - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.513 263258 INFO neutron.agent.linux.ip_lib [None req-24be83e4-27be-47bf-b575-0cc802bea2ef - - - - - -] Device taped2c180f-1a cannot be used as it has no MAC address
Nov 23 10:02:16 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.522 263258 INFO neutron.agent.dhcp.agent [None req-0fd5e769-1010-4017-b655-6bd1904ab3a2 - - - - - -] DHCP configuration for ports {'aeaf5260-7927-48ee-b05a-2716bc3b0e31'} is completed
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain kernel: device taped2c180f-1a entered promiscuous mode
Nov 23 10:02:16 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892136.5470] manager: (taped2c180f-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Nov 23 10:02:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:16Z|00329|binding|INFO|Claiming lport ed2c180f-1a61-4a88-a761-adcb953abd22 for this chassis.
Nov 23 10:02:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:16Z|00330|binding|INFO|ed2c180f-1a61-4a88-a761-adcb953abd22: Claiming unknown
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.549 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain systemd-udevd[323999]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.557 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.569 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10f8dd7c838246c58f1d2c4efc771237', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bd8c251-3bb7-4e15-9c8d-7fcd2e804fa5, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ed2c180f-1a61-4a88-a761-adcb953abd22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.571 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ed2c180f-1a61-4a88-a761-adcb953abd22 in datapath 31b977a7-a37c-42ba-bed9-7b22959f6310 bound to our chassis
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.572 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 31b977a7-a37c-42ba-bed9-7b22959f6310 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.573 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ab29bcf2-a0f9-490c-8ce5-3c3d984df964]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:16Z|00331|binding|INFO|Setting lport ed2c180f-1a61-4a88-a761-adcb953abd22 ovn-installed in OVS
Nov 23 10:02:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:16Z|00332|binding|INFO|Setting lport ed2c180f-1a61-4a88-a761-adcb953abd22 up in Southbound
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.599 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.645 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.682 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain dnsmasq[323920]: exiting on receipt of SIGTERM
Nov 23 10:02:16 np0005532585.localdomain podman[324024]: 2025-11-23 10:02:16.734023703 +0000 UTC m=+0.056417159 container kill 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:02:16 np0005532585.localdomain systemd[1]: libpod-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2.scope: Deactivated successfully.
Nov 23 10:02:16 np0005532585.localdomain podman[324040]: 2025-11-23 10:02:16.814666587 +0000 UTC m=+0.068109729 container died 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:02:16 np0005532585.localdomain podman[324040]: 2025-11-23 10:02:16.851008667 +0000 UTC m=+0.104451779 container cleanup 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:02:16 np0005532585.localdomain systemd[1]: libpod-conmon-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2.scope: Deactivated successfully.
Nov 23 10:02:16 np0005532585.localdomain podman[324047]: 2025-11-23 10:02:16.903639638 +0000 UTC m=+0.143095609 container remove 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:02:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:16Z|00333|binding|INFO|Releasing lport c97f425d-0e2f-4212-85de-246b74cc4178 from this chassis (sb_readonly=0)
Nov 23 10:02:16 np0005532585.localdomain kernel: device tapc97f425d-0e left promiscuous mode
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.926 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:16Z|00334|binding|INFO|Setting lport c97f425d-0e2f-4212-85de-246b74cc4178 down in Southbound
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.940 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fe7a:311b/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=c97f425d-0e2f-4212-85de-246b74cc4178) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.942 160439 INFO neutron.agent.ovn.metadata.agent [-] Port c97f425d-0e2f-4212-85de-246b74cc4178 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.945 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:16.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:16 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:16.946 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9a07e739-126c-493c-a315-6c89e51a4b63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:17.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:17.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:02:17 np0005532585.localdomain systemd[1]: tmp-crun.HZ4mM1.mount: Deactivated successfully.
Nov 23 10:02:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f6a260f749b17063c012086c3799d50fbbbd0ca0af59a66791b567b9b74ab4b9-merged.mount: Deactivated successfully.
Nov 23 10:02:17 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:17 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:02:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e117 e117: 6 total, 6 up, 6 in
Nov 23 10:02:17 np0005532585.localdomain podman[324114]: 2025-11-23 10:02:17.667435637 +0000 UTC m=+0.090570401 container create c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:02:17 np0005532585.localdomain podman[324114]: 2025-11-23 10:02:17.625049592 +0000 UTC m=+0.048184356 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:17 np0005532585.localdomain systemd[1]: Started libpod-conmon-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99.scope.
Nov 23 10:02:17 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:17 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d30fb1fb9da3ec81fc09dcff1705485fc67933914b93a576884d8b0f75a2cf79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:17 np0005532585.localdomain podman[324114]: 2025-11-23 10:02:17.757435849 +0000 UTC m=+0.180570573 container init c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:02:17 np0005532585.localdomain podman[324114]: 2025-11-23 10:02:17.766426936 +0000 UTC m=+0.189561660 container start c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:17 np0005532585.localdomain dnsmasq[324132]: started, version 2.85 cachesize 150
Nov 23 10:02:17 np0005532585.localdomain dnsmasq[324132]: DNS service limited to local subnets
Nov 23 10:02:17 np0005532585.localdomain dnsmasq[324132]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:17 np0005532585.localdomain dnsmasq[324132]: warning: no upstream servers configured
Nov 23 10:02:17 np0005532585.localdomain dnsmasq-dhcp[324132]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:02:17 np0005532585.localdomain dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 0 addresses
Nov 23 10:02:17 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host
Nov 23 10:02:17 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts
Nov 23 10:02:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:17.901 263258 INFO neutron.agent.dhcp.agent [None req-b34f390c-0a7a-43ec-a164-5558df2022e3 - - - - - -] DHCP configuration for ports {'3aead29e-e222-4eec-a2bd-bde9a205e26f'} is completed
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:18 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:18.304 263258 INFO neutron.agent.linux.ip_lib [None req-6dc1892b-fcee-480c-a98c-87ce793d8969 - - - - - -] Device tapd4981345-81 cannot be used as it has no MAC address
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.362 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:18 np0005532585.localdomain kernel: device tapd4981345-81 entered promiscuous mode
Nov 23 10:02:18 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892138.3658] manager: (tapd4981345-81): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Nov 23 10:02:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:18Z|00335|binding|INFO|Claiming lport d4981345-81c9-4678-a4d9-29762c427058 for this chassis.
Nov 23 10:02:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:18Z|00336|binding|INFO|d4981345-81c9-4678-a4d9-29762c427058: Claiming unknown
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.366 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:18 np0005532585.localdomain systemd-udevd[324004]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:18.376 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:18Z|00337|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 ovn-installed in OVS
Nov 23 10:02:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:18Z|00338|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 up in Southbound
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.382 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:18.378 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:02:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:18.382 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 39a83a1b-6025-4506-bc53-55b3409a5751 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:02:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:18.382 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:18.383 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef6011e-c209-46ca-8d1e-31845a2c6ac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:18 np0005532585.localdomain ceph-mon[300199]: osdmap e117: 6 total, 6 up, 6 in
Nov 23 10:02:18 np0005532585.localdomain ceph-mon[300199]: pgmap v240: 177 pgs: 177 active+clean; 185 MiB data, 860 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 4.6 MiB/s wr, 53 op/s
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.446 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:18.471 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e118 e118: 6 total, 6 up, 6 in
Nov 23 10:02:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3211337825' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:19 np0005532585.localdomain ceph-mon[300199]: osdmap e118: 6 total, 6 up, 6 in
Nov 23 10:02:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1242920347' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:19.685 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:19 np0005532585.localdomain podman[324195]: 2025-11-23 10:02:19.917182632 +0000 UTC m=+0.089812018 container create a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:19 np0005532585.localdomain systemd[1]: Started libpod-conmon-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc.scope.
Nov 23 10:02:19 np0005532585.localdomain podman[324195]: 2025-11-23 10:02:19.873063202 +0000 UTC m=+0.045692608 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:19 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:19 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e33c56067c23739cb4aa49806f9afa9d048c85db65aa7d7ecb65ed0e58e4d8d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:19 np0005532585.localdomain podman[324195]: 2025-11-23 10:02:19.989065887 +0000 UTC m=+0.161695263 container init a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:02:19 np0005532585.localdomain podman[324195]: 2025-11-23 10:02:19.996696051 +0000 UTC m=+0.169325437 container start a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324213]: started, version 2.85 cachesize 150
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324213]: DNS service limited to local subnets
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324213]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324213]: warning: no upstream servers configured
Nov 23 10:02:20 np0005532585.localdomain dnsmasq-dhcp[324213]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324213]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:20 np0005532585.localdomain dnsmasq-dhcp[324213]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:20 np0005532585.localdomain dnsmasq-dhcp[324213]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain kernel: device tapd4981345-81 left promiscuous mode
Nov 23 10:02:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:20Z|00339|binding|INFO|Releasing lport d4981345-81c9-4678-a4d9-29762c427058 from this chassis (sb_readonly=0)
Nov 23 10:02:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:20Z|00340|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 down in Southbound
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.088 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee0:b47d/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.090 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.093 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.093 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d8489335-1274-4f3d-8f15-a7cb5a866243]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:20.186 263258 INFO neutron.agent.dhcp.agent [None req-86bfa4b5-a3d3-49f7-8528-1648cb586bcd - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.238 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:02:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:20.362 263258 INFO neutron.agent.linux.ip_lib [None req-918d971d-3b75-4a67-bf2f-71e45f4e5fd6 - - - - - -] Device tapb261f9c2-83 cannot be used as it has no MAC address
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.387 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain kernel: device tapb261f9c2-83 entered promiscuous mode
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:20Z|00341|binding|INFO|Claiming lport b261f9c2-83e7-4a95-9c5e-261d26463cae for this chassis.
Nov 23 10:02:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:20Z|00342|binding|INFO|b261f9c2-83e7-4a95-9c5e-261d26463cae: Claiming unknown
Nov 23 10:02:20 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892140.3973] manager: (tapb261f9c2-83): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.417 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.418 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe6b:a1ef/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a73b2c2-bad6-4a3d-a15f-3ef982ba513d, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b261f9c2-83e7-4a95-9c5e-261d26463cae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:20 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e119 e119: 6 total, 6 up, 6 in
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.421 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b261f9c2-83e7-4a95-9c5e-261d26463cae in datapath 6957ce07-4d9c-4d1d-a573-5f6518e6601d bound to our chassis
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.423 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6957ce07-4d9c-4d1d-a573-5f6518e6601d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.424 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0e12f8-1003-4282-9f65-bd4587511dc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.432 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8:0:1:f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:20Z|00343|binding|INFO|Setting lport b261f9c2-83e7-4a95-9c5e-261d26463cae ovn-installed in OVS
Nov 23 10:02:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:20Z|00344|binding|INFO|Setting lport b261f9c2-83e7-4a95-9c5e-261d26463cae up in Southbound
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.437 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.443 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:20 np0005532585.localdomain ceph-mon[300199]: pgmap v242: 177 pgs: 8 active+clean+snaptrim, 6 active+clean+snaptrim_wait, 163 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 14 MiB/s wr, 58 op/s
Nov 23 10:02:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/840348542' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4067616315' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:20.445 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9add3d-4c1f-4ed0-886e-f5ba99adb27b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.483 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:20.576 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:20Z, description=, device_id=6c96453a-c777-4431-b133-5c4197796c3e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a6e0a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a6e220>], id=ef47ee52-a1a9-471d-b433-3cfaa989690c, ip_allocation=immediate, mac_address=fa:16:3e:a6:dd:58, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:14Z, description=, dns_domain=, id=31b977a7-a37c-42ba-bed9-7b22959f6310, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1570024778-network, port_security_enabled=True, project_id=10f8dd7c838246c58f1d2c4efc771237, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1812, status=ACTIVE, subnets=['ef80155c-ac2e-4709-9882-d7bc38017108'], tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:15Z, vlan_transparent=None, network_id=31b977a7-a37c-42ba-bed9-7b22959f6310, port_security_enabled=False, project_id=10f8dd7c838246c58f1d2c4efc771237, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1857, status=DOWN, tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:20Z on network 31b977a7-a37c-42ba-bed9-7b22959f6310
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.757 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 1 addresses
Nov 23 10:02:20 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host
Nov 23 10:02:20 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts
Nov 23 10:02:20 np0005532585.localdomain podman[324286]: 2025-11-23 10:02:20.820296552 +0000 UTC m=+0.039096825 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.828 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:02:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:20.829 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:02:20 np0005532585.localdomain dnsmasq[324213]: exiting on receipt of SIGTERM
Nov 23 10:02:20 np0005532585.localdomain podman[324323]: 2025-11-23 10:02:20.96598109 +0000 UTC m=+0.052933820 container kill a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:02:20 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:20.965 2 INFO neutron.agent.securitygroups_rpc [None req-da0f81f0-068a-49d7-b6f1-50fa28f5c3fa 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:20 np0005532585.localdomain systemd[1]: libpod-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc.scope: Deactivated successfully.
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.011 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:02:21 np0005532585.localdomain podman[324343]: 2025-11-23 10:02:21.012422531 +0000 UTC m=+0.037333160 container died a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.012 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11186MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.013 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.013 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:02:21 np0005532585.localdomain podman[324343]: 2025-11-23 10:02:21.037885886 +0000 UTC m=+0.062796485 container cleanup a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: libpod-conmon-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc.scope: Deactivated successfully.
Nov 23 10:02:21 np0005532585.localdomain podman[324350]: 2025-11-23 10:02:21.08150646 +0000 UTC m=+0.098327360 container remove a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.084 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.084 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.084 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:02:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.112 263258 INFO neutron.agent.linux.ip_lib [None req-5de1e31a-d4a1-4578-b4dd-c1804f8c816f - - - - - -] Device tapd4981345-81 cannot be used as it has no MAC address
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.125 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:02:21 np0005532585.localdomain kernel: device tapd4981345-81 entered promiscuous mode
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892141.1618] manager: (tapd4981345-81): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Nov 23 10:02:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.161 263258 INFO neutron.agent.dhcp.agent [None req-27e88670-3a53-453b-aa97-bd110b992645 - - - - - -] DHCP configuration for ports {'ef47ee52-a1a9-471d-b433-3cfaa989690c'} is completed
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00345|binding|INFO|Claiming lport d4981345-81c9-4678-a4d9-29762c427058 for this chassis.
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00346|binding|INFO|d4981345-81c9-4678-a4d9-29762c427058: Claiming unknown
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.164 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00347|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 ovn-installed in OVS
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.170 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.175 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00348|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 up in Southbound
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.177 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee0:b47d/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.180 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.183 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 39a83a1b-6025-4506-bc53-55b3409a5751 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.184 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.184 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[09e1a73a-67db-4c9c-9285-5d72a87a4b59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.203 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.278 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain podman[324430]: 2025-11-23 10:02:21.368556822 +0000 UTC m=+0.084749701 container create 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: Started libpod-conmon-13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9.scope.
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:21 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/520562ffe34534266c987bb9177288c00e8eaa7bcdaa0866bacb0e9fffd889e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:21 np0005532585.localdomain podman[324430]: 2025-11-23 10:02:21.430866921 +0000 UTC m=+0.147059790 container init 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:02:21 np0005532585.localdomain podman[324430]: 2025-11-23 10:02:21.332339826 +0000 UTC m=+0.048532675 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:21 np0005532585.localdomain dnsmasq[324458]: started, version 2.85 cachesize 150
Nov 23 10:02:21 np0005532585.localdomain dnsmasq[324458]: DNS service limited to local subnets
Nov 23 10:02:21 np0005532585.localdomain dnsmasq[324458]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:21 np0005532585.localdomain dnsmasq[324458]: warning: no upstream servers configured
Nov 23 10:02:21 np0005532585.localdomain dnsmasq[324458]: read /var/lib/neutron/dhcp/6957ce07-4d9c-4d1d-a573-5f6518e6601d/addn_hosts - 0 addresses
Nov 23 10:02:21 np0005532585.localdomain podman[324430]: 2025-11-23 10:02:21.445284356 +0000 UTC m=+0.161477225 container start 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:21 np0005532585.localdomain ceph-mon[300199]: osdmap e119: 6 total, 6 up, 6 in
Nov 23 10:02:21 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2008208189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:21.476 2 INFO neutron.agent.securitygroups_rpc [None req-2c014fe0-b7cd-43b0-aa6e-452db82dfd05 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:21 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:02:21 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4263813190' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.522 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.537 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.545 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.561 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.564 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.567 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.567 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:02:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.577 263258 INFO neutron.agent.dhcp.agent [None req-66ce061d-76bb-48b7-affd-be9a854de33f - - - - - -] DHCP configuration for ports {'f252ff88-57dd-44b0-956a-6448a32b09e9'} is completed
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.600 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.613 263258 INFO neutron.agent.linux.ip_lib [None req-ef2c9ea9-e2d8-4157-8991-9b32ede258bf - - - - - -] Device tapf463970f-33 cannot be used as it has no MAC address
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.644 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain kernel: device tapf463970f-33 entered promiscuous mode
Nov 23 10:02:21 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892141.6527] manager: (tapf463970f-33): new Generic device (/org/freedesktop/NetworkManager/Devices/59)
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00349|binding|INFO|Claiming lport f463970f-3381-45b3-96e6-35969693be91 for this chassis.
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00350|binding|INFO|f463970f-3381-45b3-96e6-35969693be91: Claiming unknown
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.660 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.667 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bcc515473444ea195be635c77c65d0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11a468ee-c41b-44bf-983d-07802e5afd68, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f463970f-3381-45b3-96e6-35969693be91) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.670 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f463970f-3381-45b3-96e6-35969693be91 in datapath e77a4286-3801-4220-989b-d56ef685e3b6 bound to our chassis
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.672 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e77a4286-3801-4220-989b-d56ef685e3b6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.673 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f5a266-f793-471f-8399-ec1a246e7fe2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.689 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00351|binding|INFO|Setting lport f463970f-3381-45b3-96e6-35969693be91 ovn-installed in OVS
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00352|binding|INFO|Setting lport f463970f-3381-45b3-96e6-35969693be91 up in Southbound
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.697 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.740 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain dnsmasq[324458]: exiting on receipt of SIGTERM
Nov 23 10:02:21 np0005532585.localdomain podman[324498]: 2025-11-23 10:02:21.750380824 +0000 UTC m=+0.059427031 container kill 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: libpod-13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9.scope: Deactivated successfully.
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.766 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain podman[324519]: 2025-11-23 10:02:21.804596355 +0000 UTC m=+0.041379166 container died 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:02:21 np0005532585.localdomain podman[324519]: 2025-11-23 10:02:21.825823428 +0000 UTC m=+0.062606219 container cleanup 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: libpod-conmon-13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9.scope: Deactivated successfully.
Nov 23 10:02:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:21.896 2 INFO neutron.agent.securitygroups_rpc [None req-7716ba44-8c0f-4db3-b436-de264dd9940d 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:21 np0005532585.localdomain podman[324521]: 2025-11-23 10:02:21.904230824 +0000 UTC m=+0.128361975 container remove 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00353|binding|INFO|Releasing lport b261f9c2-83e7-4a95-9c5e-261d26463cae from this chassis (sb_readonly=0)
Nov 23 10:02:21 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:21Z|00354|binding|INFO|Setting lport b261f9c2-83e7-4a95-9c5e-261d26463cae down in Southbound
Nov 23 10:02:21 np0005532585.localdomain kernel: device tapb261f9c2-83 left promiscuous mode
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.914 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e33c56067c23739cb4aa49806f9afa9d048c85db65aa7d7ecb65ed0e58e4d8d9-merged.mount: Deactivated successfully.
Nov 23 10:02:21 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.924 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe6b:a1ef/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a73b2c2-bad6-4a3d-a15f-3ef982ba513d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b261f9c2-83e7-4a95-9c5e-261d26463cae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.928 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b261f9c2-83e7-4a95-9c5e-261d26463cae in datapath 6957ce07-4d9c-4d1d-a573-5f6518e6601d unbound from our chassis
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.929 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6957ce07-4d9c-4d1d-a573-5f6518e6601d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:21 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:21.930 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[17d395e2-30e0-405e-870f-2cea24bddd0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:21.933 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:22 np0005532585.localdomain podman[324574]: 
Nov 23 10:02:22 np0005532585.localdomain podman[324574]: 2025-11-23 10:02:22.098568391 +0000 UTC m=+0.091572622 container create 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:02:22 np0005532585.localdomain systemd[1]: Started libpod-conmon-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624.scope.
Nov 23 10:02:22 np0005532585.localdomain podman[324574]: 2025-11-23 10:02:22.045989601 +0000 UTC m=+0.038993852 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:22 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:22 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/600a00c93f290b4d32413ef130a9b7a40a8e3f96000c136065f555c66a3949e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:22 np0005532585.localdomain podman[324574]: 2025-11-23 10:02:22.174738567 +0000 UTC m=+0.167742798 container init 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:02:22 np0005532585.localdomain podman[324574]: 2025-11-23 10:02:22.183850858 +0000 UTC m=+0.176855079 container start 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324603]: started, version 2.85 cachesize 150
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324603]: DNS service limited to local subnets
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324603]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324603]: warning: no upstream servers configured
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324603]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.189 263258 INFO neutron.agent.dhcp.agent [None req-6d34abed-43fb-4b42-9a1c-904f2cac2ceb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.190 263258 INFO neutron.agent.dhcp.agent [None req-6d34abed-43fb-4b42-9a1c-904f2cac2ceb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.191 263258 INFO neutron.agent.dhcp.agent [None req-6d34abed-43fb-4b42-9a1c-904f2cac2ceb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.234 263258 INFO neutron.agent.dhcp.agent [None req-5de1e31a-d4a1-4578-b4dd-c1804f8c816f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:21Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f9d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f6d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1fd90>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b1f4c0>], id=cd81cb44-73bf-4b94-82f0-6b35c126ee07, ip_allocation=immediate, mac_address=fa:16:3e:aa:26:53, name=tempest-NetworksTestDHCPv6-1288803817, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['2fe6c6ad-ce38-42e7-bef0-0b0aa5fbe80c', '8eea5134-173e-4854-98d8-93b4b79ba9cd'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:17Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], 
standard_attr_id=1866, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:21Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:02:22 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:22.262 2 INFO neutron.agent.securitygroups_rpc [None req-2377c616-101f-418f-a8e9-4c617ed1b658 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.411 263258 INFO neutron.agent.dhcp.agent [None req-2a64f720-2fc6-49b7-a426-43cdb09c966f - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:02:22 np0005532585.localdomain podman[324626]: 2025-11-23 10:02:22.421489298 +0000 UTC m=+0.064609771 container kill 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:22 np0005532585.localdomain ceph-mon[300199]: pgmap v244: 177 pgs: 8 active+clean+snaptrim, 6 active+clean+snaptrim_wait, 163 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 12 MiB/s wr, 50 op/s
Nov 23 10:02:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/4263813190' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:02:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:22.568 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:02:22 np0005532585.localdomain podman[324666]: 
Nov 23 10:02:22 np0005532585.localdomain podman[324666]: 2025-11-23 10:02:22.634407377 +0000 UTC m=+0.073193325 container create e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.652 263258 INFO neutron.agent.dhcp.agent [None req-c9769175-7ee6-4cb3-b783-ef98c873929e - - - - - -] DHCP configuration for ports {'cd81cb44-73bf-4b94-82f0-6b35c126ee07'} is completed
Nov 23 10:02:22 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:22Z|00355|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:22 np0005532585.localdomain systemd[1]: Started libpod-conmon-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3.scope.
Nov 23 10:02:22 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:22.692 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:22 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27046c8dc740f22d06124834ed035857bd6d551c21ad129433b752c7a3c7ddc6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:22 np0005532585.localdomain podman[324666]: 2025-11-23 10:02:22.603439353 +0000 UTC m=+0.042225281 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:22 np0005532585.localdomain podman[324666]: 2025-11-23 10:02:22.705715063 +0000 UTC m=+0.144501001 container init e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:02:22 np0005532585.localdomain podman[324666]: 2025-11-23 10:02:22.713971898 +0000 UTC m=+0.152757836 container start e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324687]: started, version 2.85 cachesize 150
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324687]: DNS service limited to local subnets
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324687]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324687]: warning: no upstream servers configured
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324687]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:02:22 np0005532585.localdomain dnsmasq[324687]: read /var/lib/neutron/dhcp/e77a4286-3801-4220-989b-d56ef685e3b6/addn_hosts - 0 addresses
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324687]: read /var/lib/neutron/dhcp/e77a4286-3801-4220-989b-d56ef685e3b6/host
Nov 23 10:02:22 np0005532585.localdomain dnsmasq-dhcp[324687]: read /var/lib/neutron/dhcp/e77a4286-3801-4220-989b-d56ef685e3b6/opts
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.789 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:20Z, description=, device_id=6c96453a-c777-4431-b133-5c4197796c3e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b5e040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c89aff40>], id=ef47ee52-a1a9-471d-b433-3cfaa989690c, ip_allocation=immediate, mac_address=fa:16:3e:a6:dd:58, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:14Z, description=, dns_domain=, id=31b977a7-a37c-42ba-bed9-7b22959f6310, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1570024778-network, port_security_enabled=True, project_id=10f8dd7c838246c58f1d2c4efc771237, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1812, status=ACTIVE, subnets=['ef80155c-ac2e-4709-9882-d7bc38017108'], tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:15Z, vlan_transparent=None, network_id=31b977a7-a37c-42ba-bed9-7b22959f6310, port_security_enabled=False, project_id=10f8dd7c838246c58f1d2c4efc771237, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1857, status=DOWN, tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:20Z on network 31b977a7-a37c-42ba-bed9-7b22959f6310
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.811 263258 INFO neutron.agent.dhcp.agent [None req-d56509cd-7d81-4efc-9b74-f76c5145f3ee - - - - - -] DHCP configuration for ports {'87b1e486-66c6-450b-a88a-205b42e4c756'} is completed
Nov 23 10:02:22 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:22.852 2 INFO neutron.agent.securitygroups_rpc [None req-5aab4fba-63d5-4959-8f5e-411b7878d60b 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.910 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:22 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6957ce07\x2d4d9c\x2d4d1d\x2da573\x2d5f6518e6601d.mount: Deactivated successfully.
Nov 23 10:02:22 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:22.951 2 INFO neutron.agent.securitygroups_rpc [None req-7111b2ef-5dc8-48f5-9c8c-9b4c6e4004a1 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:23 np0005532585.localdomain podman[324719]: 2025-11-23 10:02:23.074220446 +0000 UTC m=+0.070541394 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:23 np0005532585.localdomain dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 1 addresses
Nov 23 10:02:23 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host
Nov 23 10:02:23 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts
Nov 23 10:02:23 np0005532585.localdomain podman[324730]: 2025-11-23 10:02:23.092655774 +0000 UTC m=+0.061695871 container kill e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:23 np0005532585.localdomain dnsmasq[324687]: exiting on receipt of SIGTERM
Nov 23 10:02:23 np0005532585.localdomain systemd[1]: libpod-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3.scope: Deactivated successfully.
Nov 23 10:02:23 np0005532585.localdomain podman[324762]: 2025-11-23 10:02:23.172572776 +0000 UTC m=+0.056913894 container died e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:23 np0005532585.localdomain podman[324762]: 2025-11-23 10:02:23.216614223 +0000 UTC m=+0.100955261 container remove e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:23.230 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:23 np0005532585.localdomain kernel: device tapf463970f-33 left promiscuous mode
Nov 23 10:02:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:23Z|00356|binding|INFO|Releasing lport f463970f-3381-45b3-96e6-35969693be91 from this chassis (sb_readonly=0)
Nov 23 10:02:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:23Z|00357|binding|INFO|Setting lport f463970f-3381-45b3-96e6-35969693be91 down in Southbound
Nov 23 10:02:23 np0005532585.localdomain dnsmasq[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:23 np0005532585.localdomain dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:23 np0005532585.localdomain dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:23 np0005532585.localdomain podman[324787]: 2025-11-23 10:02:23.237371522 +0000 UTC m=+0.069787761 container kill 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:23.238 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bcc515473444ea195be635c77c65d0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11a468ee-c41b-44bf-983d-07802e5afd68, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f463970f-3381-45b3-96e6-35969693be91) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:23.240 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f463970f-3381-45b3-96e6-35969693be91 in datapath e77a4286-3801-4220-989b-d56ef685e3b6 unbound from our chassis
Nov 23 10:02:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:23.240 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e77a4286-3801-4220-989b-d56ef685e3b6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:23.241 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[31f1a420-1643-4270-ac13-ff327bc306bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:23.247 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:23 np0005532585.localdomain systemd[1]: libpod-conmon-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3.scope: Deactivated successfully.
Nov 23 10:02:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.410 263258 INFO neutron.agent.dhcp.agent [None req-1eec1afe-c435-4703-85b6-9fd8f2098226 - - - - - -] DHCP configuration for ports {'ef47ee52-a1a9-471d-b433-3cfaa989690c'} is completed
Nov 23 10:02:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.458 263258 INFO neutron.agent.dhcp.agent [None req-5cf4e7f9-7943-4d45-aa17-9727e51f6d3b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.459 263258 INFO neutron.agent.dhcp.agent [None req-5cf4e7f9-7943-4d45-aa17-9727e51f6d3b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:23 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:23.606 2 INFO neutron.agent.securitygroups_rpc [None req-192c090e-5c5a-4cbc-acb5-d1edd0c7e4bb 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-27046c8dc740f22d06124834ed035857bd6d551c21ad129433b752c7a3c7ddc6-merged.mount: Deactivated successfully.
Nov 23 10:02:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:23 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2de77a4286\x2d3801\x2d4220\x2d989b\x2dd56ef685e3b6.mount: Deactivated successfully.
Nov 23 10:02:23 np0005532585.localdomain ceph-mon[300199]: pgmap v245: 177 pgs: 8 active+clean+snaptrim, 6 active+clean+snaptrim_wait, 163 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 12 MiB/s wr, 50 op/s
Nov 23 10:02:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.975 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:24 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:24Z|00358|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:24.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:24 np0005532585.localdomain dnsmasq[324603]: exiting on receipt of SIGTERM
Nov 23 10:02:24 np0005532585.localdomain podman[324834]: 2025-11-23 10:02:24.2030589 +0000 UTC m=+0.058020068 container kill 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:24 np0005532585.localdomain systemd[1]: libpod-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624.scope: Deactivated successfully.
Nov 23 10:02:24 np0005532585.localdomain podman[324846]: 2025-11-23 10:02:24.274506131 +0000 UTC m=+0.059743711 container died 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:02:24 np0005532585.localdomain systemd[1]: tmp-crun.AWtw87.mount: Deactivated successfully.
Nov 23 10:02:24 np0005532585.localdomain podman[324846]: 2025-11-23 10:02:24.315292948 +0000 UTC m=+0.100530488 container cleanup 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:24 np0005532585.localdomain systemd[1]: libpod-conmon-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624.scope: Deactivated successfully.
Nov 23 10:02:24 np0005532585.localdomain podman[324851]: 2025-11-23 10:02:24.350946306 +0000 UTC m=+0.125813866 container remove 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:02:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-600a00c93f290b4d32413ef130a9b7a40a8e3f96000c136065f555c66a3949e5-merged.mount: Deactivated successfully.
Nov 23 10:02:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e120 e120: 6 total, 6 up, 6 in
Nov 23 10:02:25 np0005532585.localdomain podman[324927]: 
Nov 23 10:02:25 np0005532585.localdomain podman[324927]: 2025-11-23 10:02:25.239509859 +0000 UTC m=+0.101288381 container create b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: Started libpod-conmon-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d.scope.
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:25 np0005532585.localdomain podman[324927]: 2025-11-23 10:02:25.192835461 +0000 UTC m=+0.054614023 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:25 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8614f07940d831ba536275ae2fdfc52851f2aedbb4cf8d25398005eb1e57659f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:25 np0005532585.localdomain podman[324927]: 2025-11-23 10:02:25.300589021 +0000 UTC m=+0.162367563 container init b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:25 np0005532585.localdomain podman[324927]: 2025-11-23 10:02:25.310154515 +0000 UTC m=+0.171933067 container start b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:25 np0005532585.localdomain dnsmasq[324964]: started, version 2.85 cachesize 150
Nov 23 10:02:25 np0005532585.localdomain dnsmasq[324964]: DNS service limited to local subnets
Nov 23 10:02:25 np0005532585.localdomain dnsmasq[324964]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:25 np0005532585.localdomain dnsmasq[324964]: warning: no upstream servers configured
Nov 23 10:02:25 np0005532585.localdomain dnsmasq[324964]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:25 np0005532585.localdomain podman[324942]: 2025-11-23 10:02:25.353962925 +0000 UTC m=+0.075291451 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 10:02:25 np0005532585.localdomain podman[324942]: 2025-11-23 10:02:25.361460705 +0000 UTC m=+0.082789231 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:02:25 np0005532585.localdomain podman[324943]: 2025-11-23 10:02:25.404730869 +0000 UTC m=+0.120318678 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:02:25 np0005532585.localdomain podman[324943]: 2025-11-23 10:02:25.4119381 +0000 UTC m=+0.127525929 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:02:25 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:25.606 263258 INFO neutron.agent.dhcp.agent [None req-90741188-f0e1-4c52-ba7f-d1d78c99dd0a - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 10:02:25 np0005532585.localdomain dnsmasq[324964]: exiting on receipt of SIGTERM
Nov 23 10:02:25 np0005532585.localdomain podman[325001]: 2025-11-23 10:02:25.709984092 +0000 UTC m=+0.063536179 container kill b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: libpod-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d.scope: Deactivated successfully.
Nov 23 10:02:25 np0005532585.localdomain podman[325014]: 2025-11-23 10:02:25.779684769 +0000 UTC m=+0.054799089 container died b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:02:25 np0005532585.localdomain podman[325014]: 2025-11-23 10:02:25.810845999 +0000 UTC m=+0.085960279 container cleanup b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: libpod-conmon-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d.scope: Deactivated successfully.
Nov 23 10:02:25 np0005532585.localdomain podman[325015]: 2025-11-23 10:02:25.866155883 +0000 UTC m=+0.132361139 container remove b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-8614f07940d831ba536275ae2fdfc52851f2aedbb4cf8d25398005eb1e57659f-merged.mount: Deactivated successfully.
Nov 23 10:02:25 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:26 np0005532585.localdomain ceph-mon[300199]: osdmap e120: 6 total, 6 up, 6 in
Nov 23 10:02:26 np0005532585.localdomain ceph-mon[300199]: pgmap v247: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 12 MiB/s wr, 111 op/s
Nov 23 10:02:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e121 e121: 6 total, 6 up, 6 in
Nov 23 10:02:26 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:26.346 2 INFO neutron.agent.securitygroups_rpc [None req-d166c481-1e29-4957-bae4-8d46f816d4e6 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:26.381 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:26.564 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:26.602 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:27 np0005532585.localdomain ceph-mon[300199]: osdmap e121: 6 total, 6 up, 6 in
Nov 23 10:02:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e122 e122: 6 total, 6 up, 6 in
Nov 23 10:02:27 np0005532585.localdomain podman[325094]: 
Nov 23 10:02:27 np0005532585.localdomain podman[325094]: 2025-11-23 10:02:27.530768053 +0000 UTC m=+0.088808017 container create 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:02:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd.scope.
Nov 23 10:02:27 np0005532585.localdomain systemd[1]: tmp-crun.fQpSvg.mount: Deactivated successfully.
Nov 23 10:02:27 np0005532585.localdomain podman[325094]: 2025-11-23 10:02:27.487259632 +0000 UTC m=+0.045299636 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:27 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bf7835d0e90b38389a369dac93f71b03b61e893f5c0cbb4569c76a06c953709/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:27 np0005532585.localdomain podman[325094]: 2025-11-23 10:02:27.615671488 +0000 UTC m=+0.173711452 container init 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:02:27 np0005532585.localdomain podman[325094]: 2025-11-23 10:02:27.625749819 +0000 UTC m=+0.183789783 container start 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:02:27 np0005532585.localdomain dnsmasq[325112]: started, version 2.85 cachesize 150
Nov 23 10:02:27 np0005532585.localdomain dnsmasq[325112]: DNS service limited to local subnets
Nov 23 10:02:27 np0005532585.localdomain dnsmasq[325112]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:27 np0005532585.localdomain dnsmasq[325112]: warning: no upstream servers configured
Nov 23 10:02:27 np0005532585.localdomain dnsmasq-dhcp[325112]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:27 np0005532585.localdomain dnsmasq[325112]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:27 np0005532585.localdomain dnsmasq-dhcp[325112]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:27 np0005532585.localdomain dnsmasq-dhcp[325112]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:27.868 263258 INFO neutron.agent.dhcp.agent [None req-b1486ae4-3c8b-4d00-a2a4-94af2405fc70 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 10:02:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e123 e123: 6 total, 6 up, 6 in
Nov 23 10:02:27 np0005532585.localdomain dnsmasq[325112]: exiting on receipt of SIGTERM
Nov 23 10:02:27 np0005532585.localdomain podman[325130]: 2025-11-23 10:02:27.992379543 +0000 UTC m=+0.064715556 container kill 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:02:27 np0005532585.localdomain systemd[1]: libpod-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd.scope: Deactivated successfully.
Nov 23 10:02:28 np0005532585.localdomain podman[325144]: 2025-11-23 10:02:28.067672782 +0000 UTC m=+0.058959628 container died 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:28 np0005532585.localdomain podman[325144]: 2025-11-23 10:02:28.098635146 +0000 UTC m=+0.089921942 container cleanup 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:28 np0005532585.localdomain systemd[1]: libpod-conmon-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd.scope: Deactivated successfully.
Nov 23 10:02:28 np0005532585.localdomain podman[325145]: 2025-11-23 10:02:28.137216494 +0000 UTC m=+0.123856006 container remove 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:28 np0005532585.localdomain ceph-mon[300199]: osdmap e122: 6 total, 6 up, 6 in
Nov 23 10:02:28 np0005532585.localdomain ceph-mon[300199]: pgmap v250: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 3.5 KiB/s wr, 63 op/s
Nov 23 10:02:28 np0005532585.localdomain ceph-mon[300199]: osdmap e123: 6 total, 6 up, 6 in
Nov 23 10:02:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:28 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:28.418 2 INFO neutron.agent.securitygroups_rpc [None req-accb9679-f798-499a-bac7-ef6b44f5ac25 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0bf7835d0e90b38389a369dac93f71b03b61e893f5c0cbb4569c76a06c953709-merged.mount: Deactivated successfully.
Nov 23 10:02:28 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:28 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:28.920 2 INFO neutron.agent.securitygroups_rpc [None req-cba9b202-7a98-4e4d-911c-f57572c47e81 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 10:02:29 np0005532585.localdomain podman[325223]: 
Nov 23 10:02:29 np0005532585.localdomain podman[325223]: 2025-11-23 10:02:29.045603808 +0000 UTC m=+0.092910574 container create 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:29 np0005532585.localdomain systemd[1]: Started libpod-conmon-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd.scope.
Nov 23 10:02:29 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:29 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/934bd1a7d08cf6a4a8ef94b1e4c9333eeb001c1a039964b7962f004e9e6810f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:29 np0005532585.localdomain podman[325223]: 2025-11-23 10:02:29.002801089 +0000 UTC m=+0.050107885 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:29 np0005532585.localdomain podman[325223]: 2025-11-23 10:02:29.106696149 +0000 UTC m=+0.154002925 container init 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:02:29 np0005532585.localdomain podman[325223]: 2025-11-23 10:02:29.113051546 +0000 UTC m=+0.160358312 container start 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: started, version 2.85 cachesize 150
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: DNS service limited to local subnets
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: warning: no upstream servers configured
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:29.168 263258 INFO neutron.agent.dhcp.agent [None req-fd203e9b-1f2c-470a-be85-24176465be87 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:27Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a25b50>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a25fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b24850>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b24be0>], id=a72e710a-e848-4366-84b4-63006f20c66f, ip_allocation=immediate, mac_address=fa:16:3e:d7:d9:99, name=tempest-NetworksTestDHCPv6-353818491, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['274eb987-216a-4bde-ad9a-f70582643a22', 'b16500ec-aaf4-434d-abb8-bd690dfe4c20'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:26Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1900, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:28Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 10:02:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e124 e124: 6 total, 6 up, 6 in
Nov 23 10:02:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:29.300 263258 INFO neutron.agent.dhcp.agent [None req-5c2b92a6-e799-4223-a264-0022711823cb - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:29 np0005532585.localdomain podman[325261]: 2025-11-23 10:02:29.3680232 +0000 UTC m=+0.059963689 container kill 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:02:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:29.621 263258 INFO neutron.agent.dhcp.agent [None req-54d3920b-5b80-4605-9c8d-480703b7e810 - - - - - -] DHCP configuration for ports {'a72e710a-e848-4366-84b4-63006f20c66f'} is completed
Nov 23 10:02:29 np0005532585.localdomain dnsmasq[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:29 np0005532585.localdomain podman[325300]: 2025-11-23 10:02:29.707309752 +0000 UTC m=+0.054695056 container kill 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:02:29 np0005532585.localdomain dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:29 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:29.891 2 INFO neutron.agent.securitygroups_rpc [None req-71451af8-2b84-4ee8-885e-297c3854d4d1 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:02:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:02:30 np0005532585.localdomain ceph-mon[300199]: osdmap e124: 6 total, 6 up, 6 in
Nov 23 10:02:30 np0005532585.localdomain ceph-mon[300199]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 145 KiB/s rd, 25 KiB/s wr, 210 op/s
Nov 23 10:02:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e125 e125: 6 total, 6 up, 6 in
Nov 23 10:02:30 np0005532585.localdomain systemd[1]: tmp-crun.05dwbq.mount: Deactivated successfully.
Nov 23 10:02:30 np0005532585.localdomain dnsmasq[325241]: exiting on receipt of SIGTERM
Nov 23 10:02:30 np0005532585.localdomain podman[325337]: 2025-11-23 10:02:30.314962181 +0000 UTC m=+0.072744232 container kill 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:30 np0005532585.localdomain systemd[1]: libpod-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd.scope: Deactivated successfully.
Nov 23 10:02:30 np0005532585.localdomain podman[325358]: 2025-11-23 10:02:30.399524656 +0000 UTC m=+0.061784125 container died 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:30 np0005532585.localdomain podman[325358]: 2025-11-23 10:02:30.490367354 +0000 UTC m=+0.152626783 container remove 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:30 np0005532585.localdomain systemd[1]: libpod-conmon-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd.scope: Deactivated successfully.
Nov 23 10:02:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-934bd1a7d08cf6a4a8ef94b1e4c9333eeb001c1a039964b7962f004e9e6810f4-merged.mount: Deactivated successfully.
Nov 23 10:02:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:30 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:30.790 2 INFO neutron.agent.securitygroups_rpc [None req-1ba66ce5-b42f-4fad-9876-b27f14456f6a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:31 np0005532585.localdomain ceph-mon[300199]: osdmap e125: 6 total, 6 up, 6 in
Nov 23 10:02:31 np0005532585.localdomain podman[325430]: 
Nov 23 10:02:31 np0005532585.localdomain podman[325430]: 2025-11-23 10:02:31.389958776 +0000 UTC m=+0.073652790 container create aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 10:02:31 np0005532585.localdomain systemd[1]: Started libpod-conmon-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d.scope.
Nov 23 10:02:31 np0005532585.localdomain systemd[1]: tmp-crun.0nAWDn.mount: Deactivated successfully.
Nov 23 10:02:31 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:31 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd17c590309b55ead84c30c1c43935a15235ea5459b9de3ba9300a684a6178a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:31 np0005532585.localdomain podman[325430]: 2025-11-23 10:02:31.455406183 +0000 UTC m=+0.139100197 container init aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 10:02:31 np0005532585.localdomain podman[325430]: 2025-11-23 10:02:31.361352565 +0000 UTC m=+0.045046569 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:31 np0005532585.localdomain podman[325430]: 2025-11-23 10:02:31.463834732 +0000 UTC m=+0.147528746 container start aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:02:31 np0005532585.localdomain dnsmasq[325448]: started, version 2.85 cachesize 150
Nov 23 10:02:31 np0005532585.localdomain dnsmasq[325448]: DNS service limited to local subnets
Nov 23 10:02:31 np0005532585.localdomain dnsmasq[325448]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:31 np0005532585.localdomain dnsmasq[325448]: warning: no upstream servers configured
Nov 23 10:02:31 np0005532585.localdomain dnsmasq-dhcp[325448]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:31 np0005532585.localdomain dnsmasq[325448]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 10:02:31 np0005532585.localdomain dnsmasq-dhcp[325448]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 10:02:31 np0005532585.localdomain dnsmasq-dhcp[325448]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 10:02:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:31.568 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:31.604 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:31.691 263258 INFO neutron.agent.dhcp.agent [None req-05320843-34ef-410e-9f64-81cc22df8da5 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 10:02:31 np0005532585.localdomain dnsmasq[325448]: exiting on receipt of SIGTERM
Nov 23 10:02:31 np0005532585.localdomain podman[325466]: 2025-11-23 10:02:31.789528285 +0000 UTC m=+0.037821415 container kill aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:31 np0005532585.localdomain systemd[1]: libpod-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d.scope: Deactivated successfully.
Nov 23 10:02:31 np0005532585.localdomain podman[325482]: 2025-11-23 10:02:31.842056374 +0000 UTC m=+0.038695944 container died aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:31 np0005532585.localdomain systemd[1]: tmp-crun.ZQFgup.mount: Deactivated successfully.
Nov 23 10:02:31 np0005532585.localdomain podman[325482]: 2025-11-23 10:02:31.909403848 +0000 UTC m=+0.106043388 container remove aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:31Z|00359|binding|INFO|Releasing lport d4981345-81c9-4678-a4d9-29762c427058 from this chassis (sb_readonly=0)
Nov 23 10:02:31 np0005532585.localdomain kernel: device tapd4981345-81 left promiscuous mode
Nov 23 10:02:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:31Z|00360|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 down in Southbound
Nov 23 10:02:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:31.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:31.926 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee0:b47d/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:31.927 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis
Nov 23 10:02:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:31.929 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:02:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:31.930 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d56f822f-84bc-42b3-ad54-b0fc38cb3b96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:31.937 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:31 np0005532585.localdomain systemd[1]: libpod-conmon-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d.scope: Deactivated successfully.
Nov 23 10:02:32 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:32.007 2 INFO neutron.agent.securitygroups_rpc [None req-fe8c6fcc-6e02-46f0-a4be-4363defcaae3 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:32 np0005532585.localdomain sshd[325509]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:02:32 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:32.114 2 INFO neutron.agent.securitygroups_rpc [None req-5d5ccae7-720c-44b0-bb02-9faf01da722a 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']
Nov 23 10:02:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:32.255 263258 INFO neutron.agent.dhcp.agent [None req-e55fbbd9-0b6f-41b7-952b-ca167e6862e2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e126 e126: 6 total, 6 up, 6 in
Nov 23 10:02:32 np0005532585.localdomain ceph-mon[300199]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 137 KiB/s rd, 24 KiB/s wr, 198 op/s
Nov 23 10:02:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-dd17c590309b55ead84c30c1c43935a15235ea5459b9de3ba9300a684a6178a8-merged.mount: Deactivated successfully.
Nov 23 10:02:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:32 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 10:02:32 np0005532585.localdomain sshd[325509]: Invalid user boss from 207.154.194.2 port 45010
Nov 23 10:02:32 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:32.696 2 INFO neutron.agent.securitygroups_rpc [None req-07562697-3af0-478b-ab22-4ac8b6836e01 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:32 np0005532585.localdomain sshd[325509]: Received disconnect from 207.154.194.2 port 45010:11: Bye Bye [preauth]
Nov 23 10:02:32 np0005532585.localdomain sshd[325509]: Disconnected from invalid user boss 207.154.194.2 port 45010 [preauth]
Nov 23 10:02:33 np0005532585.localdomain ceph-mon[300199]: osdmap e126: 6 total, 6 up, 6 in
Nov 23 10:02:33 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/966109860' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:02:33 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:33.363 2 INFO neutron.agent.securitygroups_rpc [None req-f8a04e39-0876-44f3-a37b-8d86b234eb13 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']
Nov 23 10:02:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:33.558 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e127 e127: 6 total, 6 up, 6 in
Nov 23 10:02:34 np0005532585.localdomain ceph-mon[300199]: pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 106 KiB/s rd, 18 KiB/s wr, 152 op/s
Nov 23 10:02:34 np0005532585.localdomain ceph-mon[300199]: osdmap e127: 6 total, 6 up, 6 in
Nov 23 10:02:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:34Z|00361|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:34.489 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e128 e128: 6 total, 6 up, 6 in
Nov 23 10:02:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:35.392 263258 INFO neutron.agent.linux.ip_lib [None req-c5c7b97d-9dc9-444d-a85f-a4c6fc884a8f - - - - - -] Device tape9b240b4-dd cannot be used as it has no MAC address
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain kernel: device tape9b240b4-dd entered promiscuous mode
Nov 23 10:02:35 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892155.4272] manager: (tape9b240b4-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/60)
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.428 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:35Z|00362|binding|INFO|Claiming lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 for this chassis.
Nov 23 10:02:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:35Z|00363|binding|INFO|e9b240b4-dda7-48fc-a63a-d3fd91217a97: Claiming unknown
Nov 23 10:02:35 np0005532585.localdomain systemd-udevd[325521]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.442 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87fb399b-8c32-4da7-b979-46b50b5b7dd8, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=e9b240b4-dda7-48fc-a63a-d3fd91217a97) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.444 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e9b240b4-dda7-48fc-a63a-d3fd91217a97 in datapath 319f7ca3-1c18-4436-8178-bfc17a98eb45 bound to our chassis
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.446 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 319f7ca3-1c18-4436-8178-bfc17a98eb45 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.448 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6a744685-56f5-4bb9-ac6f-f8c6ce202820]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:35Z|00364|binding|INFO|Setting lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 ovn-installed in OVS
Nov 23 10:02:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:35Z|00365|binding|INFO|Setting lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 up in Southbound
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tape9b240b4-dd: No such device
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.507 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain dnsmasq[323235]: exiting on receipt of SIGTERM
Nov 23 10:02:35 np0005532585.localdomain podman[325570]: 2025-11-23 10:02:35.782261373 +0000 UTC m=+0.071215274 container kill 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:35 np0005532585.localdomain systemd[1]: libpod-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d.scope: Deactivated successfully.
Nov 23 10:02:35 np0005532585.localdomain podman[325590]: 2025-11-23 10:02:35.872017918 +0000 UTC m=+0.069576533 container died 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:35 np0005532585.localdomain systemd[1]: tmp-crun.y8KVnG.mount: Deactivated successfully.
Nov 23 10:02:35 np0005532585.localdomain podman[325590]: 2025-11-23 10:02:35.954011635 +0000 UTC m=+0.151570250 container remove 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:02:35 np0005532585.localdomain systemd[1]: libpod-conmon-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d.scope: Deactivated successfully.
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.970 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:35Z|00366|binding|INFO|Releasing lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 from this chassis (sb_readonly=0)
Nov 23 10:02:35 np0005532585.localdomain kernel: device tap4888f66e-2a left promiscuous mode
Nov 23 10:02:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:35Z|00367|binding|INFO|Setting lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 down in Southbound
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.986 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cf4fe09-32ab-44a4-bc92-e7a5f2b64203, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=4888f66e-2a7b-4114-aa4a-94f38d09c793) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:35 np0005532585.localdomain ceph-mon[300199]: osdmap e128: 6 total, 6 up, 6 in
Nov 23 10:02:35 np0005532585.localdomain ceph-mon[300199]: pgmap v260: 177 pgs: 177 active+clean; 145 MiB data, 781 MiB used, 41 GiB / 42 GiB avail; 130 KiB/s rd, 11 KiB/s wr, 176 op/s
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.990 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4888f66e-2a7b-4114-aa4a-94f38d09c793 in datapath e392c0bc-bd43-40a4-a7d7-6e0130e48060 unbound from our chassis
Nov 23 10:02:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:35.992 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.997 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e392c0bc-bd43-40a4-a7d7-6e0130e48060 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:35.998 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[49571d1b-4171-40c2-b031-2967a719d949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:02:36 np0005532585.localdomain podman[325629]: 2025-11-23 10:02:36.253179329 +0000 UTC m=+0.057338006 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:02:36 np0005532585.localdomain podman[325629]: 2025-11-23 10:02:36.257940276 +0000 UTC m=+0.062099023 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:02:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.261 263258 INFO neutron.agent.dhcp.agent [None req-6508a959-80b6-47ac-9f8f-23d2437b76a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.262 263258 INFO neutron.agent.dhcp.agent [None req-6508a959-80b6-47ac-9f8f-23d2437b76a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-85794c23db926913628cec6db6b73789ec462daef8098b4cddf986aad56e49c9-merged.mount: Deactivated successfully.
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2de392c0bc\x2dbd43\x2d40a4\x2da7d7\x2d6e0130e48060.mount: Deactivated successfully.
Nov 23 10:02:36 np0005532585.localdomain podman[325630]: 2025-11-23 10:02:36.316702677 +0000 UTC m=+0.119509353 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:02:36 np0005532585.localdomain podman[325630]: 2025-11-23 10:02:36.333224806 +0000 UTC m=+0.136031472 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:02:36 np0005532585.localdomain podman[325690]: 
Nov 23 10:02:36 np0005532585.localdomain podman[325690]: 2025-11-23 10:02:36.503363497 +0000 UTC m=+0.078839219 container create 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49.scope.
Nov 23 10:02:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ed3cccae87a2cec8679f8b781e7106fb3edbfdb329ca120981a36091e9dbff3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:36 np0005532585.localdomain podman[325690]: 2025-11-23 10:02:36.463053315 +0000 UTC m=+0.038529057 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:36 np0005532585.localdomain podman[325690]: 2025-11-23 10:02:36.565399178 +0000 UTC m=+0.140874890 container init 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:02:36 np0005532585.localdomain podman[325690]: 2025-11-23 10:02:36.571367972 +0000 UTC m=+0.146843684 container start 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:02:36 np0005532585.localdomain dnsmasq[325708]: started, version 2.85 cachesize 150
Nov 23 10:02:36 np0005532585.localdomain dnsmasq[325708]: DNS service limited to local subnets
Nov 23 10:02:36 np0005532585.localdomain dnsmasq[325708]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:36 np0005532585.localdomain dnsmasq[325708]: warning: no upstream servers configured
Nov 23 10:02:36 np0005532585.localdomain dnsmasq-dhcp[325708]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Nov 23 10:02:36 np0005532585.localdomain dnsmasq[325708]: read /var/lib/neutron/dhcp/319f7ca3-1c18-4436-8178-bfc17a98eb45/addn_hosts - 0 addresses
Nov 23 10:02:36 np0005532585.localdomain dnsmasq-dhcp[325708]: read /var/lib/neutron/dhcp/319f7ca3-1c18-4436-8178-bfc17a98eb45/host
Nov 23 10:02:36 np0005532585.localdomain dnsmasq-dhcp[325708]: read /var/lib/neutron/dhcp/319f7ca3-1c18-4436-8178-bfc17a98eb45/opts
Nov 23 10:02:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:36.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:36.607 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.735 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.809 263258 INFO neutron.agent.dhcp.agent [None req-ee8db2d2-f23d-47f1-be52-dbf00df2cc7a - - - - - -] DHCP configuration for ports {'2d9b3930-b9d4-4c3f-a3e2-b427dd3184a4'} is completed
Nov 23 10:02:37 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:37Z|00368|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:37.043 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e129 e129: 6 total, 6 up, 6 in
Nov 23 10:02:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e130 e130: 6 total, 6 up, 6 in
Nov 23 10:02:38 np0005532585.localdomain ceph-mon[300199]: osdmap e129: 6 total, 6 up, 6 in
Nov 23 10:02:38 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4040804523' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:38 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4040804523' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:38 np0005532585.localdomain ceph-mon[300199]: pgmap v262: 177 pgs: 177 active+clean; 145 MiB data, 781 MiB used, 41 GiB / 42 GiB avail; 131 KiB/s rd, 11 KiB/s wr, 177 op/s
Nov 23 10:02:38 np0005532585.localdomain ceph-mon[300199]: osdmap e130: 6 total, 6 up, 6 in
Nov 23 10:02:38 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:38.092 2 INFO neutron.agent.securitygroups_rpc [None req-5e89ea53-7b2c-469d-be9d-51deeecc82b0 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:38 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 23 10:02:38 np0005532585.localdomain sudo[325709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:02:38 np0005532585.localdomain sudo[325709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:02:38 np0005532585.localdomain sudo[325709]: pam_unix(sudo:session): session closed for user root
Nov 23 10:02:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:38 np0005532585.localdomain sudo[325727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:02:38 np0005532585.localdomain sudo[325727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:02:38 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:38.661 2 INFO neutron.agent.securitygroups_rpc [None req-64ec2b21-8187-4ac4-a9ab-6a7a717b7a77 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:39 np0005532585.localdomain sudo[325727]: pam_unix(sudo:session): session closed for user root
Nov 23 10:02:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e131 e131: 6 total, 6 up, 6 in
Nov 23 10:02:39 np0005532585.localdomain sudo[325778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:02:39 np0005532585.localdomain sudo[325778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:02:39 np0005532585.localdomain sudo[325778]: pam_unix(sudo:session): session closed for user root
Nov 23 10:02:40 np0005532585.localdomain ceph-mon[300199]: osdmap e131: 6 total, 6 up, 6 in
Nov 23 10:02:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:02:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:02:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:02:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:02:40 np0005532585.localdomain ceph-mon[300199]: pgmap v265: 177 pgs: 177 active+clean; 441 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 216 KiB/s rd, 66 MiB/s wr, 333 op/s
Nov 23 10:02:40 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:40.892 2 INFO neutron.agent.securitygroups_rpc [None req-abfc9448-e887-481e-8dd9-d14e7060c10b 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:41.111 263258 INFO neutron.agent.linux.ip_lib [None req-3bd5f328-01ff-4a4c-b780-349c7b2100a4 - - - - - -] Device tap020b8cf6-ea cannot be used as it has no MAC address
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain kernel: device tap020b8cf6-ea entered promiscuous mode
Nov 23 10:02:41 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892161.1410] manager: (tap020b8cf6-ea): new Generic device (/org/freedesktop/NetworkManager/Devices/61)
Nov 23 10:02:41 np0005532585.localdomain systemd-udevd[325805]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:41Z|00369|binding|INFO|Claiming lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a for this chassis.
Nov 23 10:02:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:41Z|00370|binding|INFO|020b8cf6-ea76-4e97-97d4-9364e1402e7a: Claiming unknown
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.167 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddb78cd-66e7-4087-9a5b-55327ec8df75, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=020b8cf6-ea76-4e97-97d4-9364e1402e7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.168 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 020b8cf6-ea76-4e97-97d4-9364e1402e7a in datapath ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 bound to our chassis
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.168 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.169 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2c695947-0a13-4be2-8014-192000a36946]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:41Z|00371|binding|INFO|Setting lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a ovn-installed in OVS
Nov 23 10:02:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:41Z|00372|binding|INFO|Setting lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a up in Southbound
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.190 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.223 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.247 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:41Z|00373|binding|INFO|Removing iface tap020b8cf6-ea ovn-installed in OVS
Nov 23 10:02:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:41Z|00374|binding|INFO|Removing lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a ovn-installed in OVS
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.306 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 26f1c99f-631d-416a-8ce9-39f42a019f9a with type ""
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.307 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddb78cd-66e7-4087-9a5b-55327ec8df75, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=020b8cf6-ea76-4e97-97d4-9364e1402e7a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.308 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 020b8cf6-ea76-4e97-97d4-9364e1402e7a in datapath ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 unbound from our chassis
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.308 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:41.309 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[793b7a83-74bd-4cd8-a603-e579f64da57c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.351 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.358 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e132 e132: 6 total, 6 up, 6 in
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.594 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:41.611 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:41.822 2 INFO neutron.agent.securitygroups_rpc [None req-1e146831-4c05-4284-9c5f-ec90349e3dfc 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:02:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:02:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:02:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157508 "" "Go-http-client/1.1"
Nov 23 10:02:42 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:02:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19713 "" "Go-http-client/1.1"
Nov 23 10:02:42 np0005532585.localdomain podman[325859]: 
Nov 23 10:02:42 np0005532585.localdomain podman[325859]: 2025-11-23 10:02:42.067634568 +0000 UTC m=+0.095059580 container create 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:02:42 np0005532585.localdomain systemd[1]: Started libpod-conmon-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9.scope.
Nov 23 10:02:42 np0005532585.localdomain podman[325859]: 2025-11-23 10:02:42.023353263 +0000 UTC m=+0.050778315 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:42 np0005532585.localdomain systemd[1]: tmp-crun.aCbdRt.mount: Deactivated successfully.
Nov 23 10:02:42 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:42 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e31e5055d0d197213df95c02fe47a3d397ca129ef887435c02f1f8b6de8338e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:42 np0005532585.localdomain podman[325859]: 2025-11-23 10:02:42.160961722 +0000 UTC m=+0.188386734 container init 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:02:42 np0005532585.localdomain podman[325859]: 2025-11-23 10:02:42.171991652 +0000 UTC m=+0.199416664 container start 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:42 np0005532585.localdomain dnsmasq[325877]: started, version 2.85 cachesize 150
Nov 23 10:02:42 np0005532585.localdomain dnsmasq[325877]: DNS service limited to local subnets
Nov 23 10:02:42 np0005532585.localdomain dnsmasq[325877]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:42 np0005532585.localdomain dnsmasq[325877]: warning: no upstream servers configured
Nov 23 10:02:42 np0005532585.localdomain dnsmasq-dhcp[325877]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:42 np0005532585.localdomain dnsmasq[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/addn_hosts - 0 addresses
Nov 23 10:02:42 np0005532585.localdomain dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/host
Nov 23 10:02:42 np0005532585.localdomain dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/opts
Nov 23 10:02:42 np0005532585.localdomain kernel: device tap020b8cf6-ea left promiscuous mode
Nov 23 10:02:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:42.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.269 263258 INFO neutron.agent.dhcp.agent [None req-e50c8df0-d731-49e9-8230-00c704e50710 - - - - - -] DHCP configuration for ports {'85355cdf-63bc-4add-bb48-9440c5028be9'} is completed
Nov 23 10:02:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:42.281 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:42 np0005532585.localdomain ceph-mon[300199]: osdmap e132: 6 total, 6 up, 6 in
Nov 23 10:02:42 np0005532585.localdomain ceph-mon[300199]: pgmap v267: 177 pgs: 177 active+clean; 441 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 220 KiB/s rd, 67 MiB/s wr, 340 op/s
Nov 23 10:02:42 np0005532585.localdomain dnsmasq[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/addn_hosts - 0 addresses
Nov 23 10:02:42 np0005532585.localdomain dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/host
Nov 23 10:02:42 np0005532585.localdomain dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/opts
Nov 23 10:02:42 np0005532585.localdomain podman[325896]: 2025-11-23 10:02:42.474199472 +0000 UTC m=+0.069052309 container kill 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent [None req-3bd5f328-01ff-4a4c-b780-349c7b2100a4 - - - - - -] Unable to reload_allocations dhcp for ae5e7bf6-ef74-47f3-9aa4-c47f1574f753.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap020b8cf6-ea not found in namespace qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753.
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap020b8cf6-ea not found in namespace qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753.
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent 
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.540 263258 INFO neutron.agent.dhcp.agent [None req-fdec6d1d-4c7e-426c-ac37-18163735a0c1 - - - - - -] Synchronizing state
Nov 23 10:02:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:42Z|00375|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:42.668 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.826 263258 INFO neutron.agent.dhcp.agent [None req-de0abd54-a01f-4d36-8e28-7895c77a337f - - - - - -] All active networks have been fetched through RPC.
Nov 23 10:02:42 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:42.930 2 INFO neutron.agent.securitygroups_rpc [None req-1cdbc8e1-ca0b-46de-ad57-ff34c29d4e3c fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['7d97459a-8496-4621-8cc6-1521c3f526b4']
Nov 23 10:02:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e133 e133: 6 total, 6 up, 6 in
Nov 23 10:02:43 np0005532585.localdomain dnsmasq[325877]: exiting on receipt of SIGTERM
Nov 23 10:02:43 np0005532585.localdomain podman[325928]: 2025-11-23 10:02:43.039485116 +0000 UTC m=+0.078492119 container kill 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:02:43 np0005532585.localdomain systemd[1]: libpod-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9.scope: Deactivated successfully.
Nov 23 10:02:43 np0005532585.localdomain podman[325941]: 2025-11-23 10:02:43.121925875 +0000 UTC m=+0.069494841 container died 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:02:43 np0005532585.localdomain systemd[1]: tmp-crun.lEB8Rn.mount: Deactivated successfully.
Nov 23 10:02:43 np0005532585.localdomain podman[325941]: 2025-11-23 10:02:43.173555916 +0000 UTC m=+0.121124842 container cleanup 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:02:43 np0005532585.localdomain systemd[1]: libpod-conmon-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9.scope: Deactivated successfully.
Nov 23 10:02:43 np0005532585.localdomain podman[325948]: 2025-11-23 10:02:43.22498416 +0000 UTC m=+0.155300764 container remove 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:02:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:43.273 263258 INFO neutron.agent.dhcp.agent [-] Starting network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration
Nov 23 10:02:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:43.275 263258 INFO neutron.agent.dhcp.agent [-] Finished network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration
Nov 23 10:02:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:43.275 263258 INFO neutron.agent.dhcp.agent [None req-54274fa7-429b-40de-9645-e5a1e1ac12d0 - - - - - -] Synchronizing state complete
Nov 23 10:02:43 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:43.345 2 INFO neutron.agent.securitygroups_rpc [None req-9fd2c3cf-c790-41eb-bd9f-c36dc00051cd fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['7d97459a-8496-4621-8cc6-1521c3f526b4']
Nov 23 10:02:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:43 np0005532585.localdomain ceph-mon[300199]: osdmap e133: 6 total, 6 up, 6 in
Nov 23 10:02:43 np0005532585.localdomain ceph-mon[300199]: pgmap v269: 177 pgs: 177 active+clean; 441 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 176 KiB/s rd, 54 MiB/s wr, 272 op/s
Nov 23 10:02:43 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:02:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-e31e5055d0d197213df95c02fe47a3d397ca129ef887435c02f1f8b6de8338e2-merged.mount: Deactivated successfully.
Nov 23 10:02:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:44 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dae5e7bf6\x2def74\x2d47f3\x2d9aa4\x2dc47f1574f753.mount: Deactivated successfully.
Nov 23 10:02:44 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:44.917 2 INFO neutron.agent.securitygroups_rpc [None req-c88be53d-2b25-4bbf-93ce-69a72dd56cf0 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:45 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:45.226 2 INFO neutron.agent.securitygroups_rpc [None req-3637ef8e-b231-4ebc-b0ad-62e8972d9361 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:45.386 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:45.386 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:02:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:45.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:45 np0005532585.localdomain ceph-mon[300199]: pgmap v270: 177 pgs: 177 active+clean; 785 MiB data, 2.6 GiB used, 39 GiB / 42 GiB avail; 269 KiB/s rd, 101 MiB/s wr, 425 op/s
Nov 23 10:02:45 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: tmp-crun.XuoVkk.mount: Deactivated successfully.
Nov 23 10:02:46 np0005532585.localdomain podman[325966]: 2025-11-23 10:02:46.058618371 +0000 UTC m=+0.110486105 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:02:46 np0005532585.localdomain podman[325966]: 2025-11-23 10:02:46.10044709 +0000 UTC m=+0.152314834 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: tmp-crun.ukJ7ks.mount: Deactivated successfully.
Nov 23 10:02:46 np0005532585.localdomain podman[325984]: 2025-11-23 10:02:46.170696764 +0000 UTC m=+0.085557507 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 23 10:02:46 np0005532585.localdomain podman[325984]: 2025-11-23 10:02:46.176443071 +0000 UTC m=+0.091303844 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:02:46 np0005532585.localdomain podman[325983]: 2025-11-23 10:02:46.239423841 +0000 UTC m=+0.159645989 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 10:02:46 np0005532585.localdomain podman[325983]: 2025-11-23 10:02:46.287311586 +0000 UTC m=+0.207533674 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:02:46 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:02:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:46.342 263258 INFO neutron.agent.linux.ip_lib [None req-4bbb4871-c239-4103-a3fc-3f5ce119039b - - - - - -] Device tap7f7bd95f-df cannot be used as it has no MAC address
Nov 23 10:02:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:46.346 2 INFO neutron.agent.securitygroups_rpc [None req-c48e4a17-f990-4551-a604-7a1f5baad78f fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.403 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain kernel: device tap7f7bd95f-df entered promiscuous mode
Nov 23 10:02:46 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892166.4092] manager: (tap7f7bd95f-df): new Generic device (/org/freedesktop/NetworkManager/Devices/62)
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain systemd-udevd[326031]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:46 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:46Z|00376|binding|INFO|Claiming lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad for this chassis.
Nov 23 10:02:46 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:46Z|00377|binding|INFO|7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad: Claiming unknown
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.443 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:46Z|00378|binding|INFO|Setting lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad ovn-installed in OVS
Nov 23 10:02:46 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:46Z|00379|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.469 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:46Z|00380|binding|INFO|Setting lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad up in Southbound
Nov 23 10:02:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:46.483 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=983e35fe-65e3-4c0c-9ba9-9421db3b9faf, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:46.484 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad in datapath 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 bound to our chassis
Nov 23 10:02:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:46.486 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:46.487 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a42538-08ad-440a-941f-5e911d21eca9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.595 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:46.613 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:46.897 2 INFO neutron.agent.securitygroups_rpc [None req-69aab5f7-414f-4919-8a4c-b4fcd82ff545 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:46.953 2 INFO neutron.agent.securitygroups_rpc [None req-19349ae1-ccba-4615-bdb3-c092a3aa234c 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:47 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:47.192 2 INFO neutron.agent.securitygroups_rpc [None req-a4966cd3-a78e-4723-9b2c-0105f97058c4 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:47 np0005532585.localdomain podman[326086]: 
Nov 23 10:02:47 np0005532585.localdomain podman[326086]: 2025-11-23 10:02:47.324730155 +0000 UTC m=+0.088255851 container create 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:02:47 np0005532585.localdomain systemd[1]: Started libpod-conmon-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f.scope.
Nov 23 10:02:47 np0005532585.localdomain podman[326086]: 2025-11-23 10:02:47.277929133 +0000 UTC m=+0.041454829 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:47 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:47 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c3fea95cfcf9b4849df2e2cc958865fbf3b563efeef0cb306d35e7e353e4421/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:47 np0005532585.localdomain podman[326086]: 2025-11-23 10:02:47.396760663 +0000 UTC m=+0.160286319 container init 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:02:47 np0005532585.localdomain podman[326086]: 2025-11-23 10:02:47.40314853 +0000 UTC m=+0.166674186 container start 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:47 np0005532585.localdomain dnsmasq[326102]: started, version 2.85 cachesize 150
Nov 23 10:02:47 np0005532585.localdomain dnsmasq[326102]: DNS service limited to local subnets
Nov 23 10:02:47 np0005532585.localdomain dnsmasq[326102]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:47 np0005532585.localdomain dnsmasq[326102]: warning: no upstream servers configured
Nov 23 10:02:47 np0005532585.localdomain dnsmasq-dhcp[326102]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:47 np0005532585.localdomain dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 0 addresses
Nov 23 10:02:47 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host
Nov 23 10:02:47 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts
Nov 23 10:02:47 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:47.451 263258 INFO neutron.agent.dhcp.agent [None req-4bbb4871-c239-4103-a3fc-3f5ce119039b - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:46Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ba96d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ba9820>], id=c964b4e6-771f-468b-9ae9-3eb34c34ed00, ip_allocation=immediate, mac_address=fa:16:3e:5c:5b:d7, name=tempest-PortsIpV6TestJSON-1450374320, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:43Z, description=, dns_domain=, id=5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-73862936, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, subnets=['8a70a731-5cef-4632-a106-5587ddd28431'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:45Z, vlan_transparent=None, network_id=5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d7ead8f7-80d5-4103-ab91-19b87956485a'], standard_attr_id=2012, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:46Z on network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4
Nov 23 10:02:47 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:47.555 2 INFO neutron.agent.securitygroups_rpc [None req-e2113fc4-8297-4e83-8bfc-91b8d2f12d0b fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: pgmap v271: 177 pgs: 177 active+clean; 785 MiB data, 2.6 GiB used, 39 GiB / 42 GiB avail; 92 KiB/s rd, 43 MiB/s wr, 149 op/s
Nov 23 10:02:47 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:47.566 263258 INFO neutron.agent.dhcp.agent [None req-dfc703c3-43ca-4430-b00e-1537333d7d9f - - - - - -] DHCP configuration for ports {'dcb77374-8f77-4c3a-9585-ee35bed6f389'} is completed
Nov 23 10:02:47 np0005532585.localdomain dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 1 addresses
Nov 23 10:02:47 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host
Nov 23 10:02:47 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts
Nov 23 10:02:47 np0005532585.localdomain podman[326118]: 2025-11-23 10:02:47.621986982 +0000 UTC m=+0.043273415 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:47 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:47.954 263258 INFO neutron.agent.dhcp.agent [None req-0d9872b0-fd4d-4d03-9556-59f812560fcd - - - - - -] DHCP configuration for ports {'c964b4e6-771f-468b-9ae9-3eb34c34ed00'} is completed
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e134 e134: 6 total, 6 up, 6 in
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.984827) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892167984972, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1210, "num_deletes": 259, "total_data_size": 1566141, "memory_usage": 1591808, "flush_reason": "Manual Compaction"}
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892167992229, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 813419, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24308, "largest_seqno": 25513, "table_properties": {"data_size": 808882, "index_size": 2072, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11666, "raw_average_key_size": 21, "raw_value_size": 799296, "raw_average_value_size": 1482, "num_data_blocks": 90, "num_entries": 539, "num_filter_entries": 539, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892107, "oldest_key_time": 1763892107, "file_creation_time": 1763892167, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 7458 microseconds, and 3491 cpu microseconds.
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.992282) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 813419 bytes OK
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.992304) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995277) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995299) EVENT_LOG_v1 {"time_micros": 1763892167995292, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995320) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1560160, prev total WAL file size 1560160, number of live WAL files 2.
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.996013) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303037' seq:72057594037927935, type:22 .. '6D6772737461740034323539' seq:0, type:0; will stop at (end)
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(794KB)], [42(15MB)]
Nov 23 10:02:47 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892167996070, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16843888, "oldest_snapshot_seqno": -1}
Nov 23 10:02:48 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:48.008 2 INFO neutron.agent.securitygroups_rpc [None req-11137031-a253-41b4-b5b7-0c84953c1da3 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12451 keys, 14927692 bytes, temperature: kUnknown
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892168091411, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 14927692, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14860152, "index_size": 35311, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31173, "raw_key_size": 337081, "raw_average_key_size": 27, "raw_value_size": 14651174, "raw_average_value_size": 1176, "num_data_blocks": 1309, "num_entries": 12451, "num_filter_entries": 12451, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892167, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.091758) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 14927692 bytes
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.093979) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.4 rd, 156.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 15.3 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(39.1) write-amplify(18.4) OK, records in: 12958, records dropped: 507 output_compression: NoCompression
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.094007) EVENT_LOG_v1 {"time_micros": 1763892168093995, "job": 24, "event": "compaction_finished", "compaction_time_micros": 95460, "compaction_time_cpu_micros": 45948, "output_level": 6, "num_output_files": 1, "total_output_size": 14927692, "num_input_records": 12958, "num_output_records": 12451, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892168094234, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892168096578, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995870) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096725) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:02:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:48.388 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:48 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:48.641 2 INFO neutron.agent.securitygroups_rpc [None req-edba85f2-3d5d-4fd2-9504-24d71c7cf655 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:48 np0005532585.localdomain ceph-mon[300199]: osdmap e134: 6 total, 6 up, 6 in
Nov 23 10:02:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:49.209 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:46Z, description=, device_id=629023cb-7ed5-4aab-8f47-b87316f2c0e8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a91490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a912b0>], id=c964b4e6-771f-468b-9ae9-3eb34c34ed00, ip_allocation=immediate, mac_address=fa:16:3e:5c:5b:d7, name=tempest-PortsIpV6TestJSON-1450374320, network_id=5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['d7ead8f7-80d5-4103-ab91-19b87956485a'], standard_attr_id=2012, status=ACTIVE, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:47Z on network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4
Nov 23 10:02:49 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:49.211 2 INFO neutron.agent.securitygroups_rpc [None req-d908f8b2-2302-4d45-805c-4d35b6de9b08 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:49 np0005532585.localdomain dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 1 addresses
Nov 23 10:02:49 np0005532585.localdomain podman[326157]: 2025-11-23 10:02:49.405300198 +0000 UTC m=+0.060225336 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:02:49 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host
Nov 23 10:02:49 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts
Nov 23 10:02:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:49.636 263258 INFO neutron.agent.dhcp.agent [None req-ed607fe6-b388-435a-a837-eee2463de4b1 - - - - - -] DHCP configuration for ports {'c964b4e6-771f-468b-9ae9-3eb34c34ed00'} is completed
Nov 23 10:02:49 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:49.955 2 INFO neutron.agent.securitygroups_rpc [None req-765ea533-9294-4729-95dc-8b0b8371a481 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:02:49 np0005532585.localdomain ceph-mon[300199]: pgmap v273: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 150 KiB/s rd, 91 MiB/s wr, 255 op/s
Nov 23 10:02:50 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:50.135 2 INFO neutron.agent.securitygroups_rpc [None req-838089d2-b106-4cd8-a2d9-7d5c71329675 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']
Nov 23 10:02:50 np0005532585.localdomain dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 0 addresses
Nov 23 10:02:50 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host
Nov 23 10:02:50 np0005532585.localdomain dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts
Nov 23 10:02:50 np0005532585.localdomain podman[326194]: 2025-11-23 10:02:50.164676531 +0000 UTC m=+0.057047328 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:02:50 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:50Z|00381|binding|INFO|Releasing lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad from this chassis (sb_readonly=0)
Nov 23 10:02:50 np0005532585.localdomain kernel: device tap7f7bd95f-df left promiscuous mode
Nov 23 10:02:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:50.349 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:50 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:50Z|00382|binding|INFO|Setting lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad down in Southbound
Nov 23 10:02:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:50.362 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=983e35fe-65e3-4c0c-9ba9-9421db3b9faf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:50.363 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad in datapath 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 unbound from our chassis
Nov 23 10:02:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:50.365 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:50.369 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0fdf78eb-f3c4-4913-998e-4b587a7991b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:50.371 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:51 np0005532585.localdomain dnsmasq[326102]: exiting on receipt of SIGTERM
Nov 23 10:02:51 np0005532585.localdomain systemd[1]: libpod-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f.scope: Deactivated successfully.
Nov 23 10:02:51 np0005532585.localdomain podman[326233]: 2025-11-23 10:02:51.495697944 +0000 UTC m=+0.053481049 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:51 np0005532585.localdomain ceph-mon[300199]: pgmap v274: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 141 KiB/s rd, 86 MiB/s wr, 240 op/s
Nov 23 10:02:51 np0005532585.localdomain podman[326247]: 2025-11-23 10:02:51.548711697 +0000 UTC m=+0.039930841 container died 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 10:02:51 np0005532585.localdomain systemd[1]: tmp-crun.W8YJHW.mount: Deactivated successfully.
Nov 23 10:02:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:51.640 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:51 np0005532585.localdomain podman[326247]: 2025-11-23 10:02:51.643728604 +0000 UTC m=+0.134947728 container cleanup 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:02:51 np0005532585.localdomain systemd[1]: libpod-conmon-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f.scope: Deactivated successfully.
Nov 23 10:02:51 np0005532585.localdomain podman[326249]: 2025-11-23 10:02:51.707803458 +0000 UTC m=+0.193174682 container remove 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:02:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:51.734 263258 INFO neutron.agent.dhcp.agent [None req-2a991f53-4758-4e67-afd2-53a3ed94172e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:51.736 263258 INFO neutron.agent.dhcp.agent [None req-2a991f53-4758-4e67-afd2-53a3ed94172e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:51Z|00383|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:51 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:51.859 2 INFO neutron.agent.securitygroups_rpc [None req-8ada9fea-3f3e-487c-9d7c-c67e94466d69 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['2de10e3b-e1e6-47ac-8eeb-13eb3642fef8']
Nov 23 10:02:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:51.871 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-5c3fea95cfcf9b4849df2e2cc958865fbf3b563efeef0cb306d35e7e353e4421-merged.mount: Deactivated successfully.
Nov 23 10:02:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:52 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d5facb45a\x2dbb59\x2d481c\x2db7e6\x2ddbeb21aaf8b4.mount: Deactivated successfully.
Nov 23 10:02:53 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:53.223 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:53 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:53.927 263258 INFO neutron.agent.linux.ip_lib [None req-79818a7e-6681-46d6-95ae-9b562351bf83 - - - - - -] Device tap924d747d-30 cannot be used as it has no MAC address
Nov 23 10:02:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:53.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:53 np0005532585.localdomain kernel: device tap924d747d-30 entered promiscuous mode
Nov 23 10:02:53 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892173.9811] manager: (tap924d747d-30): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Nov 23 10:02:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:53Z|00384|binding|INFO|Claiming lport 924d747d-3069-493d-890d-d22289f6cb63 for this chassis.
Nov 23 10:02:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:53Z|00385|binding|INFO|924d747d-3069-493d-890d-d22289f6cb63: Claiming unknown
Nov 23 10:02:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:53.982 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:53 np0005532585.localdomain systemd-udevd[326286]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1480543529' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1480543529' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1652389115' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1652389115' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:53 np0005532585.localdomain ceph-mon[300199]: pgmap v275: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 120 KiB/s rd, 73 MiB/s wr, 204 op/s
Nov 23 10:02:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:53.993 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c46eb8d9-f4c4-4e34-a4e0-3a70a48cb8ea, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=924d747d-3069-493d-890d-d22289f6cb63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:53.995 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 924d747d-3069-493d-890d-d22289f6cb63 in datapath 0d2bb8b4-9b3e-41c7-b595-54664cfb433a bound to our chassis
Nov 23 10:02:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:53.996 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:53.997 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f07d5b66-bcdc-4172-8630-51d1052c3365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e135 e135: 6 total, 6 up, 6 in
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.012 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:54Z|00386|binding|INFO|Setting lport 924d747d-3069-493d-890d-d22289f6cb63 ovn-installed in OVS
Nov 23 10:02:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:54Z|00387|binding|INFO|Setting lport 924d747d-3069-493d-890d-d22289f6cb63 up in Southbound
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.017 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap924d747d-30: No such device
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.057 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:54.181 263258 INFO neutron.agent.linux.ip_lib [None req-5eec097b-deb4-454c-8c91-216fc47067b4 - - - - - -] Device tap6f155903-a3 cannot be used as it has no MAC address
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.203 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain kernel: device tap6f155903-a3 entered promiscuous mode
Nov 23 10:02:54 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892174.2085] manager: (tap6f155903-a3): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:54Z|00388|binding|INFO|Claiming lport 6f155903-a394-40bc-9c4e-04010e974788 for this chassis.
Nov 23 10:02:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:54Z|00389|binding|INFO|6f155903-a394-40bc-9c4e-04010e974788: Claiming unknown
Nov 23 10:02:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:54.227 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d782672f-ba9a-4b1f-9286-2b53b24a21c0, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=6f155903-a394-40bc-9c4e-04010e974788) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:54.228 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6f155903-a394-40bc-9c4e-04010e974788 in datapath 725f5f75-c3ef-4a36-ba95-e1cd3131878c bound to our chassis
Nov 23 10:02:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:54.229 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 725f5f75-c3ef-4a36-ba95-e1cd3131878c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:54 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:54.229 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[51e82432-45aa-478f-b12f-4e40b93fcaea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:54Z|00390|binding|INFO|Setting lport 6f155903-a394-40bc-9c4e-04010e974788 ovn-installed in OVS
Nov 23 10:02:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:54Z|00391|binding|INFO|Setting lport 6f155903-a394-40bc-9c4e-04010e974788 up in Southbound
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.262 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.307 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:54.314 2 INFO neutron.agent.securitygroups_rpc [None req-70f0f8e2-9250-4661-ae6f-f58e5f669345 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['dc864c5c-de53-475b-960e-083ffe4e3e6b']
Nov 23 10:02:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:54.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:54 np0005532585.localdomain podman[326398]: 
Nov 23 10:02:54 np0005532585.localdomain podman[326398]: 2025-11-23 10:02:54.98521649 +0000 UTC m=+0.085532405 container create 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:02:55 np0005532585.localdomain ceph-mon[300199]: osdmap e135: 6 total, 6 up, 6 in
Nov 23 10:02:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1512625466' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1512625466' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: Started libpod-conmon-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495.scope.
Nov 23 10:02:55 np0005532585.localdomain podman[326398]: 2025-11-23 10:02:54.937557092 +0000 UTC m=+0.037873017 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:55 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6e4c512250f0e320013f079dacd19dfb69dd5096921c83fb30670bcf3ae1a91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:55 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:55.062 2 INFO neutron.agent.securitygroups_rpc [None req-ebacb72f-43a4-4e7f-beed-d94b86f532ab fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['dc864c5c-de53-475b-960e-083ffe4e3e6b']
Nov 23 10:02:55 np0005532585.localdomain podman[326398]: 2025-11-23 10:02:55.109288732 +0000 UTC m=+0.209604607 container init 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:02:55 np0005532585.localdomain podman[326398]: 2025-11-23 10:02:55.119092024 +0000 UTC m=+0.219407919 container start 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326434]: started, version 2.85 cachesize 150
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326434]: DNS service limited to local subnets
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326434]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326434]: warning: no upstream servers configured
Nov 23 10:02:55 np0005532585.localdomain dnsmasq-dhcp[326434]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326434]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 10:02:55 np0005532585.localdomain dnsmasq-dhcp[326434]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:02:55 np0005532585.localdomain dnsmasq-dhcp[326434]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326434]: exiting on receipt of SIGTERM
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: libpod-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495.scope: Deactivated successfully.
Nov 23 10:02:55 np0005532585.localdomain podman[326441]: 2025-11-23 10:02:55.219640692 +0000 UTC m=+0.069508752 container died 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:02:55 np0005532585.localdomain podman[326441]: 2025-11-23 10:02:55.247019926 +0000 UTC m=+0.096887976 container cleanup 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:02:55 np0005532585.localdomain podman[326454]: 2025-11-23 10:02:55.267471405 +0000 UTC m=+0.046684939 container cleanup 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: libpod-conmon-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495.scope: Deactivated successfully.
Nov 23 10:02:55 np0005532585.localdomain podman[326469]: 2025-11-23 10:02:55.339641799 +0000 UTC m=+0.075643251 container remove 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:55.395 263258 INFO neutron.agent.dhcp.agent [None req-d5e2aec0-a188-4402-9b7e-a9bf1d6a375a - - - - - -] DHCP configuration for ports {'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed
Nov 23 10:02:55 np0005532585.localdomain podman[326485]: 
Nov 23 10:02:55 np0005532585.localdomain podman[326485]: 2025-11-23 10:02:55.435081489 +0000 UTC m=+0.078369315 container create 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: Started libpod-conmon-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf.scope.
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:55 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b5117719bced5fdac21fc3bb297cfd580c490322e10e374d90cdd191e13b6d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:55 np0005532585.localdomain podman[326485]: 2025-11-23 10:02:55.401618468 +0000 UTC m=+0.044906284 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:55 np0005532585.localdomain podman[326485]: 2025-11-23 10:02:55.512600606 +0000 UTC m=+0.155888422 container init 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 10:02:55 np0005532585.localdomain podman[326485]: 2025-11-23 10:02:55.521727218 +0000 UTC m=+0.165015034 container start 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326522]: started, version 2.85 cachesize 150
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326522]: DNS service limited to local subnets
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326522]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326522]: warning: no upstream servers configured
Nov 23 10:02:55 np0005532585.localdomain dnsmasq-dhcp[326522]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:55 np0005532585.localdomain dnsmasq[326522]: read /var/lib/neutron/dhcp/725f5f75-c3ef-4a36-ba95-e1cd3131878c/addn_hosts - 0 addresses
Nov 23 10:02:55 np0005532585.localdomain dnsmasq-dhcp[326522]: read /var/lib/neutron/dhcp/725f5f75-c3ef-4a36-ba95-e1cd3131878c/host
Nov 23 10:02:55 np0005532585.localdomain dnsmasq-dhcp[326522]: read /var/lib/neutron/dhcp/725f5f75-c3ef-4a36-ba95-e1cd3131878c/opts
Nov 23 10:02:55 np0005532585.localdomain podman[326502]: 2025-11-23 10:02:55.59096827 +0000 UTC m=+0.094787441 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:02:55 np0005532585.localdomain podman[326502]: 2025-11-23 10:02:55.59874294 +0000 UTC m=+0.102562071 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:02:55 np0005532585.localdomain podman[326501]: 2025-11-23 10:02:55.631653894 +0000 UTC m=+0.139479678 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:02:55 np0005532585.localdomain podman[326501]: 2025-11-23 10:02:55.642062535 +0000 UTC m=+0.149888329 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:02:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:55.743 263258 INFO neutron.agent.dhcp.agent [None req-3a164df1-a840-4abe-9000-4148e2e6778d - - - - - -] DHCP configuration for ports {'02b7bafb-bf4b-46dd-b7b7-250c2ecb1918'} is completed
Nov 23 10:02:55 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:55.749 2 INFO neutron.agent.securitygroups_rpc [None req-bb083733-91fb-4356-a91b-0dce7c35cb96 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d9259f0b-7c30-4dce-b81e-e0f698e442c7']
Nov 23 10:02:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:55.802 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:55Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ba90d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa6d00>], id=5f5b1b33-9dfa-4cc9-8666-90efeaefde4c, ip_allocation=immediate, mac_address=fa:16:3e:8c:3c:b5, name=tempest-PortsIpV6TestJSON-439196546, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:03Z, description=, dns_domain=, id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-274315444, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8521, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['c0c35879-4b70-4336-bd77-6c177763c3a9'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:52Z, vlan_transparent=None, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d9259f0b-7c30-4dce-b81e-e0f698e442c7'], standard_attr_id=2074, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:55Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: tmp-crun.BYAoSg.mount: Deactivated successfully.
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-c6e4c512250f0e320013f079dacd19dfb69dd5096921c83fb30670bcf3ae1a91-merged.mount: Deactivated successfully.
Nov 23 10:02:55 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e136 e136: 6 total, 6 up, 6 in
Nov 23 10:02:56 np0005532585.localdomain ceph-mon[300199]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 833 MiB used, 41 GiB / 42 GiB avail; 151 KiB/s rd, 48 MiB/s wr, 263 op/s
Nov 23 10:02:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1427395697' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:02:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1427395697' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:02:56 np0005532585.localdomain podman[326569]: 
Nov 23 10:02:56 np0005532585.localdomain podman[326569]: 2025-11-23 10:02:56.354839892 +0000 UTC m=+0.089167638 container create 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 23 10:02:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b.scope.
Nov 23 10:02:56 np0005532585.localdomain podman[326569]: 2025-11-23 10:02:56.314653745 +0000 UTC m=+0.048981521 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09f567e55101099cb00d6486a716f6879b35e236f2b8a7ee5f08c552a21d3b24/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:56 np0005532585.localdomain podman[326569]: 2025-11-23 10:02:56.438631173 +0000 UTC m=+0.172958919 container init 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:02:56 np0005532585.localdomain podman[326569]: 2025-11-23 10:02:56.447594829 +0000 UTC m=+0.181922575 container start 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:56 np0005532585.localdomain dnsmasq[326587]: started, version 2.85 cachesize 150
Nov 23 10:02:56 np0005532585.localdomain dnsmasq[326587]: DNS service limited to local subnets
Nov 23 10:02:56 np0005532585.localdomain dnsmasq[326587]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:56 np0005532585.localdomain dnsmasq[326587]: warning: no upstream servers configured
Nov 23 10:02:56 np0005532585.localdomain dnsmasq-dhcp[326587]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:56 np0005532585.localdomain dnsmasq[326587]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses
Nov 23 10:02:56 np0005532585.localdomain dnsmasq-dhcp[326587]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:02:56 np0005532585.localdomain dnsmasq-dhcp[326587]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:02:56 np0005532585.localdomain sshd[326588]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:02:56 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:56.621 2 INFO neutron.agent.securitygroups_rpc [None req-79d36424-581f-4db8-8c43-da9cef64debb fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['2b80d24e-7954-4669-8041-3d535b2f9be2']
Nov 23 10:02:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:56.682 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:56 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:56.761 2 INFO neutron.agent.securitygroups_rpc [None req-aea3005f-d734-4d7a-a8c4-fa6242ccbee5 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['2b80d24e-7954-4669-8041-3d535b2f9be2']
Nov 23 10:02:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:56.830 263258 INFO neutron.agent.dhcp.agent [None req-1778e3b2-e5ea-4ffa-b427-fb26a234b807 - - - - - -] DHCP configuration for ports {'5f5b1b33-9dfa-4cc9-8666-90efeaefde4c'} is completed
Nov 23 10:02:56 np0005532585.localdomain sshd[326588]: Received disconnect from 107.172.15.139 port 58564:11: Bye Bye [preauth]
Nov 23 10:02:56 np0005532585.localdomain sshd[326588]: Disconnected from authenticating user root 107.172.15.139 port 58564 [preauth]
Nov 23 10:02:57 np0005532585.localdomain ceph-mon[300199]: osdmap e136: 6 total, 6 up, 6 in
Nov 23 10:02:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:57.330 263258 INFO neutron.agent.linux.ip_lib [None req-ff26edd0-4855-4b74-b7cf-72c484335e81 - - - - - -] Device tapa3af0bf2-76 cannot be used as it has no MAC address
Nov 23 10:02:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:57.354 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:57 np0005532585.localdomain kernel: device tapa3af0bf2-76 entered promiscuous mode
Nov 23 10:02:57 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892177.3620] manager: (tapa3af0bf2-76): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Nov 23 10:02:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:57.365 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:57Z|00392|binding|INFO|Claiming lport a3af0bf2-7636-468b-88ec-a6aa42638a50 for this chassis.
Nov 23 10:02:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:57Z|00393|binding|INFO|a3af0bf2-7636-468b-88ec-a6aa42638a50: Claiming unknown
Nov 23 10:02:57 np0005532585.localdomain systemd-udevd[326635]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:02:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:57.376 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08aa4c09-13b3-4d24-b03c-d932cc7d0bac, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a3af0bf2-7636-468b-88ec-a6aa42638a50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:57.377 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a3af0bf2-7636-468b-88ec-a6aa42638a50 in datapath 3f022321-7f06-4d92-8d47-80c086661f24 bound to our chassis
Nov 23 10:02:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:57.377 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f022321-7f06-4d92-8d47-80c086661f24 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:57.378 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9c50b0a9-3b45-48f1-b9e5-c42dcd113dc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:57Z|00394|binding|INFO|Setting lport a3af0bf2-7636-468b-88ec-a6aa42638a50 ovn-installed in OVS
Nov 23 10:02:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:57Z|00395|binding|INFO|Setting lport a3af0bf2-7636-468b-88ec-a6aa42638a50 up in Southbound
Nov 23 10:02:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:57.417 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:57 np0005532585.localdomain systemd[1]: tmp-crun.QoC8l1.mount: Deactivated successfully.
Nov 23 10:02:57 np0005532585.localdomain dnsmasq[326587]: exiting on receipt of SIGTERM
Nov 23 10:02:57 np0005532585.localdomain podman[326621]: 2025-11-23 10:02:57.429060224 +0000 UTC m=+0.094467621 container kill 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:02:57 np0005532585.localdomain systemd[1]: libpod-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b.scope: Deactivated successfully.
Nov 23 10:02:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:57.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:57 np0005532585.localdomain podman[326647]: 2025-11-23 10:02:57.53339065 +0000 UTC m=+0.085324722 container died 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:02:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:57.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:57 np0005532585.localdomain podman[326647]: 2025-11-23 10:02:57.584021518 +0000 UTC m=+0.135955580 container remove 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:02:57 np0005532585.localdomain systemd[1]: libpod-conmon-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b.scope: Deactivated successfully.
Nov 23 10:02:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e137 e137: 6 total, 6 up, 6 in
Nov 23 10:02:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-09f567e55101099cb00d6486a716f6879b35e236f2b8a7ee5f08c552a21d3b24-merged.mount: Deactivated successfully.
Nov 23 10:02:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:58 np0005532585.localdomain ceph-mon[300199]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 833 MiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 4.4 KiB/s wr, 157 op/s
Nov 23 10:02:58 np0005532585.localdomain ceph-mon[300199]: osdmap e137: 6 total, 6 up, 6 in
Nov 23 10:02:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:02:58 np0005532585.localdomain podman[326744]: 
Nov 23 10:02:58 np0005532585.localdomain podman[326744]: 2025-11-23 10:02:58.48847234 +0000 UTC m=+0.083974768 container create 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2.scope.
Nov 23 10:02:58 np0005532585.localdomain podman[326744]: 2025-11-23 10:02:58.443789273 +0000 UTC m=+0.039291721 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/faeca6c7b92d4743d8ee7c6a15ec1a553cb548aadde1156e887861619142f300/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:58 np0005532585.localdomain podman[326744]: 2025-11-23 10:02:58.568315669 +0000 UTC m=+0.163818087 container init 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:02:58 np0005532585.localdomain podman[326744]: 2025-11-23 10:02:58.582334431 +0000 UTC m=+0.177836849 container start 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:02:58 np0005532585.localdomain dnsmasq[326765]: started, version 2.85 cachesize 150
Nov 23 10:02:58 np0005532585.localdomain dnsmasq[326765]: DNS service limited to local subnets
Nov 23 10:02:58 np0005532585.localdomain dnsmasq[326765]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:58 np0005532585.localdomain dnsmasq[326765]: warning: no upstream servers configured
Nov 23 10:02:58 np0005532585.localdomain dnsmasq-dhcp[326765]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:58 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:58.588 2 INFO neutron.agent.securitygroups_rpc [None req-f9017110-ec89-4835-91db-9aa9b814a12e 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d9259f0b-7c30-4dce-b81e-e0f698e442c7', '469976a2-fa36-45e6-842e-95bc93db1438']
Nov 23 10:02:58 np0005532585.localdomain dnsmasq[326765]: read /var/lib/neutron/dhcp/3f022321-7f06-4d92-8d47-80c086661f24/addn_hosts - 0 addresses
Nov 23 10:02:58 np0005532585.localdomain dnsmasq-dhcp[326765]: read /var/lib/neutron/dhcp/3f022321-7f06-4d92-8d47-80c086661f24/host
Nov 23 10:02:58 np0005532585.localdomain dnsmasq-dhcp[326765]: read /var/lib/neutron/dhcp/3f022321-7f06-4d92-8d47-80c086661f24/opts
Nov 23 10:02:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:58.756 263258 INFO neutron.agent.dhcp.agent [None req-aeaa0fdd-97d9-4ed2-922c-af644ee6c37c - - - - - -] DHCP configuration for ports {'93ccaa6b-1471-4b5e-9c29-91c9cd633700'} is completed
Nov 23 10:02:58 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:58.768 2 INFO neutron.agent.securitygroups_rpc [None req-17703c58-f506-4a75-8387-af7c0c3c8d74 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']
Nov 23 10:02:58 np0005532585.localdomain podman[326797]: 
Nov 23 10:02:58 np0005532585.localdomain dnsmasq[326765]: exiting on receipt of SIGTERM
Nov 23 10:02:58 np0005532585.localdomain podman[326816]: 2025-11-23 10:02:58.903201706 +0000 UTC m=+0.066953534 container kill 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:02:58 np0005532585.localdomain systemd[1]: libpod-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2.scope: Deactivated successfully.
Nov 23 10:02:58 np0005532585.localdomain podman[326797]: 2025-11-23 10:02:58.946990994 +0000 UTC m=+0.174215137 container create c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:58 np0005532585.localdomain podman[326797]: 2025-11-23 10:02:58.864080161 +0000 UTC m=+0.091304334 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:02:58 np0005532585.localdomain podman[326837]: 2025-11-23 10:02:58.976839464 +0000 UTC m=+0.052601012 container died 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:02:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97.scope.
Nov 23 10:02:58 np0005532585.localdomain systemd[1]: tmp-crun.5EqRcj.mount: Deactivated successfully.
Nov 23 10:02:59 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:02:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecece6215fcae68135ad77444848607179a7db1f65cb11807dd870c2d1e6781b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:02:59 np0005532585.localdomain podman[326797]: 2025-11-23 10:02:59.025432291 +0000 UTC m=+0.252656444 container init c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:02:59 np0005532585.localdomain systemd[1]: tmp-crun.woFOec.mount: Deactivated successfully.
Nov 23 10:02:59 np0005532585.localdomain podman[326797]: 2025-11-23 10:02:59.042199807 +0000 UTC m=+0.269423950 container start c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: started, version 2.85 cachesize 150
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: DNS service limited to local subnets
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: warning: no upstream servers configured
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:02:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2-userdata-shm.mount: Deactivated successfully.
Nov 23 10:02:59 np0005532585.localdomain podman[326837]: 2025-11-23 10:02:59.082413617 +0000 UTC m=+0.158175155 container remove 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:02:59 np0005532585.localdomain systemd[1]: libpod-conmon-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2.scope: Deactivated successfully.
Nov 23 10:02:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.091 263258 INFO neutron.agent.dhcp.agent [None req-fe5ae86c-31e0-4af7-9f5f-9f08814d9ce1 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:55Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c07fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c904f0>], id=5f5b1b33-9dfa-4cc9-8666-90efeaefde4c, ip_allocation=immediate, mac_address=fa:16:3e:8c:3c:b5, name=tempest-PortsIpV6TestJSON-1148844861, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['469976a2-fa36-45e6-842e-95bc93db1438'], standard_attr_id=2074, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:58Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a
Nov 23 10:02:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:59.122 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 666e00e5-826b-452a-afb5-82b9c58af16a with type ""
Nov 23 10:02:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:59.123 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08aa4c09-13b3-4d24-b03c-d932cc7d0bac, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a3af0bf2-7636-468b-88ec-a6aa42638a50) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:02:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:59.125 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a3af0bf2-7636-468b-88ec-a6aa42638a50 in datapath 3f022321-7f06-4d92-8d47-80c086661f24 unbound from our chassis
Nov 23 10:02:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:59Z|00396|binding|INFO|Removing iface tapa3af0bf2-76 ovn-installed in OVS
Nov 23 10:02:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:59Z|00397|binding|INFO|Removing lport a3af0bf2-7636-468b-88ec-a6aa42638a50 ovn-installed in OVS
Nov 23 10:02:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:59.125 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f022321-7f06-4d92-8d47-80c086661f24 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:02:59 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:02:59.126 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3816a667-0639-42fa-a9c6-7852d39477f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:02:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:59.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:59.149 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:59 np0005532585.localdomain kernel: device tapa3af0bf2-76 left promiscuous mode
Nov 23 10:02:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:59.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.188 263258 INFO neutron.agent.dhcp.agent [None req-3e44c0cc-7fc2-423c-9111-62e2572a2fa5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.189 263258 INFO neutron.agent.dhcp.agent [None req-3e44c0cc-7fc2-423c-9111-62e2572a2fa5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:02:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:59.196 2 INFO neutron.agent.securitygroups_rpc [None req-133cea40-228c-4af6-8f6b-d4d2d5a2eb51 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:02:59 np0005532585.localdomain podman[326882]: 2025-11-23 10:02:59.302355271 +0000 UTC m=+0.065672613 container kill c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:02:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:59.350 2 INFO neutron.agent.securitygroups_rpc [None req-75de0287-b544-4fab-ad82-848c9edeaf4f fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']
Nov 23 10:02:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.504 263258 INFO neutron.agent.dhcp.agent [None req-34e9f92e-d170-472c-ac70-7faa9e9aef37 - - - - - -] DHCP configuration for ports {'5f5b1b33-9dfa-4cc9-8666-90efeaefde4c', '924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed
Nov 23 10:02:59 np0005532585.localdomain ceph-mon[300199]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 817 MiB used, 41 GiB / 42 GiB avail; 168 KiB/s rd, 8.7 KiB/s wr, 270 op/s
Nov 23 10:02:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:59.540 2 INFO neutron.agent.securitygroups_rpc [None req-94b4dea1-0a52-41d4-90a7-1f1aefba76c5 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['469976a2-fa36-45e6-842e-95bc93db1438']
Nov 23 10:02:59 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:02:59Z|00398|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:02:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.630 263258 INFO neutron.agent.dhcp.agent [None req-5070c634-78d7-43c8-ad22-2dbd05c7cbc5 - - - - - -] DHCP configuration for ports {'5f5b1b33-9dfa-4cc9-8666-90efeaefde4c'} is completed
Nov 23 10:02:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:02:59.647 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:02:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:02:59.735 2 INFO neutron.agent.securitygroups_rpc [None req-d3ac2373-b722-49eb-839d-a87ede7d08ac fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']
Nov 23 10:02:59 np0005532585.localdomain dnsmasq[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:02:59 np0005532585.localdomain dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:02:59 np0005532585.localdomain podman[326922]: 2025-11-23 10:02:59.756724218 +0000 UTC m=+0.058015127 container kill c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:02:59 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-faeca6c7b92d4743d8ee7c6a15ec1a553cb548aadde1156e887861619142f300-merged.mount: Deactivated successfully.
Nov 23 10:02:59 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d3f022321\x2d7f06\x2d4d92\x2d8d47\x2d80c086661f24.mount: Deactivated successfully.
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:02:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:02:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:03:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e138 e138: 6 total, 6 up, 6 in
Nov 23 10:03:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1122561443' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1122561443' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:00 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:00.563 2 INFO neutron.agent.securitygroups_rpc [None req-6ed9d977-5486-448d-8f02-a426fdb94759 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']
Nov 23 10:03:01 np0005532585.localdomain dnsmasq[326863]: exiting on receipt of SIGTERM
Nov 23 10:03:01 np0005532585.localdomain systemd[1]: libpod-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97.scope: Deactivated successfully.
Nov 23 10:03:01 np0005532585.localdomain podman[326958]: 2025-11-23 10:03:01.07601416 +0000 UTC m=+0.059675339 container kill c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:03:01 np0005532585.localdomain podman[326970]: 2025-11-23 10:03:01.141984223 +0000 UTC m=+0.053623123 container died c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:03:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:01 np0005532585.localdomain podman[326970]: 2025-11-23 10:03:01.172749891 +0000 UTC m=+0.084388751 container cleanup c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:03:01 np0005532585.localdomain systemd[1]: libpod-conmon-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97.scope: Deactivated successfully.
Nov 23 10:03:01 np0005532585.localdomain podman[326972]: 2025-11-23 10:03:01.227627951 +0000 UTC m=+0.131132950 container remove c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:03:01 np0005532585.localdomain ceph-mon[300199]: osdmap e138: 6 total, 6 up, 6 in
Nov 23 10:03:01 np0005532585.localdomain ceph-mon[300199]: pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 817 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 2.8 KiB/s wr, 60 op/s
Nov 23 10:03:01 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:01.564 2 INFO neutron.agent.securitygroups_rpc [None req-d75c0dda-e48d-4259-9ad6-e58217c5f7b4 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']
Nov 23 10:03:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:01.729 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:01 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:01.733 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:02 np0005532585.localdomain podman[327049]: 
Nov 23 10:03:02 np0005532585.localdomain podman[327049]: 2025-11-23 10:03:02.060541309 +0000 UTC m=+0.064272051 container create f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:02 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ecece6215fcae68135ad77444848607179a7db1f65cb11807dd870c2d1e6781b-merged.mount: Deactivated successfully.
Nov 23 10:03:02 np0005532585.localdomain systemd[1]: Started libpod-conmon-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c.scope.
Nov 23 10:03:02 np0005532585.localdomain systemd[1]: tmp-crun.aHO68g.mount: Deactivated successfully.
Nov 23 10:03:02 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:02 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9d4bc8fd932f615977866e4cf411b544a5d942d899a713116fb40691af4610c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:02 np0005532585.localdomain podman[327049]: 2025-11-23 10:03:02.122167308 +0000 UTC m=+0.125897820 container init f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:03:02 np0005532585.localdomain podman[327049]: 2025-11-23 10:03:02.029874635 +0000 UTC m=+0.033605187 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:02 np0005532585.localdomain podman[327049]: 2025-11-23 10:03:02.130122753 +0000 UTC m=+0.133853265 container start f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:02 np0005532585.localdomain dnsmasq[327068]: started, version 2.85 cachesize 150
Nov 23 10:03:02 np0005532585.localdomain dnsmasq[327068]: DNS service limited to local subnets
Nov 23 10:03:02 np0005532585.localdomain dnsmasq[327068]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:02 np0005532585.localdomain dnsmasq[327068]: warning: no upstream servers configured
Nov 23 10:03:02 np0005532585.localdomain dnsmasq-dhcp[327068]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:03:02 np0005532585.localdomain dnsmasq[327068]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 10:03:02 np0005532585.localdomain dnsmasq-dhcp[327068]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:02 np0005532585.localdomain dnsmasq-dhcp[327068]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:02.507 263258 INFO neutron.agent.dhcp.agent [None req-a291d9a4-9639-41e7-a6e4-11c4541309a3 - - - - - -] DHCP configuration for ports {'924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed
Nov 23 10:03:02 np0005532585.localdomain dnsmasq[327068]: exiting on receipt of SIGTERM
Nov 23 10:03:02 np0005532585.localdomain podman[327085]: 2025-11-23 10:03:02.547144609 +0000 UTC m=+0.058023468 container kill f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:03:02 np0005532585.localdomain systemd[1]: libpod-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c.scope: Deactivated successfully.
Nov 23 10:03:02 np0005532585.localdomain podman[327101]: 2025-11-23 10:03:02.623839452 +0000 UTC m=+0.053915762 container died f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:03:02 np0005532585.localdomain podman[327101]: 2025-11-23 10:03:02.666274209 +0000 UTC m=+0.096350479 container remove f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:03:02 np0005532585.localdomain systemd[1]: libpod-conmon-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c.scope: Deactivated successfully.
Nov 23 10:03:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:02.696 263258 INFO neutron.agent.linux.ip_lib [None req-8e2a35fc-7046-46e3-b20c-8ebe7ccbce4a - - - - - -] Device tapf178a34a-d3 cannot be used as it has no MAC address
Nov 23 10:03:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:02.716 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:02 np0005532585.localdomain kernel: device tapf178a34a-d3 entered promiscuous mode
Nov 23 10:03:02 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892182.7234] manager: (tapf178a34a-d3): new Generic device (/org/freedesktop/NetworkManager/Devices/66)
Nov 23 10:03:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:02.724 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:02 np0005532585.localdomain systemd-udevd[327136]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:02Z|00399|binding|INFO|Claiming lport f178a34a-d388-4d80-b907-596688f93fe4 for this chassis.
Nov 23 10:03:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:02Z|00400|binding|INFO|f178a34a-d388-4d80-b907-596688f93fe4: Claiming unknown
Nov 23 10:03:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:02Z|00401|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:02.738 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c36fb98c-d1bf-4015-90da-0f0d5f6a670b, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f178a34a-d388-4d80-b907-596688f93fe4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:02.739 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f178a34a-d388-4d80-b907-596688f93fe4 in datapath 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 bound to our chassis
Nov 23 10:03:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:02.740 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:02 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:02.741 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[18227b7b-e9bf-4e19-9dc9-47cabd27eabb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:02Z|00402|binding|INFO|Setting lport f178a34a-d388-4d80-b907-596688f93fe4 ovn-installed in OVS
Nov 23 10:03:02 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:02Z|00403|binding|INFO|Setting lport f178a34a-d388-4d80-b907-596688f93fe4 up in Southbound
Nov 23 10:03:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:02.808 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:02.814 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:02.824 2 INFO neutron.agent.securitygroups_rpc [None req-1875a888-d52f-4402-8541-5b1512187260 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['c4aad9b2-b8cd-4803-b28f-3e773406a427']
Nov 23 10:03:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:02.842 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e139 e139: 6 total, 6 up, 6 in
Nov 23 10:03:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-a9d4bc8fd932f615977866e4cf411b544a5d942d899a713116fb40691af4610c-merged.mount: Deactivated successfully.
Nov 23 10:03:03 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:03 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:03.493 2 INFO neutron.agent.securitygroups_rpc [None req-d59d1e53-d1c4-451d-80ab-ba4648fa7a20 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['cb577a71-e41d-409b-b673-d883cbdab535']
Nov 23 10:03:03 np0005532585.localdomain podman[327198]: 
Nov 23 10:03:03 np0005532585.localdomain podman[327198]: 2025-11-23 10:03:03.652758698 +0000 UTC m=+0.092186521 container create 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:03 np0005532585.localdomain systemd[1]: Started libpod-conmon-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a.scope.
Nov 23 10:03:03 np0005532585.localdomain podman[327198]: 2025-11-23 10:03:03.606930366 +0000 UTC m=+0.046358229 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:03 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:03 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f1a9a6b5db597e2d43b68f915d07dc93135e7537aca6fb565346ec09a7ab2f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:03 np0005532585.localdomain podman[327198]: 2025-11-23 10:03:03.719013339 +0000 UTC m=+0.158441142 container init 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:03:03 np0005532585.localdomain podman[327198]: 2025-11-23 10:03:03.724170818 +0000 UTC m=+0.163598631 container start 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:03:03 np0005532585.localdomain dnsmasq[327226]: started, version 2.85 cachesize 150
Nov 23 10:03:03 np0005532585.localdomain dnsmasq[327226]: DNS service limited to local subnets
Nov 23 10:03:03 np0005532585.localdomain dnsmasq[327226]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:03 np0005532585.localdomain dnsmasq[327226]: warning: no upstream servers configured
Nov 23 10:03:03 np0005532585.localdomain dnsmasq-dhcp[327226]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:03:03 np0005532585.localdomain dnsmasq[327226]: read /var/lib/neutron/dhcp/83a5d156-32a8-4c4d-8a44-164aa1c3b9d5/addn_hosts - 0 addresses
Nov 23 10:03:03 np0005532585.localdomain dnsmasq-dhcp[327226]: read /var/lib/neutron/dhcp/83a5d156-32a8-4c4d-8a44-164aa1c3b9d5/host
Nov 23 10:03:03 np0005532585.localdomain dnsmasq-dhcp[327226]: read /var/lib/neutron/dhcp/83a5d156-32a8-4c4d-8a44-164aa1c3b9d5/opts
Nov 23 10:03:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:03.815 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 2eb60cfd-51cf-472d-b9e9-d6a48f126e7a with type ""
Nov 23 10:03:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:03Z|00404|binding|INFO|Removing iface tapf178a34a-d3 ovn-installed in OVS
Nov 23 10:03:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:03Z|00405|binding|INFO|Removing lport f178a34a-d388-4d80-b907-596688f93fe4 ovn-installed in OVS
Nov 23 10:03:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:03.817 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c36fb98c-d1bf-4015-90da-0f0d5f6a670b, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f178a34a-d388-4d80-b907-596688f93fe4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:03.819 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f178a34a-d388-4d80-b907-596688f93fe4 in datapath 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 unbound from our chassis
Nov 23 10:03:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:03.820 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:03.822 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:03 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:03.823 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c14796ea-d423-46db-9b09-13869acaba6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:03 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:03.950 263258 INFO neutron.agent.dhcp.agent [None req-e9b077a5-7afd-47d8-b5f5-92d5dc34fa91 - - - - - -] DHCP configuration for ports {'eb1ed841-526e-4097-b1c7-63ca46f0e711'} is completed
Nov 23 10:03:03 np0005532585.localdomain ceph-mon[300199]: osdmap e139: 6 total, 6 up, 6 in
Nov 23 10:03:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3154268914' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3154268914' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:03 np0005532585.localdomain ceph-mon[300199]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 817 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 2.8 KiB/s wr, 60 op/s
Nov 23 10:03:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e140 e140: 6 total, 6 up, 6 in
Nov 23 10:03:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:04Z|00406|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.181 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain systemd[1]: tmp-crun.eabhyS.mount: Deactivated successfully.
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327226]: exiting on receipt of SIGTERM
Nov 23 10:03:04 np0005532585.localdomain podman[327268]: 2025-11-23 10:03:04.184441447 +0000 UTC m=+0.094282596 container kill 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:03:04 np0005532585.localdomain systemd[1]: libpod-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a.scope: Deactivated successfully.
Nov 23 10:03:04 np0005532585.localdomain podman[327284]: 
Nov 23 10:03:04 np0005532585.localdomain podman[327284]: 2025-11-23 10:03:04.240136943 +0000 UTC m=+0.104928214 container create 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:03:04 np0005532585.localdomain podman[327298]: 2025-11-23 10:03:04.267766794 +0000 UTC m=+0.058228995 container died 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:03:04 np0005532585.localdomain systemd[1]: Started libpod-conmon-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2.scope.
Nov 23 10:03:04 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:04 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55cdf0f78a503799b5700a918cf48dfcff386165f14819e636e88511fba01e53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:04 np0005532585.localdomain podman[327284]: 2025-11-23 10:03:04.209750947 +0000 UTC m=+0.074542258 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:04 np0005532585.localdomain podman[327298]: 2025-11-23 10:03:04.311537922 +0000 UTC m=+0.102000063 container remove 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:03:04 np0005532585.localdomain systemd[1]: libpod-conmon-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a.scope: Deactivated successfully.
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain kernel: device tapf178a34a-d3 left promiscuous mode
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.347 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain podman[327284]: 2025-11-23 10:03:04.353289858 +0000 UTC m=+0.218081159 container init 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:04 np0005532585.localdomain podman[327284]: 2025-11-23 10:03:04.361860732 +0000 UTC m=+0.226652033 container start 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:03:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.363 263258 INFO neutron.agent.dhcp.agent [None req-9cc8ecab-e579-4f06-a478-17f4a501dcc1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.364 263258 INFO neutron.agent.dhcp.agent [None req-9cc8ecab-e579-4f06-a478-17f4a501dcc1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327332]: started, version 2.85 cachesize 150
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327332]: DNS service limited to local subnets
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327332]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327332]: warning: no upstream servers configured
Nov 23 10:03:04 np0005532585.localdomain dnsmasq-dhcp[327332]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:03:04 np0005532585.localdomain dnsmasq-dhcp[327332]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 10:03:04 np0005532585.localdomain dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:04 np0005532585.localdomain dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.431 263258 INFO neutron.agent.dhcp.agent [None req-2fd858b7-7255-49f3-9187-bc2e5925323e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c942ffd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c94269d0>], id=2464e9e2-44d8-4196-918c-708e8d6859f0, ip_allocation=immediate, mac_address=fa:16:3e:c6:53:18, name=tempest-PortsIpV6TestJSON-426008471, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:03Z, description=, dns_domain=, id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-274315444, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8521, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['4e9cd968-5f62-4975-b489-a919561db40e', 'd6862539-34a2-478e-9e57-9c8909b1626a'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:03:00Z, vlan_transparent=None, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cb577a71-e41d-409b-b673-d883cbdab535'], standard_attr_id=2145, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:03:03Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a
Nov 23 10:03:04 np0005532585.localdomain dnsmasq[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses
Nov 23 10:03:04 np0005532585.localdomain dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:04 np0005532585.localdomain dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:04 np0005532585.localdomain podman[327351]: 2025-11-23 10:03:04.618037224 +0000 UTC m=+0.056686158 container kill 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.788 263258 INFO neutron.agent.dhcp.agent [None req-6b0254d4-fb3c-43dd-b68a-b1edb6646469 - - - - - -] DHCP configuration for ports {'924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed
Nov 23 10:03:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.801 263258 INFO neutron.agent.linux.ip_lib [None req-c99f2a0f-b28f-4d20-970a-5732e585295c - - - - - -] Device tap061fe319-39 cannot be used as it has no MAC address
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.826 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain kernel: device tap061fe319-39 entered promiscuous mode
Nov 23 10:03:04 np0005532585.localdomain systemd-udevd[327138]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:04 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892184.8344] manager: (tap061fe319-39): new Generic device (/org/freedesktop/NetworkManager/Devices/67)
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:04Z|00407|binding|INFO|Claiming lport 061fe319-398b-4b80-ab34-399e616693fc for this chassis.
Nov 23 10:03:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:04Z|00408|binding|INFO|061fe319-398b-4b80-ab34-399e616693fc: Claiming unknown
Nov 23 10:03:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:04.852 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2286c65-ff8f-4cd6-91b0-87853206477a, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=061fe319-398b-4b80-ab34-399e616693fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:04.855 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 061fe319-398b-4b80-ab34-399e616693fc in datapath 6e10a0ab-0466-4df2-91a1-22e4b25912c9 bound to our chassis
Nov 23 10:03:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:04.858 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6e10a0ab-0466-4df2-91a1-22e4b25912c9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:04.859 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e3051371-e50a-45b2-933c-68dd00dd46d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:04Z|00409|binding|INFO|Setting lport 061fe319-398b-4b80-ab34-399e616693fc ovn-installed in OVS
Nov 23 10:03:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:04Z|00410|binding|INFO|Setting lport 061fe319-398b-4b80-ab34-399e616693fc up in Southbound
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.877 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap061fe319-39: No such device
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.923 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:04.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.956 263258 INFO neutron.agent.dhcp.agent [None req-610ffc88-92d6-4276-9494-fdc9efff803c - - - - - -] DHCP configuration for ports {'2464e9e2-44d8-4196-918c-708e8d6859f0'} is completed
Nov 23 10:03:04 np0005532585.localdomain ceph-mon[300199]: osdmap e140: 6 total, 6 up, 6 in
Nov 23 10:03:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6f1a9a6b5db597e2d43b68f915d07dc93135e7537aca6fb565346ec09a7ab2f4-merged.mount: Deactivated successfully.
Nov 23 10:03:05 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:05 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d83a5d156\x2d32a8\x2d4c4d\x2d8a44\x2d164aa1c3b9d5.mount: Deactivated successfully.
Nov 23 10:03:05 np0005532585.localdomain podman[327448]: 
Nov 23 10:03:05 np0005532585.localdomain podman[327448]: 2025-11-23 10:03:05.816672229 +0000 UTC m=+0.075371203 container create 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:03:05 np0005532585.localdomain systemd[1]: Started libpod-conmon-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543.scope.
Nov 23 10:03:05 np0005532585.localdomain systemd[1]: tmp-crun.ZayxQd.mount: Deactivated successfully.
Nov 23 10:03:05 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:05 np0005532585.localdomain podman[327448]: 2025-11-23 10:03:05.77616448 +0000 UTC m=+0.034863524 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:05 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba971e870817c0c1c87f8a7ba62f53f6dd751eec21c91b2626ace24c9cd01064/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:05 np0005532585.localdomain podman[327448]: 2025-11-23 10:03:05.886589953 +0000 UTC m=+0.145288937 container init 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:05 np0005532585.localdomain podman[327448]: 2025-11-23 10:03:05.895437625 +0000 UTC m=+0.154136569 container start 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:03:05 np0005532585.localdomain dnsmasq[327466]: started, version 2.85 cachesize 150
Nov 23 10:03:05 np0005532585.localdomain dnsmasq[327466]: DNS service limited to local subnets
Nov 23 10:03:05 np0005532585.localdomain dnsmasq[327466]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:05 np0005532585.localdomain dnsmasq[327466]: warning: no upstream servers configured
Nov 23 10:03:05 np0005532585.localdomain dnsmasq-dhcp[327466]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:05 np0005532585.localdomain dnsmasq[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/addn_hosts - 0 addresses
Nov 23 10:03:05 np0005532585.localdomain dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/host
Nov 23 10:03:05 np0005532585.localdomain dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/opts
Nov 23 10:03:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/792766910' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/792766910' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:06 np0005532585.localdomain ceph-mon[300199]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 3.5 KiB/s wr, 101 op/s
Nov 23 10:03:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:06.062 263258 INFO neutron.agent.dhcp.agent [None req-a1ab9fb7-8fbc-4132-8373-86c8c13f656e - - - - - -] DHCP configuration for ports {'945b3511-28ea-44ed-9bdc-8df44b5d381d'} is completed
Nov 23 10:03:06 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:06.167 263258 INFO neutron.agent.linux.ip_lib [None req-51664d74-6d4e-4769-b7da-c8cf858cc7a5 - - - - - -] Device tapb877a41e-a1 cannot be used as it has no MAC address
Nov 23 10:03:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:06.188 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:06 np0005532585.localdomain kernel: device tapb877a41e-a1 entered promiscuous mode
Nov 23 10:03:06 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892186.1946] manager: (tapb877a41e-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/68)
Nov 23 10:03:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:06.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:06Z|00411|binding|INFO|Claiming lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 for this chassis.
Nov 23 10:03:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:06Z|00412|binding|INFO|b877a41e-a15e-4b64-bd71-6cac8fa9c439: Claiming unknown
Nov 23 10:03:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:06.210 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3ed04a-a453-47aa-85a6-639016eec1c2, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b877a41e-a15e-4b64-bd71-6cac8fa9c439) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:06.213 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b877a41e-a15e-4b64-bd71-6cac8fa9c439 in datapath 8f0b7507-4636-45b3-a4b2-9fdd55176a18 bound to our chassis
Nov 23 10:03:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:06.214 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f0b7507-4636-45b3-a4b2-9fdd55176a18 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:06 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:06.215 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d40215-2eca-4c21-adc2-4601ea229b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:06Z|00413|binding|INFO|Setting lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 ovn-installed in OVS
Nov 23 10:03:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:06Z|00414|binding|INFO|Setting lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 up in Southbound
Nov 23 10:03:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:06.243 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:06 np0005532585.localdomain dnsmasq[327332]: exiting on receipt of SIGTERM
Nov 23 10:03:06 np0005532585.localdomain podman[327491]: 2025-11-23 10:03:06.250914596 +0000 UTC m=+0.075633831 container kill 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:03:06 np0005532585.localdomain systemd[1]: libpod-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2.scope: Deactivated successfully.
Nov 23 10:03:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:06.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:06.320 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:06 np0005532585.localdomain podman[327514]: 2025-11-23 10:03:06.32832651 +0000 UTC m=+0.057148001 container died 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:03:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:03:06 np0005532585.localdomain podman[327514]: 2025-11-23 10:03:06.399500573 +0000 UTC m=+0.128322004 container remove 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:03:06 np0005532585.localdomain podman[327540]: 2025-11-23 10:03:06.455979253 +0000 UTC m=+0.090386106 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:03:06 np0005532585.localdomain systemd[1]: libpod-conmon-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2.scope: Deactivated successfully.
Nov 23 10:03:06 np0005532585.localdomain podman[327542]: 2025-11-23 10:03:06.435050978 +0000 UTC m=+0.065543340 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3)
Nov 23 10:03:06 np0005532585.localdomain podman[327542]: 2025-11-23 10:03:06.518270941 +0000 UTC m=+0.148763303 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:03:06 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:03:06 np0005532585.localdomain podman[327540]: 2025-11-23 10:03:06.539707272 +0000 UTC m=+0.174114155 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:03:06 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:03:06 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:06.640 2 INFO neutron.agent.securitygroups_rpc [None req-3cc8ae58-6898-4e12-983b-8f9645bd7e63 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['a9134bcb-5194-43f8-ac1f-875c59af23f5', '736c2f34-1be4-42fe-9283-c00aaa4f421b', 'cb577a71-e41d-409b-b673-d883cbdab535']
Nov 23 10:03:06 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:06.755 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-55cdf0f78a503799b5700a918cf48dfcff386165f14819e636e88511fba01e53-merged.mount: Deactivated successfully.
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:07Z|00415|binding|INFO|Releasing lport 061fe319-398b-4b80-ab34-399e616693fc from this chassis (sb_readonly=0)
Nov 23 10:03:07 np0005532585.localdomain kernel: device tap061fe319-39 left promiscuous mode
Nov 23 10:03:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:07.137 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:07Z|00416|binding|INFO|Setting lport 061fe319-398b-4b80-ab34-399e616693fc down in Southbound
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.146 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fc6931e9-753f-4431-a2fb-7a1d4a7833f7 with type ""
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.148 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2286c65-ff8f-4cd6-91b0-87853206477a, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=061fe319-398b-4b80-ab34-399e616693fc) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.150 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 061fe319-398b-4b80-ab34-399e616693fc in datapath 6e10a0ab-0466-4df2-91a1-22e4b25912c9 unbound from our chassis
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.153 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6e10a0ab-0466-4df2-91a1-22e4b25912c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.154 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[04afa2ed-8717-48eb-8ba0-c1958230ae5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:07.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain podman[327653]: 
Nov 23 10:03:07 np0005532585.localdomain podman[327653]: 2025-11-23 10:03:07.280850553 +0000 UTC m=+0.096622897 container create 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: Started libpod-conmon-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0.scope.
Nov 23 10:03:07 np0005532585.localdomain podman[327653]: 2025-11-23 10:03:07.23401199 +0000 UTC m=+0.049784374 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fcce9fbacd628dae8de270527f13d43879fe58249a6413d8bf69abd83505308/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:07 np0005532585.localdomain podman[327653]: 2025-11-23 10:03:07.363946713 +0000 UTC m=+0.179719057 container init 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:03:07 np0005532585.localdomain podman[327653]: 2025-11-23 10:03:07.369602847 +0000 UTC m=+0.185375191 container start 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327677]: started, version 2.85 cachesize 150
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327677]: DNS service limited to local subnets
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327677]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327677]: warning: no upstream servers configured
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327677]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327677]: read /var/lib/neutron/dhcp/8f0b7507-4636-45b3-a4b2-9fdd55176a18/addn_hosts - 0 addresses
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327677]: read /var/lib/neutron/dhcp/8f0b7507-4636-45b3-a4b2-9fdd55176a18/host
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327677]: read /var/lib/neutron/dhcp/8f0b7507-4636-45b3-a4b2-9fdd55176a18/opts
Nov 23 10:03:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e141 e141: 6 total, 6 up, 6 in
Nov 23 10:03:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.496 263258 INFO neutron.agent.dhcp.agent [None req-43c6540c-db3e-441f-8f3d-4ab6973ba6cc - - - - - -] DHCP configuration for ports {'96a5e037-97e8-4711-9d19-9b28623d3884'} is completed
Nov 23 10:03:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:07Z|00417|binding|INFO|Removing iface tapb877a41e-a1 ovn-installed in OVS
Nov 23 10:03:07 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:07Z|00418|binding|INFO|Removing lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 ovn-installed in OVS
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.542 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 81f96baf-17cd-4ec7-94c4-dd9a0166c427 with type ""
Nov 23 10:03:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:07.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.544 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3ed04a-a453-47aa-85a6-639016eec1c2, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b877a41e-a15e-4b64-bd71-6cac8fa9c439) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.546 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b877a41e-a15e-4b64-bd71-6cac8fa9c439 in datapath 8f0b7507-4636-45b3-a4b2-9fdd55176a18 unbound from our chassis
Nov 23 10:03:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:07.547 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.548 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f0b7507-4636-45b3-a4b2-9fdd55176a18 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:07.549 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5218a3-a844-4f99-8b92-75f06e563fdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:07 np0005532585.localdomain podman[327714]: 
Nov 23 10:03:07 np0005532585.localdomain podman[327714]: 2025-11-23 10:03:07.698388165 +0000 UTC m=+0.092389357 container create 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:03:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:07.724 2 INFO neutron.agent.securitygroups_rpc [None req-63da6581-da02-4188-8e89-eab01a812a16 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['736c2f34-1be4-42fe-9283-c00aaa4f421b', 'a9134bcb-5194-43f8-ac1f-875c59af23f5']
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: Started libpod-conmon-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688.scope.
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:07 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cf829e7df822dea19af929cb8462e294ad3327afd2b8aab63876759ece9d62b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:07 np0005532585.localdomain podman[327714]: 2025-11-23 10:03:07.655285468 +0000 UTC m=+0.049286670 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327677]: exiting on receipt of SIGTERM
Nov 23 10:03:07 np0005532585.localdomain podman[327725]: 2025-11-23 10:03:07.764509403 +0000 UTC m=+0.117895293 container kill 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: libpod-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0.scope: Deactivated successfully.
Nov 23 10:03:07 np0005532585.localdomain podman[327714]: 2025-11-23 10:03:07.808585191 +0000 UTC m=+0.202586383 container init 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:07 np0005532585.localdomain podman[327714]: 2025-11-23 10:03:07.819705633 +0000 UTC m=+0.213706825 container start 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327769]: started, version 2.85 cachesize 150
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327769]: DNS service limited to local subnets
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327769]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327769]: warning: no upstream servers configured
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327769]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327769]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327769]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:03:07 np0005532585.localdomain dnsmasq[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:07 np0005532585.localdomain dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:07 np0005532585.localdomain podman[327748]: 2025-11-23 10:03:07.858118206 +0000 UTC m=+0.070623787 container died 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:03:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.888 263258 INFO neutron.agent.dhcp.agent [None req-8688bc5c-3dea-4cc6-99fd-8cc9764024f1 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ba9fa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ba9130>], id=2464e9e2-44d8-4196-918c-708e8d6859f0, ip_allocation=immediate, mac_address=fa:16:3e:c6:53:18, name=tempest-PortsIpV6TestJSON-194744152, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['736c2f34-1be4-42fe-9283-c00aaa4f421b', 'a9134bcb-5194-43f8-ac1f-875c59af23f5'], standard_attr_id=2145, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:03:06Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a
Nov 23 10:03:07 np0005532585.localdomain podman[327748]: 2025-11-23 10:03:07.900617505 +0000 UTC m=+0.113123046 container remove 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:03:07 np0005532585.localdomain systemd[1]: libpod-conmon-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0.scope: Deactivated successfully.
Nov 23 10:03:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:07.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain kernel: device tapb877a41e-a1 left promiscuous mode
Nov 23 10:03:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:07.968 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.981 263258 INFO neutron.agent.dhcp.agent [None req-69d033d8-169e-45e6-ac25-65fe4ac66d6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.982 263258 INFO neutron.agent.dhcp.agent [None req-69d033d8-169e-45e6-ac25-65fe4ac66d6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.015 263258 INFO neutron.agent.dhcp.agent [None req-1b1b37ba-85ae-4a3b-87a2-55c9bd34e64d - - - - - -] DHCP configuration for ports {'2464e9e2-44d8-4196-918c-708e8d6859f0', '924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed
Nov 23 10:03:08 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:08Z|00419|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:08 np0005532585.localdomain dnsmasq[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses
Nov 23 10:03:08 np0005532585.localdomain dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:08 np0005532585.localdomain dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:08 np0005532585.localdomain podman[327790]: 2025-11-23 10:03:08.050009458 +0000 UTC m=+0.059187455 container kill 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 23 10:03:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-2fcce9fbacd628dae8de270527f13d43879fe58249a6413d8bf69abd83505308-merged.mount: Deactivated successfully.
Nov 23 10:03:08 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:08 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d8f0b7507\x2d4636\x2d45b3\x2da4b2\x2d9fdd55176a18.mount: Deactivated successfully.
Nov 23 10:03:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:08.083 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:08 np0005532585.localdomain dnsmasq[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/addn_hosts - 0 addresses
Nov 23 10:03:08 np0005532585.localdomain dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/host
Nov 23 10:03:08 np0005532585.localdomain dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/opts
Nov 23 10:03:08 np0005532585.localdomain podman[327821]: 2025-11-23 10:03:08.212693379 +0000 UTC m=+0.067059837 container kill 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:03:08 np0005532585.localdomain systemd[1]: tmp-crun.fS6y5m.mount: Deactivated successfully.
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent [None req-79cd50a9-85d4-4b74-9e50-a1a32448a522 - - - - - -] Unable to reload_allocations dhcp for 6e10a0ab-0466-4df2-91a1-22e4b25912c9.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap061fe319-39 not found in namespace qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9.
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap061fe319-39 not found in namespace qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9.
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent 
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.343 263258 INFO neutron.agent.dhcp.agent [None req-2e37a1d0-27d1-4e9b-91ba-b35a2c6cf695 - - - - - -] DHCP configuration for ports {'2464e9e2-44d8-4196-918c-708e8d6859f0'} is completed
Nov 23 10:03:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:08 np0005532585.localdomain dnsmasq[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 10:03:08 np0005532585.localdomain dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:08 np0005532585.localdomain dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:08 np0005532585.localdomain podman[327856]: 2025-11-23 10:03:08.418275922 +0000 UTC m=+0.059622348 container kill 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:08 np0005532585.localdomain ceph-mon[300199]: osdmap e141: 6 total, 6 up, 6 in
Nov 23 10:03:08 np0005532585.localdomain ceph-mon[300199]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 3.5 KiB/s wr, 101 op/s
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.579 263258 INFO neutron.agent.dhcp.agent [None req-54274fa7-429b-40de-9645-e5a1e1ac12d0 - - - - - -] Synchronizing state
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.774 263258 INFO neutron.agent.dhcp.agent [None req-0bff647a-37c2-4421-8ce0-5d475f777c74 - - - - - -] All active networks have been fetched through RPC.
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.775 263258 INFO neutron.agent.dhcp.agent [-] Starting network 6e10a0ab-0466-4df2-91a1-22e4b25912c9 dhcp configuration
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.777 263258 INFO neutron.agent.dhcp.agent [-] Finished network 6e10a0ab-0466-4df2-91a1-22e4b25912c9 dhcp configuration
Nov 23 10:03:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.777 263258 INFO neutron.agent.dhcp.agent [None req-0bff647a-37c2-4421-8ce0-5d475f777c74 - - - - - -] Synchronizing state complete
Nov 23 10:03:09 np0005532585.localdomain dnsmasq[327466]: exiting on receipt of SIGTERM
Nov 23 10:03:09 np0005532585.localdomain podman[327893]: 2025-11-23 10:03:09.013361414 +0000 UTC m=+0.058415401 container kill 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:09 np0005532585.localdomain systemd[1]: libpod-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543.scope: Deactivated successfully.
Nov 23 10:03:09 np0005532585.localdomain podman[327907]: 2025-11-23 10:03:09.083541506 +0000 UTC m=+0.053822160 container died 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:09 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:09 np0005532585.localdomain podman[327907]: 2025-11-23 10:03:09.113838689 +0000 UTC m=+0.084119303 container cleanup 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:03:09 np0005532585.localdomain systemd[1]: libpod-conmon-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543.scope: Deactivated successfully.
Nov 23 10:03:09 np0005532585.localdomain podman[327908]: 2025-11-23 10:03:09.165062627 +0000 UTC m=+0.130557053 container remove 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:03:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:09.301 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:03:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:09.302 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:03:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:03:09 np0005532585.localdomain dnsmasq[327769]: exiting on receipt of SIGTERM
Nov 23 10:03:09 np0005532585.localdomain podman[327954]: 2025-11-23 10:03:09.462047946 +0000 UTC m=+0.060896527 container kill 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:03:09 np0005532585.localdomain systemd[1]: libpod-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688.scope: Deactivated successfully.
Nov 23 10:03:09 np0005532585.localdomain podman[327966]: 2025-11-23 10:03:09.525775879 +0000 UTC m=+0.053101287 container died 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:03:09 np0005532585.localdomain podman[327966]: 2025-11-23 10:03:09.556302299 +0000 UTC m=+0.083627657 container cleanup 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:09 np0005532585.localdomain systemd[1]: libpod-conmon-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688.scope: Deactivated successfully.
Nov 23 10:03:09 np0005532585.localdomain podman[327973]: 2025-11-23 10:03:09.580095642 +0000 UTC m=+0.094583425 container remove 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:03:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:09.719 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3cf829e7df822dea19af929cb8462e294ad3327afd2b8aab63876759ece9d62b-merged.mount: Deactivated successfully.
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ba971e870817c0c1c87f8a7ba62f53f6dd751eec21c91b2626ace24c9cd01064-merged.mount: Deactivated successfully.
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d6e10a0ab\x2d0466\x2d4df2\x2d91a1\x2d22e4b25912c9.mount: Deactivated successfully.
Nov 23 10:03:10 np0005532585.localdomain podman[328046]: 
Nov 23 10:03:10 np0005532585.localdomain ceph-mon[300199]: pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 5.9 KiB/s wr, 155 op/s
Nov 23 10:03:10 np0005532585.localdomain podman[328046]: 2025-11-23 10:03:10.47014584 +0000 UTC m=+0.084753532 container create 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9.scope.
Nov 23 10:03:10 np0005532585.localdomain podman[328046]: 2025-11-23 10:03:10.430715446 +0000 UTC m=+0.045323188 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: tmp-crun.OYJzP2.mount: Deactivated successfully.
Nov 23 10:03:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ba24ef5a8302e2b3c20cfab428fcbe8c485862935b4b78879171770c8fc9f10/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:10 np0005532585.localdomain podman[328046]: 2025-11-23 10:03:10.562482705 +0000 UTC m=+0.177090417 container init 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:03:10 np0005532585.localdomain podman[328046]: 2025-11-23 10:03:10.573307987 +0000 UTC m=+0.187915679 container start 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:10 np0005532585.localdomain dnsmasq[328064]: started, version 2.85 cachesize 150
Nov 23 10:03:10 np0005532585.localdomain dnsmasq[328064]: DNS service limited to local subnets
Nov 23 10:03:10 np0005532585.localdomain dnsmasq[328064]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:10 np0005532585.localdomain dnsmasq[328064]: warning: no upstream servers configured
Nov 23 10:03:10 np0005532585.localdomain dnsmasq-dhcp[328064]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Nov 23 10:03:10 np0005532585.localdomain dnsmasq-dhcp[328064]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 10:03:10 np0005532585.localdomain dnsmasq[328064]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 10:03:10 np0005532585.localdomain dnsmasq-dhcp[328064]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 10:03:10 np0005532585.localdomain dnsmasq-dhcp[328064]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.808 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.809 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.814 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0e3e868-e2f4-4395-addc-ef5cd04b7c7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.810037', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e0fac2e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '0335b02f310bfcdf7fb13700cf0a116f65787ddc94335ef2946d73bc7307270e'}]}, 'timestamp': '2025-11-23 10:03:10.815664', '_unique_id': 'a73f099a69ff4389b375e8caaf135c21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.818 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fec2fbc-75c6-470a-bc9f-066308d0d288', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.818679', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1039e6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'fe75d2e566ddf4e4919a2a32930ffceacbf3c4d040ab2291dcbc3d5b6aab0c7c'}]}, 'timestamp': '2025-11-23 10:03:10.819270', '_unique_id': 'af46d71b12f04db29b05d37a0edca048'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.821 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a98328c3-ddea-4a4f-9a62-82d99387bcb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.821635', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e10b024-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '6d1bd2f83efed2d6bc6a43d7fab8d24a5cb4283c8905ee2c5825223989be1418'}]}, 'timestamp': '2025-11-23 10:03:10.822339', '_unique_id': '4227c147df6942118c616ecd1941cfc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.837 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '155ed4a9-b6fe-46d3-a592-7b700e7c46f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.824988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e132c64-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': '16ae143cd5222606699319d2a4b38d098598cf0a6262becbdd2b44c6fe9d758f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.824988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1340d2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': 'c3548e4d3449b56c6c5e370738e57d37cc528009c54af3765d2b7ff65842a0c3'}]}, 'timestamp': '2025-11-23 10:03:10.839077', '_unique_id': 'd336a5606db146b79912be48ac751e21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.855 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 17230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8d1c971-59c4-4953-b494-93f271501b02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17230000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:03:10.841688', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9e15d82e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.03303766, 'message_signature': '31ec03992c52a90895bb912d367e9267d6adf5cc8539e6fb8ad50b68a7ba1d34'}]}, 'timestamp': '2025-11-23 10:03:10.856080', '_unique_id': '24730bf0199c4960b30d87fab01278ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.858 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.858 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13a3d47e-50eb-4977-942d-05a7f131292c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.858527', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1653f8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'c2224f4c52ca879f47a5acf705ecfeafa93e7078b8c75a2797a837dfed075367'}]}, 'timestamp': '2025-11-23 10:03:10.859204', '_unique_id': 'bcd61683894845b5a9ca7014f084f314'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.861 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b99e09a5-ab29-40ca-b34d-dc1bfc27f4b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.861702', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e16cc5c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '8f5fbb2a8a1c51ca60d7c7415844da67bc851d8fe5aded0eccdff06a888d90b3'}]}, 'timestamp': '2025-11-23 10:03:10.862281', '_unique_id': '74084e462d4c448aa08436e624c6b149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d32774b-4e80-4285-8d02-b4f32f5e94a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.864587', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e173e80-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'de3ce55ecc478b07fabdd93662127e3d2042338cc9fe17fe9dc9777c291b8a93'}]}, 'timestamp': '2025-11-23 10:03:10.865220', '_unique_id': 'd5682f46d5984aa9b6989c8ae62e1a80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38dc6279-905e-42b9-841d-5b0e3f78fbd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.867519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1b9c64-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'f79adf0318f1ae087e1dec08f32ef44f5e37e8ce08c53c6abeb9363ee36d7a45'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.867519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1bb00a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '8f5015d2f818ae65a5024908fb86d81a0ee8471835cf5baaa8c08ed84cab9421'}]}, 'timestamp': '2025-11-23 10:03:10.894296', '_unique_id': '9809d682e77740eba93a34b8123cb8a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0bb7a02-9a1c-4bcc-93d7-f5295218a85e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.897201', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1c3336-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '68764f4421e94c394f727a6340be7e9109ea814b78bd5e3f662536d8a55778fb'}]}, 'timestamp': '2025-11-23 10:03:10.897676', '_unique_id': '0b2c9551c7064438951a4839e1963070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8588c11-7131-42c6-8d85-4eb2c4f3f649', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.899832', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1c9b46-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': '09f5735f5a39fbb5beb1818e509d9f7165cae33d94f8c2a69cb5ca892043865c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.899832', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1cab9a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': 'd7c56293bdf7a6ecc32de31ccfcb384b26443e7c47ccc5629c22ff7ce580a658'}]}, 'timestamp': '2025-11-23 10:03:10.900755', '_unique_id': 'c59342cadb7744e5893ac8a6827d3060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f78c272e-a4e3-4c02-8680-69396dba8631', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.902986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1d14c2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '6c051522517119bbefc94e369f7e06e0ac1ed3d4eb9f4de15e9d735d7dabacaf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.902986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1d24da-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'c3ed51fdef33b26a958cf77bdf45b4c5ef669a1b3f2c9c6dad7aedfe2a802b20'}]}, 'timestamp': '2025-11-23 10:03:10.903829', '_unique_id': '7ab8107b682849978f9384aecb8adee0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae1e8ca7-b27a-4a86-8351-b4b72178b5f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.906419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1d9adc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'ed2b899cad525eccc4fe90b9deaeb4a88acf0e25f24f2d95a5a89d7b7cb6997a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.906419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1dacac-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '80c0fcff1345accb9b291fd07f97521047963e83175eba79ea06c7b99843882d'}]}, 'timestamp': '2025-11-23 10:03:10.907307', '_unique_id': '3017044fb6424ded9b841968858aab4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5160dda2-c08c-4b81-ae81-d48a3cb36302', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.909642', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1e189a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'f76dd5340a3132a4cc840dc996b39adadcff92960c3c401939185450614a98a0'}]}, 'timestamp': '2025-11-23 10:03:10.910144', '_unique_id': '829541342d574ef7ae2fa1c819aa94f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b61c6cf-3d1f-4419-9dca-8f5618a68875', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.912305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1e8122-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': 'eb636a3863bfff4c54df122350b58ee6e0bc917abffb7a9c36dff6d8571e76af'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.912305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1e9400-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': '35870cfed0939f8a3c66c77e686e11c1fa68a5946c86853217bd4952866a4c54'}]}, 'timestamp': '2025-11-23 10:03:10.913231', '_unique_id': '901d85a0f1d74179a8d7594ac0323878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd25ee790-4f32-4369-9d69-6db82930f72e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.915427', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1efad0-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '74d4ea274c458856377c887369cb55bd06cc27be79d793e84818a5971444a2c9'}]}, 'timestamp': '2025-11-23 10:03:10.915921', '_unique_id': '0befcbbfb24548ed9cc4ac30b74204d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85be9042-f65c-4e9f-8606-4958c7133d2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:03:10.918228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9e1f6830-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.03303766, 'message_signature': 'b215422ac38a4b983734c6d3c48042d57a2137de35bfef95bc66bf76151af8c6'}]}, 'timestamp': '2025-11-23 10:03:10.918672', '_unique_id': '6279c0b69b9a4e7e9058406b40198e56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9606e24-6251-4778-bc3e-f7edfa05ef9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.920869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1fd0d6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '18a9f388942341ab7f653183b0896b749e7e27f7a4945881aae9c0c854e48d6d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.920869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1fe15c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '80bf3b7d5c0dcfa1a0e315f756e3a17489fadf056db425d8a31a3173bdb1f0f5'}]}, 'timestamp': '2025-11-23 10:03:10.921763', '_unique_id': 'd6120e5bb07949bc880164dd562c4a22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '283940b8-1d53-47a0-8b7e-6a280abf2f3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.924548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e20606e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'e4f945f9cd9858d433ab0ad7e0285ac2103ed25e402598471ed612259b42757d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.924548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e207310-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '890cd471dfdac864f537369fbd3eb6fb915b147f1a3011f6c4c310677f013ebc'}]}, 'timestamp': '2025-11-23 10:03:10.925497', '_unique_id': 'd8ec1eeceb68418d8732675db67ca790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '096a7b19-8455-44b4-9c6e-95b76600a2c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.927714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e20dbfc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'be6e5072d52e5523d02c00f1cd25352c711a2a3c38ed978f10bea3b7a32b514b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.927714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e20ecdc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '6d30d8842ca5f596da206f2c42ef984c42a8affd2c2ff626f65e9334aa5eb0c2'}]}, 'timestamp': '2025-11-23 10:03:10.928614', '_unique_id': '32e2ad4ec1ac4cb5ab7126c6f3280b34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd2a71df-e17e-42fd-a627-af76004d1dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.930506', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e2144b6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '32aafc37a153060eced10c2b3c0674d2d6e1300aab82d6cffcc90c04c81037a4'}]}, 'timestamp': '2025-11-23 10:03:10.930802', '_unique_id': '9f232413dc1c423eb554c9cc329b66e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:03:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:03:11 np0005532585.localdomain dnsmasq[328064]: exiting on receipt of SIGTERM
Nov 23 10:03:11 np0005532585.localdomain systemd[1]: tmp-crun.vbunDl.mount: Deactivated successfully.
Nov 23 10:03:11 np0005532585.localdomain podman[328082]: 2025-11-23 10:03:11.181583337 +0000 UTC m=+0.052220530 container kill 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:11 np0005532585.localdomain systemd[1]: libpod-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9.scope: Deactivated successfully.
Nov 23 10:03:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.230 263258 INFO neutron.agent.dhcp.agent [None req-020f4ddf-6b4e-44a2-b39b-a1ca5e7a7bec - - - - - -] DHCP configuration for ports {'924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed
Nov 23 10:03:11 np0005532585.localdomain podman[328095]: 2025-11-23 10:03:11.241554004 +0000 UTC m=+0.047691280 container died 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:03:11 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:11.252 2 INFO neutron.agent.securitygroups_rpc [None req-261eb8d2-9d50-4b0d-8acd-8f9cb046e671 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 10:03:11 np0005532585.localdomain podman[328095]: 2025-11-23 10:03:11.267296367 +0000 UTC m=+0.073410703 container cleanup 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:11 np0005532585.localdomain systemd[1]: libpod-conmon-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9.scope: Deactivated successfully.
Nov 23 10:03:11 np0005532585.localdomain podman[328097]: 2025-11-23 10:03:11.337166099 +0000 UTC m=+0.137042793 container remove 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:11.348 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:11 np0005532585.localdomain kernel: device tap924d747d-30 left promiscuous mode
Nov 23 10:03:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:11Z|00420|binding|INFO|Releasing lport 924d747d-3069-493d-890d-d22289f6cb63 from this chassis (sb_readonly=0)
Nov 23 10:03:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:11Z|00421|binding|INFO|Setting lport 924d747d-3069-493d-890d-d22289f6cb63 down in Southbound
Nov 23 10:03:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:11.361 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8:0:2::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c46eb8d9-f4c4-4e34-a4e0-3a70a48cb8ea, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=924d747d-3069-493d-890d-d22289f6cb63) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:11.362 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 924d747d-3069-493d-890d-d22289f6cb63 in datapath 0d2bb8b4-9b3e-41c7-b595-54664cfb433a unbound from our chassis
Nov 23 10:03:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:11.364 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:11 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:11.365 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5567c206-9958-4255-a311-92243606c033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:11.371 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:11.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.853 263258 INFO neutron.agent.dhcp.agent [None req-bbe7d4cd-ac9e-415c-9900-b81395b5eb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.854 263258 INFO neutron.agent.dhcp.agent [None req-bbe7d4cd-ac9e-415c-9900-b81395b5eb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.854 263258 INFO neutron.agent.dhcp.agent [None req-bbe7d4cd-ac9e-415c-9900-b81395b5eb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:03:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:03:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:03:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159325 "" "Go-http-client/1.1"
Nov 23 10:03:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.942 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:03:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20198 "" "Go-http-client/1.1"
Nov 23 10:03:11 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e142 e142: 6 total, 6 up, 6 in
Nov 23 10:03:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/95924581' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:11 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/95924581' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-7ba24ef5a8302e2b3c20cfab428fcbe8c485862935b4b78879171770c8fc9f10-merged.mount: Deactivated successfully.
Nov 23 10:03:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:12 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d0d2bb8b4\x2d9b3e\x2d41c7\x2db595\x2d54664cfb433a.mount: Deactivated successfully.
Nov 23 10:03:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:12 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:12Z|00422|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:12.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e143 e143: 6 total, 6 up, 6 in
Nov 23 10:03:12 np0005532585.localdomain ceph-mon[300199]: pgmap v291: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 95 KiB/s rd, 4.7 KiB/s wr, 125 op/s
Nov 23 10:03:12 np0005532585.localdomain ceph-mon[300199]: osdmap e142: 6 total, 6 up, 6 in
Nov 23 10:03:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:13.093 263258 INFO neutron.agent.linux.ip_lib [None req-6fb7b696-855a-49d5-9b13-4f354ffb2709 - - - - - -] Device tap7b7efa3a-46 cannot be used as it has no MAC address
Nov 23 10:03:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:13.170 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:13 np0005532585.localdomain kernel: device tap7b7efa3a-46 entered promiscuous mode
Nov 23 10:03:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:13Z|00423|binding|INFO|Claiming lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 for this chassis.
Nov 23 10:03:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:13Z|00424|binding|INFO|7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36: Claiming unknown
Nov 23 10:03:13 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892193.1780] manager: (tap7b7efa3a-46): new Generic device (/org/freedesktop/NetworkManager/Devices/69)
Nov 23 10:03:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:13.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:13 np0005532585.localdomain systemd-udevd[328133]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:13.193 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed19edd7-37c8-4522-837f-b32faf388727, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:13.194 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 in datapath 61421cbd-202e-4179-bacd-7ac7c4ff8280 bound to our chassis
Nov 23 10:03:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:13.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port a6c5f0c7-9090-48fd-88ac-c8698fa30062 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:03:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:13.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61421cbd-202e-4179-bacd-7ac7c4ff8280, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:03:13 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:13.198 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5582a4c5-0c42-4ab8-b34b-03d2a1075a31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:13Z|00425|binding|INFO|Setting lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 ovn-installed in OVS
Nov 23 10:03:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:13Z|00426|binding|INFO|Setting lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 up in Southbound
Nov 23 10:03:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:13.215 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:13.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:13.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:13 np0005532585.localdomain ceph-mon[300199]: osdmap e143: 6 total, 6 up, 6 in
Nov 23 10:03:13 np0005532585.localdomain ceph-mon[300199]: pgmap v294: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 2.8 KiB/s wr, 66 op/s
Nov 23 10:03:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2534350437' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2534350437' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:14 np0005532585.localdomain podman[328188]: 
Nov 23 10:03:14 np0005532585.localdomain podman[328188]: 2025-11-23 10:03:14.12431696 +0000 UTC m=+0.088312701 container create c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 23 10:03:14 np0005532585.localdomain systemd[1]: Started libpod-conmon-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08.scope.
Nov 23 10:03:14 np0005532585.localdomain podman[328188]: 2025-11-23 10:03:14.08311636 +0000 UTC m=+0.047112111 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:14 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:14 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aafddc39c2a04befbf58d4bf8d8ebb00098dbf032214687c1536c4fc26c16c3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:14 np0005532585.localdomain podman[328188]: 2025-11-23 10:03:14.204283554 +0000 UTC m=+0.168279295 container init c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:14 np0005532585.localdomain podman[328188]: 2025-11-23 10:03:14.215134818 +0000 UTC m=+0.179130569 container start c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:03:14 np0005532585.localdomain dnsmasq[328206]: started, version 2.85 cachesize 150
Nov 23 10:03:14 np0005532585.localdomain dnsmasq[328206]: DNS service limited to local subnets
Nov 23 10:03:14 np0005532585.localdomain dnsmasq[328206]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:14 np0005532585.localdomain dnsmasq[328206]: warning: no upstream servers configured
Nov 23 10:03:14 np0005532585.localdomain dnsmasq-dhcp[328206]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:14 np0005532585.localdomain dnsmasq[328206]: read /var/lib/neutron/dhcp/61421cbd-202e-4179-bacd-7ac7c4ff8280/addn_hosts - 0 addresses
Nov 23 10:03:14 np0005532585.localdomain dnsmasq-dhcp[328206]: read /var/lib/neutron/dhcp/61421cbd-202e-4179-bacd-7ac7c4ff8280/host
Nov 23 10:03:14 np0005532585.localdomain dnsmasq-dhcp[328206]: read /var/lib/neutron/dhcp/61421cbd-202e-4179-bacd-7ac7c4ff8280/opts
Nov 23 10:03:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:14.290 263258 INFO neutron.agent.dhcp.agent [None req-35536409-a725-405e-a2a6-15bf8d52ff9c - - - - - -] DHCP configuration for ports {'71598bfa-e4b2-4566-9a0f-19c8303297a9'} is completed
Nov 23 10:03:14 np0005532585.localdomain dnsmasq[328206]: exiting on receipt of SIGTERM
Nov 23 10:03:14 np0005532585.localdomain systemd[1]: libpod-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08.scope: Deactivated successfully.
Nov 23 10:03:14 np0005532585.localdomain podman[328225]: 2025-11-23 10:03:14.50119632 +0000 UTC m=+0.060776403 container kill c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 10:03:14 np0005532585.localdomain podman[328237]: 2025-11-23 10:03:14.559255458 +0000 UTC m=+0.045541284 container died c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:14 np0005532585.localdomain podman[328237]: 2025-11-23 10:03:14.58397622 +0000 UTC m=+0.070262046 container cleanup c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:03:14 np0005532585.localdomain systemd[1]: libpod-conmon-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08.scope: Deactivated successfully.
Nov 23 10:03:14 np0005532585.localdomain podman[328244]: 2025-11-23 10:03:14.63006122 +0000 UTC m=+0.105011876 container remove c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:14Z|00427|binding|INFO|Releasing lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 from this chassis (sb_readonly=0)
Nov 23 10:03:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:14Z|00428|binding|INFO|Setting lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 down in Southbound
Nov 23 10:03:14 np0005532585.localdomain kernel: device tap7b7efa3a-46 left promiscuous mode
Nov 23 10:03:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:14.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:14.650 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed19edd7-37c8-4522-837f-b32faf388727, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:14.652 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 in datapath 61421cbd-202e-4179-bacd-7ac7c4ff8280 unbound from our chassis
Nov 23 10:03:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:14.653 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 61421cbd-202e-4179-bacd-7ac7c4ff8280 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:14.654 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[1a415630-1979-4cf9-b221-f4d67b7fb6a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:14.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:15 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:15.094 263258 INFO neutron.agent.dhcp.agent [None req-7c4593f8-5b68-45ff-88c2-d65b160f6202 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:15 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:15.095 263258 INFO neutron.agent.dhcp.agent [None req-7c4593f8-5b68-45ff-88c2-d65b160f6202 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/329276956' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/329276956' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-aafddc39c2a04befbf58d4bf8d8ebb00098dbf032214687c1536c4fc26c16c3a-merged.mount: Deactivated successfully.
Nov 23 10:03:15 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:15 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d61421cbd\x2d202e\x2d4179\x2dbacd\x2d7ac7c4ff8280.mount: Deactivated successfully.
Nov 23 10:03:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:15.219 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:15.220 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:15 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:15.280 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:16 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:16Z|00429|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.052 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:16 np0005532585.localdomain ceph-mon[300199]: pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 104 KiB/s rd, 5.4 KiB/s wr, 137 op/s
Nov 23 10:03:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/327852875' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/327852875' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.295 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.295 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.296 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.297 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:03:16 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:16.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:03:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:03:16 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:03:17 np0005532585.localdomain systemd[1]: tmp-crun.xXndTU.mount: Deactivated successfully.
Nov 23 10:03:17 np0005532585.localdomain podman[328269]: 2025-11-23 10:03:17.05822939 +0000 UTC m=+0.098014550 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:17 np0005532585.localdomain podman[328270]: 2025-11-23 10:03:17.099289946 +0000 UTC m=+0.136737054 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64)
Nov 23 10:03:17 np0005532585.localdomain podman[328269]: 2025-11-23 10:03:17.101459753 +0000 UTC m=+0.141244923 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 23 10:03:17 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:03:17 np0005532585.localdomain podman[328268]: 2025-11-23 10:03:17.129679072 +0000 UTC m=+0.173023201 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 23 10:03:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e144 e144: 6 total, 6 up, 6 in
Nov 23 10:03:17 np0005532585.localdomain podman[328270]: 2025-11-23 10:03:17.182307593 +0000 UTC m=+0.219754711 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9)
Nov 23 10:03:17 np0005532585.localdomain podman[328268]: 2025-11-23 10:03:17.197737358 +0000 UTC m=+0.241081507 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 10:03:17 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:03:17 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:03:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:17.248 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:03:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:17.267 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:03:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:17.268 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:03:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:17.269 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:17.269 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:03:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:17.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e145 e145: 6 total, 6 up, 6 in
Nov 23 10:03:18 np0005532585.localdomain dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 0 addresses
Nov 23 10:03:18 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host
Nov 23 10:03:18 np0005532585.localdomain podman[328349]: 2025-11-23 10:03:18.016625285 +0000 UTC m=+0.066933543 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:03:18 np0005532585.localdomain dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts
Nov 23 10:03:18 np0005532585.localdomain ceph-mon[300199]: osdmap e144: 6 total, 6 up, 6 in
Nov 23 10:03:18 np0005532585.localdomain ceph-mon[300199]: pgmap v297: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 4.3 KiB/s wr, 116 op/s
Nov 23 10:03:18 np0005532585.localdomain ceph-mon[300199]: osdmap e145: 6 total, 6 up, 6 in
Nov 23 10:03:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:18.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:18Z|00430|binding|INFO|Releasing lport ed2c180f-1a61-4a88-a761-adcb953abd22 from this chassis (sb_readonly=0)
Nov 23 10:03:18 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:18Z|00431|binding|INFO|Setting lport ed2c180f-1a61-4a88-a761-adcb953abd22 down in Southbound
Nov 23 10:03:18 np0005532585.localdomain kernel: device taped2c180f-1a left promiscuous mode
Nov 23 10:03:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:18.350 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:18.356 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10f8dd7c838246c58f1d2c4efc771237', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bd8c251-3bb7-4e15-9c8d-7fcd2e804fa5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ed2c180f-1a61-4a88-a761-adcb953abd22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:18.358 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ed2c180f-1a61-4a88-a761-adcb953abd22 in datapath 31b977a7-a37c-42ba-bed9-7b22959f6310 unbound from our chassis
Nov 23 10:03:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:18.361 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31b977a7-a37c-42ba-bed9-7b22959f6310, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:03:18 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:18.362 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c15fa6b3-093e-4f59-8dd8-eb701a9d9dca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:18.371 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:19 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:19.367 263258 INFO neutron.agent.linux.ip_lib [None req-8153408d-0707-4f00-908f-cc93975263e6 - - - - - -] Device tapa62eb638-20 cannot be used as it has no MAC address
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.429 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:19 np0005532585.localdomain kernel: device tapa62eb638-20 entered promiscuous mode
Nov 23 10:03:19 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892199.4361] manager: (tapa62eb638-20): new Generic device (/org/freedesktop/NetworkManager/Devices/70)
Nov 23 10:03:19 np0005532585.localdomain systemd-udevd[328383]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:19Z|00432|binding|INFO|Claiming lport a62eb638-209a-4f5d-90e4-610cfcccbc39 for this chassis.
Nov 23 10:03:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:19Z|00433|binding|INFO|a62eb638-209a-4f5d-90e4-610cfcccbc39: Claiming unknown
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:19.450 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c2e626d-1a69-43ac-9d65-b501a9547aeb, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a62eb638-209a-4f5d-90e4-610cfcccbc39) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:19.452 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a62eb638-209a-4f5d-90e4-610cfcccbc39 in datapath 326109dc-7744-4dc1-8604-7d25ed028442 bound to our chassis
Nov 23 10:03:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:19.454 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 326109dc-7744-4dc1-8604-7d25ed028442 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:19.457 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cecd540f-8f8d-462b-81e2-1dc077c4be17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:19Z|00434|binding|INFO|Setting lport a62eb638-209a-4f5d-90e4-610cfcccbc39 ovn-installed in OVS
Nov 23 10:03:19 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:19Z|00435|binding|INFO|Setting lport a62eb638-209a-4f5d-90e4-610cfcccbc39 up in Southbound
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.471 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.698 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:19 np0005532585.localdomain ceph-mon[300199]: pgmap v299: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 150 KiB/s rd, 6.0 KiB/s wr, 197 op/s
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.727 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.764 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:19.796 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.241 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.241 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.241 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.242 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:03:20 np0005532585.localdomain podman[328457]: 
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2320941116' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain podman[328457]: 2025-11-23 10:03:20.645690244 +0000 UTC m=+0.086829316 container create 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.657 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:03:20 np0005532585.localdomain systemd[1]: Started libpod-conmon-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f.scope.
Nov 23 10:03:20 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:20 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/285ef1a63d76dfccf3c353674c61e591f2fc967ae03a06c927284b817a3c25d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:20 np0005532585.localdomain podman[328457]: 2025-11-23 10:03:20.604193806 +0000 UTC m=+0.045332898 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:20 np0005532585.localdomain podman[328457]: 2025-11-23 10:03:20.705523787 +0000 UTC m=+0.146662859 container init 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:20 np0005532585.localdomain podman[328457]: 2025-11-23 10:03:20.711504841 +0000 UTC m=+0.152643903 container start 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.714 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.715 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:03:20 np0005532585.localdomain dnsmasq[328477]: started, version 2.85 cachesize 150
Nov 23 10:03:20 np0005532585.localdomain dnsmasq[328477]: DNS service limited to local subnets
Nov 23 10:03:20 np0005532585.localdomain dnsmasq[328477]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:20 np0005532585.localdomain dnsmasq[328477]: warning: no upstream servers configured
Nov 23 10:03:20 np0005532585.localdomain dnsmasq-dhcp[328477]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Nov 23 10:03:20 np0005532585.localdomain dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 0 addresses
Nov 23 10:03:20 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host
Nov 23 10:03:20 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/573425647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/887069621' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1101695890' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1101695890' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2320941116' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/534642139' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:20.912 263258 INFO neutron.agent.dhcp.agent [None req-aa4c81ee-b19e-4620-ac08-099c55e0bd42 - - - - - -] DHCP configuration for ports {'a7428151-2964-46ed-a65c-aea518d3a1f3'} is completed
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.920 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.921 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11120MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.922 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:03:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:20.922 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.085 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.085 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.085 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.461 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:03:21 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1650261008' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:21 np0005532585.localdomain ceph-mon[300199]: pgmap v300: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 121 KiB/s rd, 4.9 KiB/s wr, 159 op/s
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.877 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:21 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:03:21 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/585599417' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.948 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.954 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.980 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.983 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:03:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:21.984 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:03:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/585599417' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e146 e146: 6 total, 6 up, 6 in
Nov 23 10:03:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:22.985 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:22.987 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:23Z|00436|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:23.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:23 np0005532585.localdomain dnsmasq[324132]: exiting on receipt of SIGTERM
Nov 23 10:03:23 np0005532585.localdomain podman[328515]: 2025-11-23 10:03:23.740025478 +0000 UTC m=+0.069785331 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:03:23 np0005532585.localdomain systemd[1]: libpod-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99.scope: Deactivated successfully.
Nov 23 10:03:23 np0005532585.localdomain podman[328529]: 2025-11-23 10:03:23.807392043 +0000 UTC m=+0.056244174 container died c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:03:23 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:23 np0005532585.localdomain podman[328529]: 2025-11-23 10:03:23.893112393 +0000 UTC m=+0.141964494 container cleanup c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:23 np0005532585.localdomain systemd[1]: libpod-conmon-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99.scope: Deactivated successfully.
Nov 23 10:03:23 np0005532585.localdomain podman[328536]: 2025-11-23 10:03:23.918933308 +0000 UTC m=+0.155945624 container remove c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:03:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:23.947 263258 INFO neutron.agent.dhcp.agent [None req-ceeaad65-fd22-40bb-856c-19912c25a24c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:23.948 263258 INFO neutron.agent.dhcp.agent [None req-ceeaad65-fd22-40bb-856c-19912c25a24c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:23 np0005532585.localdomain ceph-mon[300199]: osdmap e146: 6 total, 6 up, 6 in
Nov 23 10:03:23 np0005532585.localdomain ceph-mon[300199]: pgmap v302: 177 pgs: 177 active+clean; 145 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 2.1 KiB/s wr, 91 op/s
Nov 23 10:03:24 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:24.339 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:23Z, description=, device_id=546a2558-70b9-4cc7-8d44-6c3be7bdf264, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad96d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad9a30>], id=5a6c76dd-6c35-476b-8f16-632b02edc10d, ip_allocation=immediate, mac_address=fa:16:3e:61:b9:6b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:15Z, description=, dns_domain=, id=326109dc-7744-4dc1-8604-7d25ed028442, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-3825805, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2208, status=ACTIVE, subnets=['73db731d-a2f7-4dcc-a56f-d6dcfd7e2ce1'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:18Z, vlan_transparent=None, network_id=326109dc-7744-4dc1-8604-7d25ed028442, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2245, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:24Z on network 326109dc-7744-4dc1-8604-7d25ed028442
Nov 23 10:03:24 np0005532585.localdomain dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 1 addresses
Nov 23 10:03:24 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host
Nov 23 10:03:24 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts
Nov 23 10:03:24 np0005532585.localdomain podman[328577]: 2025-11-23 10:03:24.555642723 +0000 UTC m=+0.060899537 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:03:24 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d30fb1fb9da3ec81fc09dcff1705485fc67933914b93a576884d8b0f75a2cf79-merged.mount: Deactivated successfully.
Nov 23 10:03:24 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d31b977a7\x2da37c\x2d42ba\x2dbed9\x2d7b22959f6310.mount: Deactivated successfully.
Nov 23 10:03:24 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:24.908 263258 INFO neutron.agent.dhcp.agent [None req-a2f5c79d-2039-450f-9c98-aabc386843ee - - - - - -] DHCP configuration for ports {'5a6c76dd-6c35-476b-8f16-632b02edc10d'} is completed
Nov 23 10:03:25 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:25.021 2 INFO neutron.agent.securitygroups_rpc [None req-82f52ebc-5307-4b75-9da0-dca3e27d739d da47bb8e9ce044b7a6c60aeaa303445e 1ed74022d4944d5c8276b163cae1a73a - - default default] Security group member updated ['0c7393ad-63e1-4b57-bb16-ccf2466506e9']
Nov 23 10:03:25 np0005532585.localdomain ceph-mon[300199]: pgmap v303: 177 pgs: 177 active+clean; 163 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 403 KiB/s wr, 133 op/s
Nov 23 10:03:25 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:25.906 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:23Z, description=, device_id=546a2558-70b9-4cc7-8d44-6c3be7bdf264, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6640>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6a90>], id=5a6c76dd-6c35-476b-8f16-632b02edc10d, ip_allocation=immediate, mac_address=fa:16:3e:61:b9:6b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:15Z, description=, dns_domain=, id=326109dc-7744-4dc1-8604-7d25ed028442, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-3825805, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2208, status=ACTIVE, subnets=['73db731d-a2f7-4dcc-a56f-d6dcfd7e2ce1'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:18Z, vlan_transparent=None, network_id=326109dc-7744-4dc1-8604-7d25ed028442, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2245, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:24Z on network 326109dc-7744-4dc1-8604-7d25ed028442
Nov 23 10:03:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:03:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:03:26 np0005532585.localdomain podman[328599]: 2025-11-23 10:03:26.036352406 +0000 UTC m=+0.085806653 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:03:26 np0005532585.localdomain podman[328599]: 2025-11-23 10:03:26.050300666 +0000 UTC m=+0.099754873 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 23 10:03:26 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:03:26 np0005532585.localdomain systemd[1]: tmp-crun.GOqwZs.mount: Deactivated successfully.
Nov 23 10:03:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:26.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:26.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 10:03:26 np0005532585.localdomain dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 1 addresses
Nov 23 10:03:26 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host
Nov 23 10:03:26 np0005532585.localdomain podman[328643]: 2025-11-23 10:03:26.219227481 +0000 UTC m=+0.079030417 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:26 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts
Nov 23 10:03:26 np0005532585.localdomain podman[328601]: 2025-11-23 10:03:26.218647822 +0000 UTC m=+0.266028506 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:03:26 np0005532585.localdomain podman[328601]: 2025-11-23 10:03:26.231279671 +0000 UTC m=+0.278660335 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:03:26 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:03:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 10:03:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/693879309' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:03:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:26.494 263258 INFO neutron.agent.dhcp.agent [None req-7f8d0217-ed26-446d-9f9a-63e3cbe5de4e - - - - - -] DHCP configuration for ports {'5a6c76dd-6c35-476b-8f16-632b02edc10d'} is completed
Nov 23 10:03:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/693879309' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:03:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:26.930 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:27.023 263258 INFO neutron.agent.linux.ip_lib [None req-726027db-25ed-4cd0-bd23-442d668dd27b - - - - - -] Device tap8e54ef2c-e4 cannot be used as it has no MAC address
Nov 23 10:03:27 np0005532585.localdomain systemd[1]: tmp-crun.47e1Ro.mount: Deactivated successfully.
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.046 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain kernel: device tap8e54ef2c-e4 entered promiscuous mode
Nov 23 10:03:27 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892207.0553] manager: (tap8e54ef2c-e4): new Generic device (/org/freedesktop/NetworkManager/Devices/71)
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:27Z|00437|binding|INFO|Claiming lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 for this chassis.
Nov 23 10:03:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:27Z|00438|binding|INFO|8e54ef2c-e414-469a-ae06-6d0022eb93e2: Claiming unknown
Nov 23 10:03:27 np0005532585.localdomain systemd-udevd[328688]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.070 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a00f0c05-4895-49ad-a79a-15697f3729df, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8e54ef2c-e414-469a-ae06-6d0022eb93e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.071 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8e54ef2c-e414-469a-ae06-6d0022eb93e2 in datapath 42acb812-12e3-44f8-b0aa-aebaa30b36a2 bound to our chassis
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.072 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42acb812-12e3-44f8-b0aa-aebaa30b36a2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.072 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8f3ce5-5491-4059-9645-c5c0f1226a42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:27Z|00439|binding|INFO|Setting lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 ovn-installed in OVS
Nov 23 10:03:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:27Z|00440|binding|INFO|Setting lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 up in Southbound
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.099 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.100 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.127 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.150 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain ceph-mon[300199]: pgmap v304: 177 pgs: 177 active+clean; 163 MiB data, 818 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 340 KiB/s wr, 112 op/s
Nov 23 10:03:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e147 e147: 6 total, 6 up, 6 in
Nov 23 10:03:27 np0005532585.localdomain dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 0 addresses
Nov 23 10:03:27 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host
Nov 23 10:03:27 np0005532585.localdomain dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts
Nov 23 10:03:27 np0005532585.localdomain podman[328744]: 2025-11-23 10:03:27.572823909 +0000 UTC m=+0.070268536 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:27 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:27.694 2 INFO neutron.agent.securitygroups_rpc [None req-746b6928-7309-4035-bfd7-9d9f95de5728 da47bb8e9ce044b7a6c60aeaa303445e 1ed74022d4944d5c8276b163cae1a73a - - default default] Security group member updated ['0c7393ad-63e1-4b57-bb16-ccf2466506e9']
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.790 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:27 np0005532585.localdomain kernel: device tapa62eb638-20 left promiscuous mode
Nov 23 10:03:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:27Z|00441|binding|INFO|Releasing lport a62eb638-209a-4f5d-90e4-610cfcccbc39 from this chassis (sb_readonly=0)
Nov 23 10:03:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:27Z|00442|binding|INFO|Setting lport a62eb638-209a-4f5d-90e4-610cfcccbc39 down in Southbound
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.806 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c2e626d-1a69-43ac-9d65-b501a9547aeb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a62eb638-209a-4f5d-90e4-610cfcccbc39) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.808 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a62eb638-209a-4f5d-90e4-610cfcccbc39 in datapath 326109dc-7744-4dc1-8604-7d25ed028442 unbound from our chassis
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.810 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 326109dc-7744-4dc1-8604-7d25ed028442 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:27 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:27.812 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2e70eb-6cd2-482d-a5b0-978271eb622a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:27.818 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:28 np0005532585.localdomain sshd[328789]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:03:28 np0005532585.localdomain podman[328795]: 
Nov 23 10:03:28 np0005532585.localdomain podman[328795]: 2025-11-23 10:03:28.116351252 +0000 UTC m=+0.090623463 container create 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:28 np0005532585.localdomain podman[328795]: 2025-11-23 10:03:28.067700333 +0000 UTC m=+0.041972544 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:28 np0005532585.localdomain systemd[1]: Started libpod-conmon-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575.scope.
Nov 23 10:03:28 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:28 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c870a93f87fe203152ea7cfa7a09e2c00cb9ef905fa83f8a1c3bfd9d8722c30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:28 np0005532585.localdomain podman[328795]: 2025-11-23 10:03:28.206249552 +0000 UTC m=+0.180521743 container init 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:28 np0005532585.localdomain podman[328795]: 2025-11-23 10:03:28.215349352 +0000 UTC m=+0.189621573 container start 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:28 np0005532585.localdomain dnsmasq[328815]: started, version 2.85 cachesize 150
Nov 23 10:03:28 np0005532585.localdomain dnsmasq[328815]: DNS service limited to local subnets
Nov 23 10:03:28 np0005532585.localdomain dnsmasq[328815]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:28 np0005532585.localdomain dnsmasq[328815]: warning: no upstream servers configured
Nov 23 10:03:28 np0005532585.localdomain dnsmasq-dhcp[328815]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:28 np0005532585.localdomain dnsmasq[328815]: read /var/lib/neutron/dhcp/42acb812-12e3-44f8-b0aa-aebaa30b36a2/addn_hosts - 0 addresses
Nov 23 10:03:28 np0005532585.localdomain dnsmasq-dhcp[328815]: read /var/lib/neutron/dhcp/42acb812-12e3-44f8-b0aa-aebaa30b36a2/host
Nov 23 10:03:28 np0005532585.localdomain dnsmasq-dhcp[328815]: read /var/lib/neutron/dhcp/42acb812-12e3-44f8-b0aa-aebaa30b36a2/opts
Nov 23 10:03:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:28.246 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:28Z|00443|binding|INFO|Removing iface tap8e54ef2c-e4 ovn-installed in OVS
Nov 23 10:03:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:28Z|00444|binding|INFO|Removing lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 ovn-installed in OVS
Nov 23 10:03:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:28.341 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0b0ac5e7-076f-4df3-8266-dbc49db160ed with type ""
Nov 23 10:03:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:28.343 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a00f0c05-4895-49ad-a79a-15697f3729df, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8e54ef2c-e414-469a-ae06-6d0022eb93e2) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:28.345 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8e54ef2c-e414-469a-ae06-6d0022eb93e2 in datapath 42acb812-12e3-44f8-b0aa-aebaa30b36a2 unbound from our chassis
Nov 23 10:03:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:28.347 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42acb812-12e3-44f8-b0aa-aebaa30b36a2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:28 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:28.348 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[09156e62-825e-430c-8125-86f2a991fd25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:28.388 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:28.388 263258 INFO neutron.agent.dhcp.agent [None req-df6d5d6d-5c4f-4daf-9db9-e2ab766fdfbc - - - - - -] DHCP configuration for ports {'5fe96874-43e3-416b-848d-729ed5b3ad15'} is completed
Nov 23 10:03:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:28.390 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:28 np0005532585.localdomain dnsmasq[328815]: exiting on receipt of SIGTERM
Nov 23 10:03:28 np0005532585.localdomain podman[328834]: 2025-11-23 10:03:28.523502734 +0000 UTC m=+0.048786163 container kill 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:03:28 np0005532585.localdomain systemd[1]: libpod-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575.scope: Deactivated successfully.
Nov 23 10:03:28 np0005532585.localdomain ceph-mon[300199]: osdmap e147: 6 total, 6 up, 6 in
Nov 23 10:03:28 np0005532585.localdomain podman[328848]: 2025-11-23 10:03:28.586699702 +0000 UTC m=+0.052160819 container died 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:03:28 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:28Z|00445|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:28 np0005532585.localdomain podman[328848]: 2025-11-23 10:03:28.629521641 +0000 UTC m=+0.094982738 container cleanup 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:28 np0005532585.localdomain systemd[1]: libpod-conmon-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575.scope: Deactivated successfully.
Nov 23 10:03:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:28.649 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:28 np0005532585.localdomain podman[328850]: 2025-11-23 10:03:28.673473515 +0000 UTC m=+0.130788440 container remove 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:28.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:28 np0005532585.localdomain kernel: device tap8e54ef2c-e4 left promiscuous mode
Nov 23 10:03:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:28.699 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:28.725 263258 INFO neutron.agent.dhcp.agent [None req-50134160-1f00-41cb-b8cd-7f78693adfc6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:28.726 263258 INFO neutron.agent.dhcp.agent [None req-50134160-1f00-41cb-b8cd-7f78693adfc6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e148 e148: 6 total, 6 up, 6 in
Nov 23 10:03:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3c870a93f87fe203152ea7cfa7a09e2c00cb9ef905fa83f8a1c3bfd9d8722c30-merged.mount: Deactivated successfully.
Nov 23 10:03:29 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:29 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d42acb812\x2d12e3\x2d44f8\x2db0aa\x2daebaa30b36a2.mount: Deactivated successfully.
Nov 23 10:03:29 np0005532585.localdomain sshd[328789]: Invalid user adnan from 175.126.166.172 port 51376
Nov 23 10:03:29 np0005532585.localdomain sshd[328789]: Received disconnect from 175.126.166.172 port 51376:11: Bye Bye [preauth]
Nov 23 10:03:29 np0005532585.localdomain sshd[328789]: Disconnected from invalid user adnan 175.126.166.172 port 51376 [preauth]
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:03:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:03:30 np0005532585.localdomain ceph-mon[300199]: osdmap e148: 6 total, 6 up, 6 in
Nov 23 10:03:30 np0005532585.localdomain ceph-mon[300199]: pgmap v307: 177 pgs: 177 active+clean; 238 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 6.5 MiB/s rd, 6.6 MiB/s wr, 142 op/s
Nov 23 10:03:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e149 e149: 6 total, 6 up, 6 in
Nov 23 10:03:30 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:30.184 2 INFO neutron.agent.securitygroups_rpc [None req-7bdd1f21-8588-4120-93d3-3c530b607701 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:30 np0005532585.localdomain dnsmasq[328477]: exiting on receipt of SIGTERM
Nov 23 10:03:30 np0005532585.localdomain podman[328892]: 2025-11-23 10:03:30.382747895 +0000 UTC m=+0.061001889 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:03:30 np0005532585.localdomain systemd[1]: libpod-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f.scope: Deactivated successfully.
Nov 23 10:03:30 np0005532585.localdomain podman[328906]: 2025-11-23 10:03:30.458031342 +0000 UTC m=+0.062473983 container died 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:03:30 np0005532585.localdomain systemd[1]: tmp-crun.YZ8ytI.mount: Deactivated successfully.
Nov 23 10:03:30 np0005532585.localdomain podman[328906]: 2025-11-23 10:03:30.50213625 +0000 UTC m=+0.106578831 container cleanup 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:03:30 np0005532585.localdomain systemd[1]: libpod-conmon-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f.scope: Deactivated successfully.
Nov 23 10:03:30 np0005532585.localdomain podman[328908]: 2025-11-23 10:03:30.534826835 +0000 UTC m=+0.131001797 container remove 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:03:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:30.756 263258 INFO neutron.agent.dhcp.agent [None req-d52a2512-637f-4eeb-a6ef-d124ca0bc07f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:30.888 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:31 np0005532585.localdomain ceph-mon[300199]: osdmap e149: 6 total, 6 up, 6 in
Nov 23 10:03:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e150 e150: 6 total, 6 up, 6 in
Nov 23 10:03:31 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:31.149 2 INFO neutron.agent.securitygroups_rpc [None req-04f8e66d-7c6f-40a2-b025-3f4a41cc4961 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 10:03:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:31.226 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.228 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 10:03:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-285ef1a63d76dfccf3c353674c61e591f2fc967ae03a06c927284b817a3c25d2-merged.mount: Deactivated successfully.
Nov 23 10:03:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:31 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d326109dc\x2d7744\x2d4dc1\x2d8604\x2d7d25ed028442.mount: Deactivated successfully.
Nov 23 10:03:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:31.549 263258 INFO neutron.agent.linux.ip_lib [None req-77f79bd3-900d-4c34-97e6-e52b9371395a - - - - - -] Device tap1874bdbc-c6 cannot be used as it has no MAC address
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.570 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:31 np0005532585.localdomain kernel: device tap1874bdbc-c6 entered promiscuous mode
Nov 23 10:03:31 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892211.5786] manager: (tap1874bdbc-c6): new Generic device (/org/freedesktop/NetworkManager/Devices/72)
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.578 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:31Z|00446|binding|INFO|Claiming lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 for this chassis.
Nov 23 10:03:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:31Z|00447|binding|INFO|1874bdbc-c65e-41bc-8fc7-89c003c8f6e9: Claiming unknown
Nov 23 10:03:31 np0005532585.localdomain systemd-udevd[328943]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:31.594 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34ed355d-aacf-40b7-85ca-d489cba8c831, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1874bdbc-c65e-41bc-8fc7-89c003c8f6e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:31.596 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 in datapath 42ad5b5e-db5f-4e5a-846c-652f436fa783 bound to our chassis
Nov 23 10:03:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:31.597 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42ad5b5e-db5f-4e5a-846c-652f436fa783 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:31.599 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4d5be3-169a-4ccc-a86f-46d94ad8b1e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:31Z|00448|binding|INFO|Setting lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 ovn-installed in OVS
Nov 23 10:03:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:31Z|00449|binding|INFO|Setting lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 up in Southbound
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.615 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.658 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:31 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:31.859 2 INFO neutron.agent.securitygroups_rpc [None req-493a4cea-33ac-4b44-8c12-ef9399b27484 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:31.932 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:32 np0005532585.localdomain ceph-mon[300199]: osdmap e150: 6 total, 6 up, 6 in
Nov 23 10:03:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:03:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "format": "json"}]: dispatch
Nov 23 10:03:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:03:32 np0005532585.localdomain ceph-mon[300199]: pgmap v310: 177 pgs: 177 active+clean; 238 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 5.4 MiB/s rd, 9.9 MiB/s wr, 107 op/s
Nov 23 10:03:32 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1092660839' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:03:32 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:32.401 2 INFO neutron.agent.securitygroups_rpc [None req-88038149-5d7c-44ef-b630-13d9beb8c240 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:32 np0005532585.localdomain podman[329012]: 
Nov 23 10:03:32 np0005532585.localdomain podman[329012]: 2025-11-23 10:03:32.559013451 +0000 UTC m=+0.060358140 container create c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:32 np0005532585.localdomain systemd[1]: Started libpod-conmon-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35.scope.
Nov 23 10:03:32 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:32 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/669ca07686ba0518e574dcb34d9d6983765cc1732dd82e1f456d50492dd71e6f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:32 np0005532585.localdomain podman[329012]: 2025-11-23 10:03:32.622702649 +0000 UTC m=+0.124047378 container init c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:03:32 np0005532585.localdomain podman[329012]: 2025-11-23 10:03:32.523761309 +0000 UTC m=+0.025106048 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:32 np0005532585.localdomain systemd[1]: tmp-crun.TAhm4w.mount: Deactivated successfully.
Nov 23 10:03:32 np0005532585.localdomain podman[329012]: 2025-11-23 10:03:32.635566677 +0000 UTC m=+0.136911376 container start c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:03:32 np0005532585.localdomain dnsmasq[329030]: started, version 2.85 cachesize 150
Nov 23 10:03:32 np0005532585.localdomain dnsmasq[329030]: DNS service limited to local subnets
Nov 23 10:03:32 np0005532585.localdomain dnsmasq[329030]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:32 np0005532585.localdomain dnsmasq[329030]: warning: no upstream servers configured
Nov 23 10:03:32 np0005532585.localdomain dnsmasq-dhcp[329030]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:32 np0005532585.localdomain dnsmasq[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/addn_hosts - 0 addresses
Nov 23 10:03:32 np0005532585.localdomain dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/host
Nov 23 10:03:32 np0005532585.localdomain dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/opts
Nov 23 10:03:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:32.796 263258 INFO neutron.agent.dhcp.agent [None req-9344b75d-950f-4844-938c-892c83b2be3c - - - - - -] DHCP configuration for ports {'62366403-2af7-4425-bdc9-dc79db3ff768'} is completed
Nov 23 10:03:33 np0005532585.localdomain ceph-mon[300199]: mgrmap e45: np0005532584.naxwxy(active, since 8m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:03:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e151 e151: 6 total, 6 up, 6 in
Nov 23 10:03:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:33 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:33.429 2 INFO neutron.agent.securitygroups_rpc [None req-6155fed5-7bb5-4f1e-ab16-bb23a0937b77 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:34Z|00450|binding|INFO|Removing iface tap1874bdbc-c6 ovn-installed in OVS
Nov 23 10:03:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:34Z|00451|binding|INFO|Removing lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 ovn-installed in OVS
Nov 23 10:03:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:34.046 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 1468a584-8a33-4043-9a71-a0ffcebeb0a0 with type ""
Nov 23 10:03:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:34.047 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34ed355d-aacf-40b7-85ca-d489cba8c831, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1874bdbc-c65e-41bc-8fc7-89c003c8f6e9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:34.048 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:34.050 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 in datapath 42ad5b5e-db5f-4e5a-846c-652f436fa783 unbound from our chassis
Nov 23 10:03:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:34.052 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42ad5b5e-db5f-4e5a-846c-652f436fa783, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:03:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:34.053 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[1fcb7785-8213-479e-b957-6762c5c5eb09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:34 np0005532585.localdomain kernel: device tap1874bdbc-c6 left promiscuous mode
Nov 23 10:03:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:34.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:34.076 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:34 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "snap_name": "019c4f4e-7af3-4cd6-b0e3-b6cacc762334", "format": "json"}]: dispatch
Nov 23 10:03:34 np0005532585.localdomain ceph-mon[300199]: osdmap e151: 6 total, 6 up, 6 in
Nov 23 10:03:34 np0005532585.localdomain ceph-mon[300199]: pgmap v312: 177 pgs: 177 active+clean; 238 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 4.8 MiB/s rd, 8.8 MiB/s wr, 96 op/s
Nov 23 10:03:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e152 e152: 6 total, 6 up, 6 in
Nov 23 10:03:34 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:34.755 2 INFO neutron.agent.securitygroups_rpc [None req-0eb2e2be-3d24-4ace-a008-a7d7e29c4328 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:35 np0005532585.localdomain ceph-mon[300199]: osdmap e152: 6 total, 6 up, 6 in
Nov 23 10:03:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e153 e153: 6 total, 6 up, 6 in
Nov 23 10:03:35 np0005532585.localdomain podman[329048]: 2025-11-23 10:03:35.213503301 +0000 UTC m=+0.081633820 container kill c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:03:35 np0005532585.localdomain dnsmasq[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/addn_hosts - 0 addresses
Nov 23 10:03:35 np0005532585.localdomain dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/host
Nov 23 10:03:35 np0005532585.localdomain dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/opts
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent [None req-c69f5f2a-9d3e-46d9-ba87-e0af38de0c5b - - - - - -] Unable to reload_allocations dhcp for 42ad5b5e-db5f-4e5a-846c-652f436fa783.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1874bdbc-c6 not found in namespace qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783.
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1874bdbc-c6 not found in namespace qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783.
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent 
Nov 23 10:03:35 np0005532585.localdomain sshd[329062]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:03:35 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.454 263258 INFO neutron.agent.linux.ip_lib [None req-db6700fe-b5a8-400d-8f63-8836ad2ec7a0 - - - - - -] Device tapf441590c-f0 cannot be used as it has no MAC address
Nov 23 10:03:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:35.487 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:35 np0005532585.localdomain kernel: device tapf441590c-f0 entered promiscuous mode
Nov 23 10:03:35 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892215.4962] manager: (tapf441590c-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/73)
Nov 23 10:03:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:35.498 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:35Z|00452|binding|INFO|Claiming lport f441590c-f097-4418-b5c6-fc684c4c990b for this chassis.
Nov 23 10:03:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:35Z|00453|binding|INFO|f441590c-f097-4418-b5c6-fc684c4c990b: Claiming unknown
Nov 23 10:03:35 np0005532585.localdomain systemd-udevd[329073]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:35Z|00454|binding|INFO|Setting lport f441590c-f097-4418-b5c6-fc684c4c990b ovn-installed in OVS
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:35.528 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:35Z|00455|binding|INFO|Setting lport f441590c-f097-4418-b5c6-fc684c4c990b up in Southbound
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:35.558 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe156e5f-7616-4580-b024-66548b9558a4, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f441590c-f097-4418-b5c6-fc684c4c990b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:35.560 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f441590c-f097-4418-b5c6-fc684c4c990b in datapath 4a43af34-e77a-4d75-9a08-a9727e7ca345 bound to our chassis
Nov 23 10:03:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:35.561 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4a43af34-e77a-4d75-9a08-a9727e7ca345 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:35.563 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a48634-c821-481e-8e85-f6ae0d93207a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:35 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapf441590c-f0: No such device
Nov 23 10:03:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:35.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:35.604 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:35Z|00456|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:35.937 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3526128852' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3526128852' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: osdmap e153: 6 total, 6 up, 6 in
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: pgmap v315: 177 pgs: 177 active+clean; 284 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 4.9 MiB/s rd, 4.8 MiB/s wr, 188 op/s
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3526128852' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:36 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3526128852' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:36 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:36.195 2 INFO neutron.agent.securitygroups_rpc [None req-a71f4ada-2ee4-4e29-a05b-0c137a49cc85 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:36 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:36.446 2 INFO neutron.agent.securitygroups_rpc [None req-606c84ff-d0b3-4da5-8d08-f4db462a1bb4 a8a12d646f734219a5736bd9a89106d3 cd27ceae55c44d478998092e7554fd8a - - default default] Security group member updated ['57d92d06-0a9a-469b-b69f-4fb9e6e560cf']
Nov 23 10:03:36 np0005532585.localdomain podman[329145]: 
Nov 23 10:03:36 np0005532585.localdomain podman[329145]: 2025-11-23 10:03:36.581632274 +0000 UTC m=+0.099296402 container create 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:03:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:03:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:03:36 np0005532585.localdomain systemd[1]: Started libpod-conmon-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac.scope.
Nov 23 10:03:36 np0005532585.localdomain podman[329145]: 2025-11-23 10:03:36.533350379 +0000 UTC m=+0.051014587 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:36 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:36 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efc00143707c6c7059c45fc469c5c19df6826f798c48ffab4fae84399025aa28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:36 np0005532585.localdomain podman[329145]: 2025-11-23 10:03:36.685320048 +0000 UTC m=+0.202984176 container init 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:03:36 np0005532585.localdomain dnsmasq[329187]: started, version 2.85 cachesize 150
Nov 23 10:03:36 np0005532585.localdomain dnsmasq[329187]: DNS service limited to local subnets
Nov 23 10:03:36 np0005532585.localdomain dnsmasq[329187]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:36 np0005532585.localdomain dnsmasq[329187]: warning: no upstream servers configured
Nov 23 10:03:36 np0005532585.localdomain dnsmasq-dhcp[329187]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:03:36 np0005532585.localdomain dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 0 addresses
Nov 23 10:03:36 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host
Nov 23 10:03:36 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts
Nov 23 10:03:36 np0005532585.localdomain podman[329159]: 2025-11-23 10:03:36.735964483 +0000 UTC m=+0.105609862 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:03:36 np0005532585.localdomain podman[329145]: 2025-11-23 10:03:36.745736277 +0000 UTC m=+0.263400405 container start 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:03:36 np0005532585.localdomain podman[329159]: 2025-11-23 10:03:36.776343479 +0000 UTC m=+0.145988878 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:03:36 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:03:36 np0005532585.localdomain podman[329160]: 2025-11-23 10:03:36.798186837 +0000 UTC m=+0.162954099 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Nov 23 10:03:36 np0005532585.localdomain podman[329160]: 2025-11-23 10:03:36.813367984 +0000 UTC m=+0.178135256 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:36 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:03:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:36.832 263258 INFO neutron.agent.dhcp.agent [None req-0bff647a-37c2-4421-8ce0-5d475f777c74 - - - - - -] Synchronizing state
Nov 23 10:03:36 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:36.981 2 INFO neutron.agent.securitygroups_rpc [None req-f95b4467-0d9c-4e77-aa60-8f1596442e50 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:36.995 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:36.999 263258 INFO neutron.agent.dhcp.agent [None req-ae3e1a76-bd76-4077-9d87-65d863f72c3b - - - - - -] DHCP configuration for ports {'bfac63e5-6fee-4aa0-b816-de3604981a61'} is completed
Nov 23 10:03:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:37.162 263258 INFO neutron.agent.dhcp.agent [None req-7d16b2fb-0d37-486c-ac3a-228527858e8b - - - - - -] All active networks have been fetched through RPC.
Nov 23 10:03:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e154 e154: 6 total, 6 up, 6 in
Nov 23 10:03:37 np0005532585.localdomain dnsmasq[329030]: exiting on receipt of SIGTERM
Nov 23 10:03:37 np0005532585.localdomain podman[329222]: 2025-11-23 10:03:37.364131645 +0000 UTC m=+0.064008899 container kill c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:37 np0005532585.localdomain systemd[1]: libpod-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35.scope: Deactivated successfully.
Nov 23 10:03:37 np0005532585.localdomain podman[329234]: 2025-11-23 10:03:37.42107809 +0000 UTC m=+0.044520992 container died c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:37 np0005532585.localdomain podman[329234]: 2025-11-23 10:03:37.450503677 +0000 UTC m=+0.073946509 container cleanup c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:03:37 np0005532585.localdomain systemd[1]: libpod-conmon-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35.scope: Deactivated successfully.
Nov 23 10:03:37 np0005532585.localdomain sshd[329259]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:03:37 np0005532585.localdomain podman[329241]: 2025-11-23 10:03:37.526040243 +0000 UTC m=+0.132974457 container remove c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:03:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:37.568 263258 INFO neutron.agent.dhcp.agent [None req-116d79b7-3d46-461e-94b5-c6eecda7a59c - - - - - -] Synchronizing state complete
Nov 23 10:03:37 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:37.570 263258 INFO neutron.agent.dhcp.agent [None req-db6700fe-b5a8-400d-8f63-8836ad2ec7a0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:35Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c89442b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8944af0>], id=2221bad8-4bd8-4391-a72d-5c149805734c, ip_allocation=immediate, mac_address=fa:16:3e:ff:95:cf, name=tempest-RoutersIpV6Test-1138851014, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:32Z, description=, dns_domain=, id=4a43af34-e77a-4d75-9a08-a9727e7ca345, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-14611547, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13436, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2295, status=ACTIVE, subnets=['b13803ac-8d12-481f-abee-61d6c126b729'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:34Z, vlan_transparent=None, network_id=4a43af34-e77a-4d75-9a08-a9727e7ca345, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['57d92d06-0a9a-469b-b69f-4fb9e6e560cf'], standard_attr_id=2309, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:35Z on network 4a43af34-e77a-4d75-9a08-a9727e7ca345
Nov 23 10:03:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-669ca07686ba0518e574dcb34d9d6983765cc1732dd82e1f456d50492dd71e6f-merged.mount: Deactivated successfully.
Nov 23 10:03:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:37 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d42ad5b5e\x2ddb5f\x2d4e5a\x2d846c\x2d652f436fa783.mount: Deactivated successfully.
Nov 23 10:03:37 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:37.617 2 INFO neutron.agent.securitygroups_rpc [None req-dffb97f7-0929-42aa-ac92-4890704581f2 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:37 np0005532585.localdomain podman[329286]: 2025-11-23 10:03:37.792333004 +0000 UTC m=+0.065302649 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:03:37 np0005532585.localdomain dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 1 addresses
Nov 23 10:03:37 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host
Nov 23 10:03:37 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts
Nov 23 10:03:37 np0005532585.localdomain sshd[329062]: Invalid user default from 182.78.111.34 port 45132
Nov 23 10:03:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e155 e155: 6 total, 6 up, 6 in
Nov 23 10:03:38 np0005532585.localdomain sshd[329259]: Invalid user alex from 207.154.194.2 port 60702
Nov 23 10:03:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:38.037 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:35Z, description=, device_id=31c6ba21-90e0-41d9-88c7-cee3c38bbce7, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c89fa6a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c89fa160>], id=2221bad8-4bd8-4391-a72d-5c149805734c, ip_allocation=immediate, mac_address=fa:16:3e:ff:95:cf, name=tempest-RoutersIpV6Test-1138851014, network_id=4a43af34-e77a-4d75-9a08-a9727e7ca345, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['57d92d06-0a9a-469b-b69f-4fb9e6e560cf'], standard_attr_id=2309, status=ACTIVE, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:37Z on network 4a43af34-e77a-4d75-9a08-a9727e7ca345
Nov 23 10:03:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:38.053 263258 INFO neutron.agent.dhcp.agent [None req-7baaf0bf-e893-4af3-83fe-d22af8afb311 - - - - - -] DHCP configuration for ports {'2221bad8-4bd8-4391-a72d-5c149805734c'} is completed
Nov 23 10:03:38 np0005532585.localdomain sshd[329259]: Received disconnect from 207.154.194.2 port 60702:11: Bye Bye [preauth]
Nov 23 10:03:38 np0005532585.localdomain sshd[329259]: Disconnected from invalid user alex 207.154.194.2 port 60702 [preauth]
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "snap_name": "019c4f4e-7af3-4cd6-b0e3-b6cacc762334_c6b7c25b-ee07-417d-94a7-231549eff0ad", "force": true, "format": "json"}]: dispatch
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "snap_name": "019c4f4e-7af3-4cd6-b0e3-b6cacc762334", "force": true, "format": "json"}]: dispatch
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: osdmap e154: 6 total, 6 up, 6 in
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/327806816' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/327806816' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: pgmap v317: 177 pgs: 177 active+clean; 284 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 5.0 MiB/s rd, 4.9 MiB/s wr, 190 op/s
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: osdmap e155: 6 total, 6 up, 6 in
Nov 23 10:03:38 np0005532585.localdomain dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 1 addresses
Nov 23 10:03:38 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host
Nov 23 10:03:38 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts
Nov 23 10:03:38 np0005532585.localdomain podman[329324]: 2025-11-23 10:03:38.233433011 +0000 UTC m=+0.062024239 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:03:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:38 np0005532585.localdomain sshd[329062]: Connection closed by invalid user default 182.78.111.34 port 45132 [preauth]
Nov 23 10:03:38 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:38.867 263258 INFO neutron.agent.dhcp.agent [None req-79d7e5b3-beab-4433-9727-cba888515eb6 - - - - - -] DHCP configuration for ports {'2221bad8-4bd8-4391-a72d-5c149805734c'} is completed
Nov 23 10:03:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e156 e156: 6 total, 6 up, 6 in
Nov 23 10:03:39 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:39.221 2 INFO neutron.agent.securitygroups_rpc [None req-9d87b2e2-86da-485b-bbea-cfc6692748e1 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:03:39 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1869128489' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:03:39 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1869128489' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:39 np0005532585.localdomain sudo[329345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:03:39 np0005532585.localdomain sudo[329345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:03:39 np0005532585.localdomain sudo[329345]: pam_unix(sudo:session): session closed for user root
Nov 23 10:03:39 np0005532585.localdomain sudo[329363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:03:39 np0005532585.localdomain sudo[329363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:03:39 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:39.949 2 INFO neutron.agent.securitygroups_rpc [None req-0cc7d2c8-7386-4953-9ec2-42e302a377e6 a8a12d646f734219a5736bd9a89106d3 cd27ceae55c44d478998092e7554fd8a - - default default] Security group member updated ['57d92d06-0a9a-469b-b69f-4fb9e6e560cf']
Nov 23 10:03:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e157 e157: 6 total, 6 up, 6 in
Nov 23 10:03:40 np0005532585.localdomain ceph-mon[300199]: osdmap e156: 6 total, 6 up, 6 in
Nov 23 10:03:40 np0005532585.localdomain ceph-mon[300199]: pgmap v320: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 175 KiB/s rd, 24 KiB/s wr, 243 op/s
Nov 23 10:03:40 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1869128489' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:40 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1869128489' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:40 np0005532585.localdomain dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 0 addresses
Nov 23 10:03:40 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host
Nov 23 10:03:40 np0005532585.localdomain dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts
Nov 23 10:03:40 np0005532585.localdomain podman[329410]: 2025-11-23 10:03:40.274425933 +0000 UTC m=+0.051541564 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:03:40 np0005532585.localdomain kernel: device tapf441590c-f0 left promiscuous mode
Nov 23 10:03:40 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:40Z|00457|binding|INFO|Releasing lport f441590c-f097-4418-b5c6-fc684c4c990b from this chassis (sb_readonly=0)
Nov 23 10:03:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:40.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:40 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:40Z|00458|binding|INFO|Setting lport f441590c-f097-4418-b5c6-fc684c4c990b down in Southbound
Nov 23 10:03:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:40.479 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe156e5f-7616-4580-b024-66548b9558a4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f441590c-f097-4418-b5c6-fc684c4c990b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:40.481 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f441590c-f097-4418-b5c6-fc684c4c990b in datapath 4a43af34-e77a-4d75-9a08-a9727e7ca345 unbound from our chassis
Nov 23 10:03:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:40.483 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4a43af34-e77a-4d75-9a08-a9727e7ca345 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:40 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:40.483 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9422df5d-2900-437f-9cd1-4ccf7d5b04fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:40.485 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:40 np0005532585.localdomain sudo[329363]: pam_unix(sudo:session): session closed for user root
Nov 23 10:03:40 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:40.590 2 INFO neutron.agent.securitygroups_rpc [None req-bd23e853-4daf-4921-98db-bb97f636a505 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:40 np0005532585.localdomain sudo[329453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:03:40 np0005532585.localdomain sudo[329453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:03:40 np0005532585.localdomain sudo[329453]: pam_unix(sudo:session): session closed for user root
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: osdmap e157: 6 total, 6 up, 6 in
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "format": "json"}]: dispatch
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cafe6b35-cbcb-405c-8bdc-6e0c35312d23", "force": true, "format": "json"}]: dispatch
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:03:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:41.217 2 INFO neutron.agent.securitygroups_rpc [None req-8d10cea8-bd4a-4441-b879-9913f6e3c03c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e158 e158: 6 total, 6 up, 6 in
Nov 23 10:03:41 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:41.712 2 INFO neutron.agent.securitygroups_rpc [None req-99b4e496-02de-4948-b003-eb8832f49bd1 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']
Nov 23 10:03:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:03:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:03:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:03:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159318 "" "Go-http-client/1.1"
Nov 23 10:03:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:03:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20209 "" "Go-http-client/1.1"
Nov 23 10:03:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:42.036 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:42 np0005532585.localdomain ceph-mon[300199]: pgmap v322: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 177 KiB/s rd, 25 KiB/s wr, 245 op/s
Nov 23 10:03:42 np0005532585.localdomain ceph-mon[300199]: osdmap e158: 6 total, 6 up, 6 in
Nov 23 10:03:42 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:42.185 2 INFO neutron.agent.securitygroups_rpc [None req-dbf580d8-52e8-4ecb-9364-932e14668854 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group rule updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']
Nov 23 10:03:42 np0005532585.localdomain dnsmasq[329187]: exiting on receipt of SIGTERM
Nov 23 10:03:42 np0005532585.localdomain systemd[1]: libpod-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac.scope: Deactivated successfully.
Nov 23 10:03:42 np0005532585.localdomain podman[329487]: 2025-11-23 10:03:42.206290657 +0000 UTC m=+0.059707360 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:03:42 np0005532585.localdomain podman[329500]: 2025-11-23 10:03:42.276875414 +0000 UTC m=+0.059370821 container died 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:03:42 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:42 np0005532585.localdomain podman[329500]: 2025-11-23 10:03:42.316077855 +0000 UTC m=+0.098573222 container cleanup 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:03:42 np0005532585.localdomain systemd[1]: libpod-conmon-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac.scope: Deactivated successfully.
Nov 23 10:03:42 np0005532585.localdomain podman[329502]: 2025-11-23 10:03:42.364596086 +0000 UTC m=+0.137386280 container remove 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:42.432 263258 INFO neutron.agent.dhcp.agent [None req-a5e4dc26-30c2-4e8a-904b-c9192946c94b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:42.601 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:42 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:42.748 2 INFO neutron.agent.securitygroups_rpc [None req-d8c6f897-4254-46a6-9c9c-683b3a672b23 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group rule updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']
Nov 23 10:03:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:42Z|00459|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:42.788 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e159 e159: 6 total, 6 up, 6 in
Nov 23 10:03:43 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-efc00143707c6c7059c45fc469c5c19df6826f798c48ffab4fae84399025aa28-merged.mount: Deactivated successfully.
Nov 23 10:03:43 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d4a43af34\x2de77a\x2d4d75\x2d9a08\x2da9727e7ca345.mount: Deactivated successfully.
Nov 23 10:03:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:43 np0005532585.localdomain ceph-mon[300199]: mgrmap e46: np0005532584.naxwxy(active, since 8m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:03:43 np0005532585.localdomain ceph-mon[300199]: osdmap e159: 6 total, 6 up, 6 in
Nov 23 10:03:43 np0005532585.localdomain ceph-mon[300199]: pgmap v325: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 170 KiB/s rd, 24 KiB/s wr, 235 op/s
Nov 23 10:03:43 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:03:43 np0005532585.localdomain dnsmasq[326522]: exiting on receipt of SIGTERM
Nov 23 10:03:43 np0005532585.localdomain podman[329547]: 2025-11-23 10:03:43.910326568 +0000 UTC m=+0.061224046 container kill 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:03:43 np0005532585.localdomain systemd[1]: libpod-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf.scope: Deactivated successfully.
Nov 23 10:03:43 np0005532585.localdomain podman[329561]: 2025-11-23 10:03:43.974552382 +0000 UTC m=+0.049898024 container died 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:03:44 np0005532585.localdomain podman[329561]: 2025-11-23 10:03:44.012586077 +0000 UTC m=+0.087931679 container cleanup 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:44 np0005532585.localdomain systemd[1]: libpod-conmon-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf.scope: Deactivated successfully.
Nov 23 10:03:44 np0005532585.localdomain podman[329563]: 2025-11-23 10:03:44.035507598 +0000 UTC m=+0.102156418 container remove 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:03:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:44Z|00460|binding|INFO|Releasing lport 6f155903-a394-40bc-9c4e-04010e974788 from this chassis (sb_readonly=0)
Nov 23 10:03:44 np0005532585.localdomain kernel: device tap6f155903-a3 left promiscuous mode
Nov 23 10:03:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:44.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:44Z|00461|binding|INFO|Setting lport 6f155903-a394-40bc-9c4e-04010e974788 down in Southbound
Nov 23 10:03:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:44.104 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d782672f-ba9a-4b1f-9286-2b53b24a21c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=6f155903-a394-40bc-9c4e-04010e974788) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:44.105 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6f155903-a394-40bc-9c4e-04010e974788 in datapath 725f5f75-c3ef-4a36-ba95-e1cd3131878c unbound from our chassis
Nov 23 10:03:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:44.106 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 725f5f75-c3ef-4a36-ba95-e1cd3131878c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:44.107 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed131f6-8928-4b92-828d-53e93133af7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:44.110 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6b5117719bced5fdac21fc3bb297cfd580c490322e10e374d90cdd191e13b6d1-merged.mount: Deactivated successfully.
Nov 23 10:03:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:44 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:44.289 2 INFO neutron.agent.securitygroups_rpc [None req-71dcb7fc-4cba-40db-b8c8-6bad4f6af9d0 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:44 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d725f5f75\x2dc3ef\x2d4a36\x2dba95\x2de1cd3131878c.mount: Deactivated successfully.
Nov 23 10:03:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:44.401 263258 INFO neutron.agent.dhcp.agent [None req-1f63856d-d247-4b91-86d2-83fbcb612c6e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:44.402 263258 INFO neutron.agent.dhcp.agent [None req-1f63856d-d247-4b91-86d2-83fbcb612c6e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:44 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:44.510 2 INFO neutron.agent.securitygroups_rpc [None req-71dcb7fc-4cba-40db-b8c8-6bad4f6af9d0 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:44 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:44.633 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:44Z|00462|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:44.935 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:45 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1260994540' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:45 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1260994540' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:45 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:45.128 2 INFO neutron.agent.securitygroups_rpc [None req-cef19edc-dbb5-4bcd-8945-0c2ead165d91 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:45.150 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:45.633 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:45.635 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:03:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:45.635 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:46 np0005532585.localdomain ceph-mon[300199]: pgmap v326: 177 pgs: 177 active+clean; 145 MiB data, 877 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 11 KiB/s wr, 73 op/s
Nov 23 10:03:46 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:46.279 2 INFO neutron.agent.securitygroups_rpc [None req-b1ac9adf-2928-4e8c-b93d-9a7afa468620 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:47.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:47 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:47.439 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:47 np0005532585.localdomain ceph-mon[300199]: pgmap v327: 177 pgs: 177 active+clean; 145 MiB data, 877 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 9.3 KiB/s wr, 59 op/s
Nov 23 10:03:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:03:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:03:47 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:03:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e160 e160: 6 total, 6 up, 6 in
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: tmp-crun.bMhEoW.mount: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain podman[329593]: 2025-11-23 10:03:48.064938357 +0000 UTC m=+0.109134238 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:03:48 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:48.066 263258 INFO neutron.agent.linux.ip_lib [None req-8a5fbce6-c093-4db1-ae3f-ba673c83a5a6 - - - - - -] Device tapf2565742-27 cannot be used as it has no MAC address
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.094 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:48 np0005532585.localdomain podman[329593]: 2025-11-23 10:03:48.102393575 +0000 UTC m=+0.146589416 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:03:48 np0005532585.localdomain kernel: device tapf2565742-27 entered promiscuous mode
Nov 23 10:03:48 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892228.1064] manager: (tapf2565742-27): new Generic device (/org/freedesktop/NetworkManager/Devices/74)
Nov 23 10:03:48 np0005532585.localdomain podman[329594]: 2025-11-23 10:03:48.110088237 +0000 UTC m=+0.149839525 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 10:03:48 np0005532585.localdomain systemd-udevd[329645]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:48 np0005532585.localdomain podman[329594]: 2025-11-23 10:03:48.141610157 +0000 UTC m=+0.181361445 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Nov 23 10:03:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:48Z|00463|binding|INFO|Claiming lport f2565742-2705-427d-b837-b10dc9f90604 for this chassis.
Nov 23 10:03:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:48Z|00464|binding|INFO|f2565742-2705-427d-b837-b10dc9f90604: Claiming unknown
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.143 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:48Z|00465|binding|INFO|Setting lport f2565742-2705-427d-b837-b10dc9f90604 ovn-installed in OVS
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:48Z|00466|binding|INFO|Setting lport f2565742-2705-427d-b837-b10dc9f90604 up in Southbound
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.199 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d5704c4-8ab8-49b3-a1d9-b5f0c2d3d763, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f2565742-2705-427d-b837-b10dc9f90604) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.201 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f2565742-2705-427d-b837-b10dc9f90604 in datapath 44ad0888-d340-45c9-a658-e70067183c3d bound to our chassis
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.203 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 44ad0888-d340-45c9-a658-e70067183c3d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:48 np0005532585.localdomain podman[329592]: 2025-11-23 10:03:48.153424722 +0000 UTC m=+0.202874862 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.204 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[494c9323-26fd-4a4f-b900-127fe9a24f99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.219 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:48 np0005532585.localdomain podman[329592]: 2025-11-23 10:03:48.237465024 +0000 UTC m=+0.286915194 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:48 np0005532585.localdomain dnsmasq[325708]: exiting on receipt of SIGTERM
Nov 23 10:03:48 np0005532585.localdomain podman[329710]: 2025-11-23 10:03:48.761962064 +0000 UTC m=+0.059677579 container kill 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: libpod-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49.scope: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain podman[329722]: 2025-11-23 10:03:48.835243691 +0000 UTC m=+0.055551184 container died 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:03:48 np0005532585.localdomain podman[329722]: 2025-11-23 10:03:48.866859384 +0000 UTC m=+0.087166827 container cleanup 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: libpod-conmon-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49.scope: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain podman[329724]: 2025-11-23 10:03:48.915676584 +0000 UTC m=+0.129619876 container remove 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:48Z|00467|binding|INFO|Releasing lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 from this chassis (sb_readonly=0)
Nov 23 10:03:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:48Z|00468|binding|INFO|Setting lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 down in Southbound
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.945 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:48 np0005532585.localdomain kernel: device tape9b240b4-dd left promiscuous mode
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.953 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87fb399b-8c32-4da7-b979-46b50b5b7dd8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=e9b240b4-dda7-48fc-a63a-d3fd91217a97) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.955 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e9b240b4-dda7-48fc-a63a-d3fd91217a97 in datapath 319f7ca3-1c18-4436-8178-bfc17a98eb45 unbound from our chassis
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: tmp-crun.nabist.mount: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0ed3cccae87a2cec8679f8b781e7106fb3edbfdb329ca120981a36091e9dbff3-merged.mount: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.964 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 319f7ca3-1c18-4436-8178-bfc17a98eb45 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:48 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:48.965 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b26ed10d-1c7e-4666-bb39-5b0393b5c4e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:48.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:49 np0005532585.localdomain ceph-mon[300199]: osdmap e160: 6 total, 6 up, 6 in
Nov 23 10:03:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/516761612' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:03:49 np0005532585.localdomain podman[329776]: 
Nov 23 10:03:49 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d319f7ca3\x2d1c18\x2d4436\x2d8178\x2dbfc17a98eb45.mount: Deactivated successfully.
Nov 23 10:03:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:49.381 263258 INFO neutron.agent.dhcp.agent [None req-bfb1c85d-6963-4c65-8d70-8c0b1bd74c3c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:49 np0005532585.localdomain podman[329776]: 2025-11-23 10:03:49.385830576 +0000 UTC m=+0.104837188 container create 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:49 np0005532585.localdomain podman[329776]: 2025-11-23 10:03:49.331534951 +0000 UTC m=+0.050541603 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:49 np0005532585.localdomain systemd[1]: Started libpod-conmon-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac.scope.
Nov 23 10:03:49 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:49 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc09d36762312268b4e2ea330956ffbfc2d1ae0c044e08cc5ab54ff63b2faae9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:49 np0005532585.localdomain podman[329776]: 2025-11-23 10:03:49.478942231 +0000 UTC m=+0.197948833 container init 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:03:49 np0005532585.localdomain podman[329776]: 2025-11-23 10:03:49.4878374 +0000 UTC m=+0.206844012 container start 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:49 np0005532585.localdomain dnsmasq[329794]: started, version 2.85 cachesize 150
Nov 23 10:03:49 np0005532585.localdomain dnsmasq[329794]: DNS service limited to local subnets
Nov 23 10:03:49 np0005532585.localdomain dnsmasq[329794]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:49 np0005532585.localdomain dnsmasq[329794]: warning: no upstream servers configured
Nov 23 10:03:49 np0005532585.localdomain dnsmasq-dhcp[329794]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:49 np0005532585.localdomain dnsmasq[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/addn_hosts - 0 addresses
Nov 23 10:03:49 np0005532585.localdomain dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/host
Nov 23 10:03:49 np0005532585.localdomain dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/opts
Nov 23 10:03:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:49.674 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:49.722 263258 INFO neutron.agent.dhcp.agent [None req-5af237d5-3cb4-45aa-b223-d0feb72d77cd - - - - - -] DHCP configuration for ports {'0e48174f-0d37-4bdd-a9a9-fdda16cdb82b'} is completed
Nov 23 10:03:50 np0005532585.localdomain ceph-mon[300199]: pgmap v329: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 11 KiB/s wr, 57 op/s
Nov 23 10:03:50 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:50.227 2 INFO neutron.agent.securitygroups_rpc [req-dc022d26-398e-4427-8b9e-d6e32e3174fc req-12998a72-36e8-4adc-96fc-04c6618198f0 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group member updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']
Nov 23 10:03:50 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:50.319 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:03:50 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:50Z|00469|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:50.909 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:51Z|00470|binding|INFO|Removing iface tapf2565742-27 ovn-installed in OVS
Nov 23 10:03:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:51Z|00471|binding|INFO|Removing lport f2565742-2705-427d-b837-b10dc9f90604 ovn-installed in OVS
Nov 23 10:03:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:51.259 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0727ed07-70eb-4362-adaa-42c5e2a55093 with type ""
Nov 23 10:03:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:51.261 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d5704c4-8ab8-49b3-a1d9-b5f0c2d3d763, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=f2565742-2705-427d-b837-b10dc9f90604) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:51.261 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:51.264 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f2565742-2705-427d-b837-b10dc9f90604 in datapath 44ad0888-d340-45c9-a658-e70067183c3d unbound from our chassis
Nov 23 10:03:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:51.267 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44ad0888-d340-45c9-a658-e70067183c3d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:03:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:51.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:51.268 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cedbeedb-43a3-4f90-96e3-04a292e701e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:51.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:51 np0005532585.localdomain kernel: device tapf2565742-27 left promiscuous mode
Nov 23 10:03:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:51.293 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:51 np0005532585.localdomain ceph-mon[300199]: pgmap v330: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 10 KiB/s wr, 52 op/s
Nov 23 10:03:51 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:51.637 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:03:51 np0005532585.localdomain dnsmasq[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/addn_hosts - 0 addresses
Nov 23 10:03:51 np0005532585.localdomain dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/host
Nov 23 10:03:51 np0005532585.localdomain podman[329814]: 2025-11-23 10:03:51.651214418 +0000 UTC m=+0.061945387 container kill 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:51 np0005532585.localdomain dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/opts
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent [None req-e340167e-5dcb-4505-a2f4-712bb4bf6d40 - - - - - -] Unable to reload_allocations dhcp for 44ad0888-d340-45c9-a658-e70067183c3d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf2565742-27 not found in namespace qdhcp-44ad0888-d340-45c9-a658-e70067183c3d.
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf2565742-27 not found in namespace qdhcp-44ad0888-d340-45c9-a658-e70067183c3d.
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent 
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.680 263258 INFO neutron.agent.dhcp.agent [None req-116d79b7-3d46-461e-94b5-c6eecda7a59c - - - - - -] Synchronizing state
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.797 263258 INFO neutron.agent.dhcp.agent [None req-37a3c96b-351c-4de9-96d3-7234122ba6a5 - - - - - -] All active networks have been fetched through RPC.
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.797 263258 INFO neutron.agent.dhcp.agent [-] Starting network 44ad0888-d340-45c9-a658-e70067183c3d dhcp configuration
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.798 263258 INFO neutron.agent.dhcp.agent [-] Finished network 44ad0888-d340-45c9-a658-e70067183c3d dhcp configuration
Nov 23 10:03:51 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.799 263258 INFO neutron.agent.dhcp.agent [None req-37a3c96b-351c-4de9-96d3-7234122ba6a5 - - - - - -] Synchronizing state complete
Nov 23 10:03:51 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:51Z|00472|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:03:51 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:51.951 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:52 np0005532585.localdomain dnsmasq[329794]: exiting on receipt of SIGTERM
Nov 23 10:03:52 np0005532585.localdomain podman[329844]: 2025-11-23 10:03:52.055475475 +0000 UTC m=+0.067440763 container kill 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:52 np0005532585.localdomain systemd[1]: tmp-crun.8NWlCd.mount: Deactivated successfully.
Nov 23 10:03:52 np0005532585.localdomain systemd[1]: libpod-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac.scope: Deactivated successfully.
Nov 23 10:03:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:52.080 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:52 np0005532585.localdomain podman[329857]: 2025-11-23 10:03:52.1286995 +0000 UTC m=+0.059905595 container died 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:52 np0005532585.localdomain podman[329857]: 2025-11-23 10:03:52.159967682 +0000 UTC m=+0.091173737 container cleanup 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:52 np0005532585.localdomain systemd[1]: libpod-conmon-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac.scope: Deactivated successfully.
Nov 23 10:03:52 np0005532585.localdomain podman[329859]: 2025-11-23 10:03:52.202621448 +0000 UTC m=+0.127439551 container remove 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-bc09d36762312268b4e2ea330956ffbfc2d1ae0c044e08cc5ab54ff63b2faae9-merged.mount: Deactivated successfully.
Nov 23 10:03:52 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac-userdata-shm.mount: Deactivated successfully.
Nov 23 10:03:52 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d44ad0888\x2dd340\x2d45c9\x2da658\x2de70067183c3d.mount: Deactivated successfully.
Nov 23 10:03:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:52.938 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:03:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:52.959 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 23 10:03:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:52.960 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:03:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:52.961 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:03:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:52.989 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:03:52 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:52.993 2 INFO neutron.agent.securitygroups_rpc [None req-e884606d-3955-464b-8443-536f305941fb 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e161 e161: 6 total, 6 up, 6 in
Nov 23 10:03:54 np0005532585.localdomain ceph-mon[300199]: pgmap v331: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 8.6 KiB/s wr, 44 op/s
Nov 23 10:03:54 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:54.192 2 INFO neutron.agent.securitygroups_rpc [None req-5a886eea-af54-4cb9-a980-5c3836eff3f1 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:03:54 np0005532585.localdomain snmpd[67457]: empty variable list in _query
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: osdmap e161: 6 total, 6 up, 6 in
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/95359321' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.076310) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235076338, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1464, "num_deletes": 262, "total_data_size": 2778890, "memory_usage": 2889520, "flush_reason": "Manual Compaction"}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235087021, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1826635, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25519, "largest_seqno": 26977, "table_properties": {"data_size": 1820430, "index_size": 3419, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14676, "raw_average_key_size": 21, "raw_value_size": 1807542, "raw_average_value_size": 2669, "num_data_blocks": 148, "num_entries": 677, "num_filter_entries": 677, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892168, "oldest_key_time": 1763892168, "file_creation_time": 1763892235, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10761 microseconds, and 4983 cpu microseconds.
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.087065) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1826635 bytes OK
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.087088) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.089949) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.089971) EVENT_LOG_v1 {"time_micros": 1763892235089964, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.089992) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2771728, prev total WAL file size 2771728, number of live WAL files 2.
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.090761) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1783KB)], [45(14MB)]
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235090834, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 16754327, "oldest_snapshot_seqno": -1}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12589 keys, 15502277 bytes, temperature: kUnknown
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235165441, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 15502277, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15433424, "index_size": 36304, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 340950, "raw_average_key_size": 27, "raw_value_size": 15221571, "raw_average_value_size": 1209, "num_data_blocks": 1345, "num_entries": 12589, "num_filter_entries": 12589, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892235, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.166171) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 15502277 bytes
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.167921) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 224.2 rd, 207.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 14.2 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(17.7) write-amplify(8.5) OK, records in: 13128, records dropped: 539 output_compression: NoCompression
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.167952) EVENT_LOG_v1 {"time_micros": 1763892235167937, "job": 26, "event": "compaction_finished", "compaction_time_micros": 74722, "compaction_time_cpu_micros": 45616, "output_level": 6, "num_output_files": 1, "total_output_size": 15502277, "num_input_records": 13128, "num_output_records": 12589, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235168383, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235170568, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.090636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170612) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:03:55 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:03:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4204955573' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:03:56 np0005532585.localdomain ceph-mon[300199]: pgmap v333: 177 pgs: 177 active+clean; 192 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 2.7 MiB/s wr, 52 op/s
Nov 23 10:03:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e162 e162: 6 total, 6 up, 6 in
Nov 23 10:03:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:03:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:03:57 np0005532585.localdomain podman[329888]: 2025-11-23 10:03:57.040571903 +0000 UTC m=+0.099357964 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd)
Nov 23 10:03:57 np0005532585.localdomain podman[329888]: 2025-11-23 10:03:57.053511132 +0000 UTC m=+0.112297173 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3)
Nov 23 10:03:57 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.082 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.127 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.128 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:03:57 np0005532585.localdomain ceph-mon[300199]: osdmap e162: 6 total, 6 up, 6 in
Nov 23 10:03:57 np0005532585.localdomain podman[329889]: 2025-11-23 10:03:57.164722233 +0000 UTC m=+0.215743980 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:03:57 np0005532585.localdomain podman[329889]: 2025-11-23 10:03:57.175949381 +0000 UTC m=+0.226971138 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:03:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:57.180 263258 INFO neutron.agent.linux.ip_lib [None req-55eeec42-bcb3-4134-8456-e7be7be94fe9 - - - - - -] Device tap305095bc-16 cannot be used as it has no MAC address
Nov 23 10:03:57 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.201 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:57 np0005532585.localdomain kernel: device tap305095bc-16 entered promiscuous mode
Nov 23 10:03:57 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892237.2095] manager: (tap305095bc-16): new Generic device (/org/freedesktop/NetworkManager/Devices/75)
Nov 23 10:03:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:57Z|00473|binding|INFO|Claiming lport 305095bc-169f-4019-8019-b335745719a8 for this chassis.
Nov 23 10:03:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:57Z|00474|binding|INFO|305095bc-169f-4019-8019-b335745719a8: Claiming unknown
Nov 23 10:03:57 np0005532585.localdomain systemd-udevd[329940]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:57.226 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '807b835f4cc944269d2f71f8e519b08a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7004edea-7321-4a7b-bb91-bf959c0155ab, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=305095bc-169f-4019-8019-b335745719a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:03:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:57.228 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 305095bc-169f-4019-8019-b335745719a8 in datapath fbb2f473-9d45-472b-acf8-1ed5f2c6e75a bound to our chassis
Nov 23 10:03:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:57.230 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fbb2f473-9d45-472b-acf8-1ed5f2c6e75a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:03:57 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:03:57.230 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2886fe62-bac5-4cf9-abf4-82db8087bc4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:57Z|00475|binding|INFO|Setting lport 305095bc-169f-4019-8019-b335745719a8 ovn-installed in OVS
Nov 23 10:03:57 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:03:57Z|00476|binding|INFO|Setting lport 305095bc-169f-4019-8019-b335745719a8 up in Southbound
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.242 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap305095bc-16: No such device
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.274 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:57.306 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1518047932' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:03:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1518047932' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:03:58 np0005532585.localdomain ceph-mon[300199]: pgmap v335: 177 pgs: 177 active+clean; 192 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 2.7 MiB/s wr, 52 op/s
Nov 23 10:03:58 np0005532585.localdomain podman[330011]: 
Nov 23 10:03:58 np0005532585.localdomain podman[330011]: 2025-11-23 10:03:58.172708607 +0000 UTC m=+0.094082346 container create 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:58 np0005532585.localdomain systemd[1]: Started libpod-conmon-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e.scope.
Nov 23 10:03:58 np0005532585.localdomain podman[330011]: 2025-11-23 10:03:58.126539135 +0000 UTC m=+0.047912904 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:58 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:58 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7be32356762f3029209aca713f7a677b1c24e82a292b776689aed1ed5de000e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:58 np0005532585.localdomain podman[330011]: 2025-11-23 10:03:58.253612954 +0000 UTC m=+0.174986733 container init 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:03:58 np0005532585.localdomain podman[330011]: 2025-11-23 10:03:58.267675758 +0000 UTC m=+0.189049507 container start 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:03:58 np0005532585.localdomain dnsmasq[330031]: started, version 2.85 cachesize 150
Nov 23 10:03:58 np0005532585.localdomain dnsmasq[330031]: DNS service limited to local subnets
Nov 23 10:03:58 np0005532585.localdomain dnsmasq[330031]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:58 np0005532585.localdomain dnsmasq[330031]: warning: no upstream servers configured
Nov 23 10:03:58 np0005532585.localdomain dnsmasq-dhcp[330031]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:58 np0005532585.localdomain dnsmasq[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/addn_hosts - 0 addresses
Nov 23 10:03:58 np0005532585.localdomain dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/host
Nov 23 10:03:58 np0005532585.localdomain dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/opts
Nov 23 10:03:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:03:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:58.431 263258 INFO neutron.agent.dhcp.agent [None req-66b4c661-3b71-417d-a81d-c2551ccf6024 - - - - - -] DHCP configuration for ports {'1b6f9917-dcd8-46f2-841b-9a5f608eb356'} is completed
Nov 23 10:03:58 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:58.437 263258 INFO neutron.agent.linux.ip_lib [None req-e53debe7-7303-47bd-8cb4-60fae38a0ec2 - - - - - -] Device tape9fe654f-69 cannot be used as it has no MAC address
Nov 23 10:03:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:58.503 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:58 np0005532585.localdomain kernel: device tape9fe654f-69 entered promiscuous mode
Nov 23 10:03:58 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892238.5092] manager: (tape9fe654f-69): new Generic device (/org/freedesktop/NetworkManager/Devices/76)
Nov 23 10:03:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:58.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:58.519 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:58.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:58.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:03:58.620 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:03:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:03:59.106 2 INFO neutron.agent.securitygroups_rpc [None req-73f9f53a-edf4-45e5-a635-4120a726bffe f436a64c9a134831a0f528309f399f1d 807b835f4cc944269d2f71f8e519b08a - - default default] Security group member updated ['c2582f3e-b285-4f13-ba8c-38a0c5b47d8d']
Nov 23 10:03:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:59.161 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:58Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4dc40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4dbb0>], id=ef98dbbe-315b-4005-87bd-c9bb26809710, ip_allocation=immediate, mac_address=fa:16:3e:b3:2a:18, name=tempest-TagsExtTest-154991436, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:54Z, description=, dns_domain=, id=fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-852574996, port_security_enabled=True, project_id=807b835f4cc944269d2f71f8e519b08a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50479, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2406, status=ACTIVE, subnets=['9d3d60ab-4d93-4d58-a0ca-53c9dcb8d46d'], tags=[], tenant_id=807b835f4cc944269d2f71f8e519b08a, updated_at=2025-11-23T10:03:55Z, vlan_transparent=None, network_id=fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, port_security_enabled=True, project_id=807b835f4cc944269d2f71f8e519b08a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c2582f3e-b285-4f13-ba8c-38a0c5b47d8d'], standard_attr_id=2425, status=DOWN, tags=[], tenant_id=807b835f4cc944269d2f71f8e519b08a, updated_at=2025-11-23T10:03:58Z on network fbb2f473-9d45-472b-acf8-1ed5f2c6e75a
Nov 23 10:03:59 np0005532585.localdomain systemd[1]: tmp-crun.yCmdzb.mount: Deactivated successfully.
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/addn_hosts - 1 addresses
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/host
Nov 23 10:03:59 np0005532585.localdomain podman[330095]: 2025-11-23 10:03:59.436688162 +0000 UTC m=+0.106392326 container kill 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/opts
Nov 23 10:03:59 np0005532585.localdomain podman[330123]: 
Nov 23 10:03:59 np0005532585.localdomain podman[330123]: 2025-11-23 10:03:59.497722591 +0000 UTC m=+0.079219758 container create cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:59 np0005532585.localdomain ceph-mon[300199]: pgmap v336: 177 pgs: 177 active+clean; 192 MiB data, 957 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.7 MiB/s wr, 176 op/s
Nov 23 10:03:59 np0005532585.localdomain systemd[1]: Started libpod-conmon-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14.scope.
Nov 23 10:03:59 np0005532585.localdomain podman[330123]: 2025-11-23 10:03:59.4615225 +0000 UTC m=+0.043019757 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:03:59 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:03:59 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db53714652613f07480fa88db9328a7f78acd740bb6a93da8aece7d20ced86cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:03:59 np0005532585.localdomain podman[330123]: 2025-11-23 10:03:59.587641679 +0000 UTC m=+0.169138886 container init cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:03:59 np0005532585.localdomain podman[330123]: 2025-11-23 10:03:59.596233728 +0000 UTC m=+0.177730925 container start cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330150]: started, version 2.85 cachesize 150
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330150]: DNS service limited to local subnets
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330150]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330150]: warning: no upstream servers configured
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330150]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/addn_hosts - 0 addresses
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/host
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/opts
Nov 23 10:03:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:59.720 263258 INFO neutron.agent.dhcp.agent [None req-9ea3a00e-253c-40d9-9521-ad2df9959247 - - - - - -] DHCP configuration for ports {'ef98dbbe-315b-4005-87bd-c9bb26809710'} is completed
Nov 23 10:03:59 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:03:59.888 263258 INFO neutron.agent.dhcp.agent [None req-a689ed5b-aeb8-4883-8959-019e8e9b5fea - - - - - -] DHCP configuration for ports {'29220730-8a13-4870-ad1d-d4b8339b0131'} is completed
Nov 23 10:03:59 np0005532585.localdomain dnsmasq[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/addn_hosts - 0 addresses
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/host
Nov 23 10:03:59 np0005532585.localdomain dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/opts
Nov 23 10:03:59 np0005532585.localdomain podman[330167]: 2025-11-23 10:03:59.937763836 +0000 UTC m=+0.056105771 container kill cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:03:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:03:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:04:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:00.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:00 np0005532585.localdomain kernel: device tape9fe654f-69 left promiscuous mode
Nov 23 10:04:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:00.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:04:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2561672654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:04:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2561672654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:00 np0005532585.localdomain podman[330209]: 2025-11-23 10:04:00.490809155 +0000 UTC m=+0.058356909 container kill cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:04:00 np0005532585.localdomain dnsmasq[330150]: exiting on receipt of SIGTERM
Nov 23 10:04:00 np0005532585.localdomain systemd[1]: libpod-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14.scope: Deactivated successfully.
Nov 23 10:04:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2561672654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2561672654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:00 np0005532585.localdomain podman[330221]: 2025-11-23 10:04:00.609715677 +0000 UTC m=+0.107360555 container died cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:04:00 np0005532585.localdomain podman[330221]: 2025-11-23 10:04:00.642609968 +0000 UTC m=+0.140254826 container cleanup cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:04:00 np0005532585.localdomain systemd[1]: libpod-conmon-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14.scope: Deactivated successfully.
Nov 23 10:04:00 np0005532585.localdomain podman[330223]: 2025-11-23 10:04:00.665887219 +0000 UTC m=+0.153138614 container remove cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:04:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-db53714652613f07480fa88db9328a7f78acd740bb6a93da8aece7d20ced86cf-merged.mount: Deactivated successfully.
Nov 23 10:04:01 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:01 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d9b96e7fb\x2d0af6\x2d422a\x2d9328\x2d26ea617f94f5.mount: Deactivated successfully.
Nov 23 10:04:01 np0005532585.localdomain ceph-mon[300199]: pgmap v337: 177 pgs: 177 active+clean; 192 MiB data, 957 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.7 MiB/s wr, 176 op/s
Nov 23 10:04:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:02.174 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 e163: 6 total, 6 up, 6 in
Nov 23 10:04:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:04:04 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:04.015 263258 INFO neutron.agent.linux.ip_lib [None req-5980f272-a008-4562-9c6f-00c268543df4 - - - - - -] Device tap9d23bf51-48 cannot be used as it has no MAC address
Nov 23 10:04:04 np0005532585.localdomain ceph-mon[300199]: osdmap e163: 6 total, 6 up, 6 in
Nov 23 10:04:04 np0005532585.localdomain ceph-mon[300199]: pgmap v339: 177 pgs: 177 active+clean; 192 MiB data, 957 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 23 KiB/s wr, 124 op/s
Nov 23 10:04:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:04.079 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:04 np0005532585.localdomain kernel: device tap9d23bf51-48 entered promiscuous mode
Nov 23 10:04:04 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892244.0848] manager: (tap9d23bf51-48): new Generic device (/org/freedesktop/NetworkManager/Devices/77)
Nov 23 10:04:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:04.086 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:04Z|00477|binding|INFO|Claiming lport 9d23bf51-4878-4815-a311-5305afd6c960 for this chassis.
Nov 23 10:04:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:04Z|00478|binding|INFO|9d23bf51-4878-4815-a311-5305afd6c960: Claiming unknown
Nov 23 10:04:04 np0005532585.localdomain systemd-udevd[330260]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:04.097 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e25b49f-2b5c-47dc-9e51-af7d0f597458, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=9d23bf51-4878-4815-a311-5305afd6c960) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:04.099 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9d23bf51-4878-4815-a311-5305afd6c960 in datapath fb46dcb8-1e03-4e90-b074-b34f166ad626 bound to our chassis
Nov 23 10:04:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:04.104 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fb46dcb8-1e03-4e90-b074-b34f166ad626 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:04 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:04.107 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6062012f-ffc1-4f1c-9dbb-c7c0fae5a847]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:04Z|00479|binding|INFO|Setting lport 9d23bf51-4878-4815-a311-5305afd6c960 ovn-installed in OVS
Nov 23 10:04:04 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:04Z|00480|binding|INFO|Setting lport 9d23bf51-4878-4815-a311-5305afd6c960 up in Southbound
Nov 23 10:04:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:04.121 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:04.123 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap9d23bf51-48: No such device
Nov 23 10:04:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:04.163 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:04.196 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:05 np0005532585.localdomain podman[330331]: 
Nov 23 10:04:05 np0005532585.localdomain podman[330331]: 2025-11-23 10:04:05.089037629 +0000 UTC m=+0.101081236 container create 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:04:05 np0005532585.localdomain systemd[1]: Started libpod-conmon-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a.scope.
Nov 23 10:04:05 np0005532585.localdomain podman[330331]: 2025-11-23 10:04:05.041981781 +0000 UTC m=+0.054025408 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:05 np0005532585.localdomain systemd[1]: tmp-crun.lEc4qv.mount: Deactivated successfully.
Nov 23 10:04:05 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:05 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ee271f1a827858027c6783c7ec8838583352e6408914449aa6a883f1c3a25e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:05 np0005532585.localdomain podman[330331]: 2025-11-23 10:04:05.190200786 +0000 UTC m=+0.202244413 container init 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:04:05 np0005532585.localdomain podman[330331]: 2025-11-23 10:04:05.199737893 +0000 UTC m=+0.211781500 container start 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:04:05 np0005532585.localdomain dnsmasq[330349]: started, version 2.85 cachesize 150
Nov 23 10:04:05 np0005532585.localdomain dnsmasq[330349]: DNS service limited to local subnets
Nov 23 10:04:05 np0005532585.localdomain dnsmasq[330349]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:05 np0005532585.localdomain dnsmasq[330349]: warning: no upstream servers configured
Nov 23 10:04:05 np0005532585.localdomain dnsmasq-dhcp[330349]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:04:05 np0005532585.localdomain dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 0 addresses
Nov 23 10:04:05 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host
Nov 23 10:04:05 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts
Nov 23 10:04:05 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:05.365 263258 INFO neutron.agent.dhcp.agent [None req-29820e85-86e1-4ebf-8ba9-57623ed7f6e5 - - - - - -] DHCP configuration for ports {'5fc5bd48-bf7c-4d0a-a432-46b8babd3b7a'} is completed
Nov 23 10:04:05 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:05.408 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:04Z, description=, device_id=58cf0f29-b2f0-4d2a-8c4c-099287c6e849, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c946abe0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c946ae80>], id=e189f8c0-3b95-4fe4-bad8-556b71c8aa3c, ip_allocation=immediate, mac_address=fa:16:3e:f5:ab:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:01Z, description=, dns_domain=, id=fb46dcb8-1e03-4e90-b074-b34f166ad626, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-240847515, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43865, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2449, status=ACTIVE, subnets=['6e136941-a1bc-4b07-ae85-8009aea12ffe'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:03Z, vlan_transparent=None, network_id=fb46dcb8-1e03-4e90-b074-b34f166ad626, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2455, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:04Z on network fb46dcb8-1e03-4e90-b074-b34f166ad626
Nov 23 10:04:05 np0005532585.localdomain ceph-mon[300199]: pgmap v340: 177 pgs: 177 active+clean; 192 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 20 KiB/s wr, 123 op/s
Nov 23 10:04:05 np0005532585.localdomain dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 1 addresses
Nov 23 10:04:05 np0005532585.localdomain podman[330369]: 2025-11-23 10:04:05.618493348 +0000 UTC m=+0.066016360 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:05 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host
Nov 23 10:04:05 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts
Nov 23 10:04:05 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:05.907 263258 INFO neutron.agent.dhcp.agent [None req-d74b055e-3da3-4a47-81e8-3a6f4da0a475 - - - - - -] DHCP configuration for ports {'e189f8c0-3b95-4fe4-bad8-556b71c8aa3c'} is completed
Nov 23 10:04:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:04:06 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:04:07 np0005532585.localdomain podman[330389]: 2025-11-23 10:04:07.031830551 +0000 UTC m=+0.084814195 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:04:07 np0005532585.localdomain podman[330389]: 2025-11-23 10:04:07.044641478 +0000 UTC m=+0.097625172 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:04:07 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:04:07 np0005532585.localdomain podman[330390]: 2025-11-23 10:04:07.142596498 +0000 UTC m=+0.193224871 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:04:07 np0005532585.localdomain podman[330390]: 2025-11-23 10:04:07.211158403 +0000 UTC m=+0.261786776 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:04:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:07.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:07 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:04:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:07.415 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:04Z, description=, device_id=58cf0f29-b2f0-4d2a-8c4c-099287c6e849, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3c9a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3c820>], id=e189f8c0-3b95-4fe4-bad8-556b71c8aa3c, ip_allocation=immediate, mac_address=fa:16:3e:f5:ab:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:01Z, description=, dns_domain=, id=fb46dcb8-1e03-4e90-b074-b34f166ad626, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-240847515, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43865, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2449, status=ACTIVE, subnets=['6e136941-a1bc-4b07-ae85-8009aea12ffe'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:03Z, vlan_transparent=None, network_id=fb46dcb8-1e03-4e90-b074-b34f166ad626, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2455, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:04Z on network fb46dcb8-1e03-4e90-b074-b34f166ad626
Nov 23 10:04:07 np0005532585.localdomain ceph-mon[300199]: pgmap v341: 177 pgs: 177 active+clean; 192 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 19 KiB/s wr, 115 op/s
Nov 23 10:04:07 np0005532585.localdomain dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 1 addresses
Nov 23 10:04:07 np0005532585.localdomain podman[330448]: 2025-11-23 10:04:07.620076871 +0000 UTC m=+0.061015189 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:04:07 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host
Nov 23 10:04:07 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts
Nov 23 10:04:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:07.669 2 INFO neutron.agent.securitygroups_rpc [None req-cd03e682-7688-4e93-ac2d-e601f5fc3971 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:07 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:07.894 263258 INFO neutron.agent.dhcp.agent [None req-cef8ef95-1676-4fbb-a0b7-14fe3e59f797 - - - - - -] DHCP configuration for ports {'e189f8c0-3b95-4fe4-bad8-556b71c8aa3c'} is completed
Nov 23 10:04:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:08.169 2 INFO neutron.agent.securitygroups_rpc [None req-3d60f928-a89d-481c-a25d-e6417d1d55cf 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:04:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:08.744 2 INFO neutron.agent.securitygroups_rpc [None req-4c8f3bc2-c43f-4c06-bcd4-666015157129 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:08.959 2 INFO neutron.agent.securitygroups_rpc [None req-fdb6f567-90f0-41d5-acb8-83f08adab1b1 f436a64c9a134831a0f528309f399f1d 807b835f4cc944269d2f71f8e519b08a - - default default] Security group member updated ['c2582f3e-b285-4f13-ba8c-38a0c5b47d8d']
Nov 23 10:04:09 np0005532585.localdomain dnsmasq[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/addn_hosts - 0 addresses
Nov 23 10:04:09 np0005532585.localdomain dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/host
Nov 23 10:04:09 np0005532585.localdomain podman[330485]: 2025-11-23 10:04:09.201250971 +0000 UTC m=+0.059537475 container kill 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:04:09 np0005532585.localdomain dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/opts
Nov 23 10:04:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:09.302 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:04:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:04:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:04:09 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:09.342 2 INFO neutron.agent.securitygroups_rpc [None req-b9ca6263-bc29-4379-89bf-449c3fc12e0d 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:09 np0005532585.localdomain ceph-mon[300199]: pgmap v342: 177 pgs: 177 active+clean; 213 MiB data, 973 MiB used, 41 GiB / 42 GiB avail; 693 KiB/s rd, 2.4 MiB/s wr, 50 op/s
Nov 23 10:04:09 np0005532585.localdomain dnsmasq[330031]: exiting on receipt of SIGTERM
Nov 23 10:04:09 np0005532585.localdomain podman[330522]: 2025-11-23 10:04:09.827365081 +0000 UTC m=+0.057261265 container kill 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:09 np0005532585.localdomain systemd[1]: libpod-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e.scope: Deactivated successfully.
Nov 23 10:04:09 np0005532585.localdomain podman[330534]: 2025-11-23 10:04:09.895861895 +0000 UTC m=+0.053811932 container died 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:04:09 np0005532585.localdomain podman[330534]: 2025-11-23 10:04:09.938062316 +0000 UTC m=+0.096012353 container cleanup 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:04:09 np0005532585.localdomain systemd[1]: libpod-conmon-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e.scope: Deactivated successfully.
Nov 23 10:04:09 np0005532585.localdomain podman[330536]: 2025-11-23 10:04:09.971867745 +0000 UTC m=+0.120478130 container remove 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:04:10 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:10Z|00481|binding|INFO|Releasing lport 305095bc-169f-4019-8019-b335745719a8 from this chassis (sb_readonly=0)
Nov 23 10:04:10 np0005532585.localdomain kernel: device tap305095bc-16 left promiscuous mode
Nov 23 10:04:10 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:10Z|00482|binding|INFO|Setting lport 305095bc-169f-4019-8019-b335745719a8 down in Southbound
Nov 23 10:04:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:10.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.026 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '807b835f4cc944269d2f71f8e519b08a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7004edea-7321-4a7b-bb91-bf959c0155ab, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=305095bc-169f-4019-8019-b335745719a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.029 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 305095bc-169f-4019-8019-b335745719a8 in datapath fbb2f473-9d45-472b-acf8-1ed5f2c6e75a unbound from our chassis
Nov 23 10:04:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:10.031 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.031 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:04:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:10.032 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.032 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8163c8c4-0e0a-489a-8723-4098f9ed5c33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:10.059 263258 INFO neutron.agent.dhcp.agent [None req-89cd6c38-4ad8-49af-832c-b992818f16bf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:10 np0005532585.localdomain dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 0 addresses
Nov 23 10:04:10 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host
Nov 23 10:04:10 np0005532585.localdomain dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts
Nov 23 10:04:10 np0005532585.localdomain podman[330577]: 2025-11-23 10:04:10.079919189 +0000 UTC m=+0.042761649 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:04:10 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:10.173 2 INFO neutron.agent.securitygroups_rpc [None req-aba9c038-400a-4d01-8bf0-588461edf0a1 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b7be32356762f3029209aca713f7a677b1c24e82a292b776689aed1ed5de000e-merged.mount: Deactivated successfully.
Nov 23 10:04:10 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:10 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dfbb2f473\x2d9d45\x2d472b\x2dacf8\x2d1ed5f2c6e75a.mount: Deactivated successfully.
Nov 23 10:04:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:10.237 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:10 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:10Z|00483|binding|INFO|Releasing lport 9d23bf51-4878-4815-a311-5305afd6c960 from this chassis (sb_readonly=0)
Nov 23 10:04:10 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:10Z|00484|binding|INFO|Setting lport 9d23bf51-4878-4815-a311-5305afd6c960 down in Southbound
Nov 23 10:04:10 np0005532585.localdomain kernel: device tap9d23bf51-48 left promiscuous mode
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.248 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e25b49f-2b5c-47dc-9e51-af7d0f597458, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=9d23bf51-4878-4815-a311-5305afd6c960) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.250 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9d23bf51-4878-4815-a311-5305afd6c960 in datapath fb46dcb8-1e03-4e90-b074-b34f166ad626 unbound from our chassis
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.251 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fb46dcb8-1e03-4e90-b074-b34f166ad626 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:10 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:10.252 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c273d104-cf0b-4b9c-a5c5-49d685cd7008]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:10.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:10.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:10.978 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:11 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:11Z|00485|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:04:11 np0005532585.localdomain ceph-mon[300199]: pgmap v343: 177 pgs: 177 active+clean; 213 MiB data, 973 MiB used, 41 GiB / 42 GiB avail; 692 KiB/s rd, 2.4 MiB/s wr, 50 op/s
Nov 23 10:04:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:11.562 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:04:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:04:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:04:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155678 "" "Go-http-client/1.1"
Nov 23 10:04:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:04:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1"
Nov 23 10:04:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:12.007 2 INFO neutron.agent.securitygroups_rpc [None req-b1d88626-831d-4bca-895f-9342c26bbcc2 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:12 np0005532585.localdomain dnsmasq[330349]: exiting on receipt of SIGTERM
Nov 23 10:04:12 np0005532585.localdomain podman[330615]: 2025-11-23 10:04:12.181984 +0000 UTC m=+0.063062491 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:12 np0005532585.localdomain systemd[1]: libpod-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a.scope: Deactivated successfully.
Nov 23 10:04:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:12.217 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:12 np0005532585.localdomain podman[330630]: 2025-11-23 10:04:12.228984965 +0000 UTC m=+0.032484029 container died 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:04:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:12 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9ee271f1a827858027c6783c7ec8838583352e6408914449aa6a883f1c3a25e4-merged.mount: Deactivated successfully.
Nov 23 10:04:12 np0005532585.localdomain podman[330630]: 2025-11-23 10:04:12.266215657 +0000 UTC m=+0.069714711 container remove 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:12 np0005532585.localdomain systemd[1]: libpod-conmon-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a.scope: Deactivated successfully.
Nov 23 10:04:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:12.462 263258 INFO neutron.agent.dhcp.agent [None req-d61d22d5-ba6f-424d-9828-c48bc8239139 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:12.463 263258 INFO neutron.agent.dhcp.agent [None req-d61d22d5-ba6f-424d-9828-c48bc8239139 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:13 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dfb46dcb8\x2d1e03\x2d4e90\x2db074\x2db34f166ad626.mount: Deactivated successfully.
Nov 23 10:04:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:13.375 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:04:14 np0005532585.localdomain ceph-mon[300199]: pgmap v344: 177 pgs: 177 active+clean; 213 MiB data, 973 MiB used, 41 GiB / 42 GiB avail; 663 KiB/s rd, 2.3 MiB/s wr, 48 op/s
Nov 23 10:04:15 np0005532585.localdomain ceph-mon[300199]: pgmap v345: 177 pgs: 177 active+clean; 225 MiB data, 1017 MiB used, 41 GiB / 42 GiB avail; 777 KiB/s rd, 2.1 MiB/s wr, 77 op/s
Nov 23 10:04:15 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:15.641 263258 INFO neutron.agent.linux.ip_lib [None req-76c51997-687f-4567-b5c6-457cc8e31ad8 - - - - - -] Device tapa41f1fd9-25 cannot be used as it has no MAC address
Nov 23 10:04:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:15.667 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:15 np0005532585.localdomain kernel: device tapa41f1fd9-25 entered promiscuous mode
Nov 23 10:04:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:15.675 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:15 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892255.6757] manager: (tapa41f1fd9-25): new Generic device (/org/freedesktop/NetworkManager/Devices/78)
Nov 23 10:04:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:15Z|00486|binding|INFO|Claiming lport a41f1fd9-25d7-4c85-96e2-18d396386762 for this chassis.
Nov 23 10:04:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:15Z|00487|binding|INFO|a41f1fd9-25d7-4c85-96e2-18d396386762: Claiming unknown
Nov 23 10:04:15 np0005532585.localdomain systemd-udevd[330663]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:15.689 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a81c63d1-c197-41eb-93f7-be983c9ed80d, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a41f1fd9-25d7-4c85-96e2-18d396386762) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:15.693 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a41f1fd9-25d7-4c85-96e2-18d396386762 in datapath accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e bound to our chassis
Nov 23 10:04:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:15.694 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:15 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:15.696 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3baab3fe-cacc-4604-a218-719161fccc6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:15.705 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:15Z|00488|binding|INFO|Setting lport a41f1fd9-25d7-4c85-96e2-18d396386762 ovn-installed in OVS
Nov 23 10:04:15 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:15Z|00489|binding|INFO|Setting lport a41f1fd9-25d7-4c85-96e2-18d396386762 up in Southbound
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:15.710 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device
Nov 23 10:04:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:15.753 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:15.784 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:16 np0005532585.localdomain podman[330734]: 
Nov 23 10:04:16 np0005532585.localdomain podman[330734]: 2025-11-23 10:04:16.8030727 +0000 UTC m=+0.088289030 container create d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:04:16 np0005532585.localdomain systemd[1]: Started libpod-conmon-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e.scope.
Nov 23 10:04:16 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:16 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69ba135d8b099b630e853bde2c2d90fa92899019265ae0b9907bbcf5750d8edb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:16 np0005532585.localdomain podman[330734]: 2025-11-23 10:04:16.762526709 +0000 UTC m=+0.047743089 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:16 np0005532585.localdomain podman[330734]: 2025-11-23 10:04:16.869333376 +0000 UTC m=+0.154549736 container init d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:16 np0005532585.localdomain podman[330734]: 2025-11-23 10:04:16.880434441 +0000 UTC m=+0.165650791 container start d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:04:16 np0005532585.localdomain dnsmasq[330752]: started, version 2.85 cachesize 150
Nov 23 10:04:16 np0005532585.localdomain dnsmasq[330752]: DNS service limited to local subnets
Nov 23 10:04:16 np0005532585.localdomain dnsmasq[330752]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:16 np0005532585.localdomain dnsmasq[330752]: warning: no upstream servers configured
Nov 23 10:04:16 np0005532585.localdomain dnsmasq-dhcp[330752]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:04:16 np0005532585.localdomain dnsmasq[330752]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses
Nov 23 10:04:16 np0005532585.localdomain dnsmasq-dhcp[330752]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:16 np0005532585.localdomain dnsmasq-dhcp[330752]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:17 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:17.015 263258 INFO neutron.agent.dhcp.agent [None req-427d2825-c3cf-4df7-9137-6592cbc40c01 - - - - - -] DHCP configuration for ports {'b87e3e64-b6cc-4f08-95a6-de593e031494'} is completed
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.220 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.232 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.232 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.233 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.233 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:04:17 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e164 e164: 6 total, 6 up, 6 in
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.605 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.606 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.606 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:04:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:17.607 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:04:17 np0005532585.localdomain systemd[1]: tmp-crun.Z89cRe.mount: Deactivated successfully.
Nov 23 10:04:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:04:18 np0005532585.localdomain ceph-mon[300199]: osdmap e164: 6 total, 6 up, 6 in
Nov 23 10:04:18 np0005532585.localdomain ceph-mon[300199]: pgmap v347: 177 pgs: 177 active+clean; 225 MiB data, 1017 MiB used, 41 GiB / 42 GiB avail; 429 KiB/s rd, 2.6 MiB/s wr, 77 op/s
Nov 23 10:04:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 10:04:18 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4044330584' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:18 np0005532585.localdomain systemd[1]: tmp-crun.xzNgxZ.mount: Deactivated successfully.
Nov 23 10:04:18 np0005532585.localdomain dnsmasq[330752]: exiting on receipt of SIGTERM
Nov 23 10:04:18 np0005532585.localdomain podman[330769]: 2025-11-23 10:04:18.693616178 +0000 UTC m=+0.060311777 container kill d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:04:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:04:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:04:18 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:04:18 np0005532585.localdomain systemd[1]: libpod-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e.scope: Deactivated successfully.
Nov 23 10:04:18 np0005532585.localdomain podman[330784]: 2025-11-23 10:04:18.771257697 +0000 UTC m=+0.053813062 container died d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:04:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:18.788 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:04:18 np0005532585.localdomain podman[330792]: 2025-11-23 10:04:18.810037555 +0000 UTC m=+0.083648331 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Nov 23 10:04:18 np0005532585.localdomain podman[330792]: 2025-11-23 10:04:18.85933572 +0000 UTC m=+0.132946546 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 10:04:18 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:04:18 np0005532585.localdomain podman[330791]: 2025-11-23 10:04:18.919043479 +0000 UTC m=+0.196655866 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 23 10:04:18 np0005532585.localdomain podman[330786]: 2025-11-23 10:04:18.959615931 +0000 UTC m=+0.239622830 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:04:18 np0005532585.localdomain podman[330791]: 2025-11-23 10:04:18.9788579 +0000 UTC m=+0.256470297 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:19 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:04:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:19.021 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:04:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:19.022 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:04:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:19.022 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:19.023 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:19.023 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:04:19 np0005532585.localdomain podman[330786]: 2025-11-23 10:04:19.035300341 +0000 UTC m=+0.315307250 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:04:19 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:04:19 np0005532585.localdomain podman[330784]: 2025-11-23 10:04:19.115156377 +0000 UTC m=+0.397711742 container remove d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 10:04:19 np0005532585.localdomain systemd[1]: libpod-conmon-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e.scope: Deactivated successfully.
Nov 23 10:04:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4044330584' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:19 np0005532585.localdomain ceph-mon[300199]: pgmap v348: 177 pgs: 177 active+clean; 225 MiB data, 1017 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 133 KiB/s wr, 79 op/s
Nov 23 10:04:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e165 e165: 6 total, 6 up, 6 in
Nov 23 10:04:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:19.680 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:9f:5c 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a81c63d1-c197-41eb-93f7-be983c9ed80d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b87e3e64-b6cc-4f08-95a6-de593e031494) old=Port_Binding(mac=['fa:16:3e:af:9f:5c 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:19.682 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b87e3e64-b6cc-4f08-95a6-de593e031494 in datapath accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e updated
Nov 23 10:04:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:19.685 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port d51081f3-52bd-41f7-b871-b291aa0ee588 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:04:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:19.685 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:04:19 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:19.686 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9075d485-ddfa-4a72-aa09-e7d97ab604e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-69ba135d8b099b630e853bde2c2d90fa92899019265ae0b9907bbcf5750d8edb-merged.mount: Deactivated successfully.
Nov 23 10:04:19 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:20.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:20.373 263258 INFO neutron.agent.linux.ip_lib [None req-9ddd3a60-f3f4-410e-a986-5a8b0041fb18 - - - - - -] Device tap39f9520b-e6 cannot be used as it has no MAC address
Nov 23 10:04:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:20.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:20 np0005532585.localdomain kernel: device tap39f9520b-e6 entered promiscuous mode
Nov 23 10:04:20 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892260.4133] manager: (tap39f9520b-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/79)
Nov 23 10:04:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:20.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:20 np0005532585.localdomain systemd-udevd[330903]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:20Z|00490|binding|INFO|Claiming lport 39f9520b-e67f-48de-9178-ad0c9c37f804 for this chassis.
Nov 23 10:04:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:20Z|00491|binding|INFO|39f9520b-e67f-48de-9178-ad0c9c37f804: Claiming unknown
Nov 23 10:04:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:20.430 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8633d61c76748a7a900f3c8cea84ef3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de783514-b079-4988-843f-abee02f82863, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=39f9520b-e67f-48de-9178-ad0c9c37f804) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:20.432 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 39f9520b-e67f-48de-9178-ad0c9c37f804 in datapath 556c5c2e-c414-4271-8e77-d61a599ccbad bound to our chassis
Nov 23 10:04:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:20.435 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 556c5c2e-c414-4271-8e77-d61a599ccbad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:20.436 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ff08a49a-33a9-4f97-bd6d-45896ba50239]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:20Z|00492|binding|INFO|Setting lport 39f9520b-e67f-48de-9178-ad0c9c37f804 ovn-installed in OVS
Nov 23 10:04:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:20Z|00493|binding|INFO|Setting lport 39f9520b-e67f-48de-9178-ad0c9c37f804 up in Southbound
Nov 23 10:04:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:20.453 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:20.512 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:20.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:20 np0005532585.localdomain ceph-mon[300199]: osdmap e165: 6 total, 6 up, 6 in
Nov 23 10:04:21 np0005532585.localdomain podman[330960]: 
Nov 23 10:04:21 np0005532585.localdomain podman[330960]: 2025-11-23 10:04:21.077544459 +0000 UTC m=+0.092109036 container create 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:04:21 np0005532585.localdomain systemd[1]: Started libpod-conmon-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498.scope.
Nov 23 10:04:21 np0005532585.localdomain podman[330960]: 2025-11-23 10:04:21.032651387 +0000 UTC m=+0.047215994 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:21 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:21.142 2 INFO neutron.agent.securitygroups_rpc [None req-b505d753-a321-4285-8b8a-57c320b6a991 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:21 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:21 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e452c8f92d1256ad9f801b589d5525f70ce43f85a0d2fd79bc7391c3f9d88b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:21 np0005532585.localdomain podman[330960]: 2025-11-23 10:04:21.162448726 +0000 UTC m=+0.177013313 container init 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:21 np0005532585.localdomain podman[330960]: 2025-11-23 10:04:21.171273423 +0000 UTC m=+0.185838010 container start 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[330981]: started, version 2.85 cachesize 150
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[330981]: DNS service limited to local subnets
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[330981]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[330981]: warning: no upstream servers configured
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[330981]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[330981]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:21.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:21.236 263258 INFO neutron.agent.dhcp.agent [None req-69a79c0f-ed94-4296-ad2f-407a6886f388 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:20Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7da00>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7d520>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a66730>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c9460b50>], id=27ad7b49-380b-4823-9acc-84d37fa1a881, ip_allocation=immediate, mac_address=fa:16:3e:3f:cd:97, name=tempest-PortsTestJSON-278306433, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:12Z, description=, dns_domain=, id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-571660484, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15506, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=2492, status=ACTIVE, subnets=['10c87fae-337c-47db-8d78-19673fc519f2', 'd18e15c0-46c0-4162-ac14-6406dd01bc14'], tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:18Z, vlan_transparent=None, network_id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cadd5356-9a8d-419a-ac04-589c2522a695'], standard_attr_id=2545, 
status=DOWN, tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:20Z on network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e
Nov 23 10:04:21 np0005532585.localdomain podman[331005]: 
Nov 23 10:04:21 np0005532585.localdomain podman[331005]: 2025-11-23 10:04:21.479206418 +0000 UTC m=+0.105075116 container create 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:21 np0005532585.localdomain systemd[1]: Started libpod-conmon-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72.scope.
Nov 23 10:04:21 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:21 np0005532585.localdomain podman[331005]: 2025-11-23 10:04:21.4314754 +0000 UTC m=+0.057344128 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:21 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ddc73133ee97a18b5d676ae71aebdcc91b4415d41e32db3c006d1af969d02b28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:21.540 263258 INFO neutron.agent.dhcp.agent [None req-536422c3-25b8-428a-aab5-e2ae20ae31e6 - - - - - -] DHCP configuration for ports {'a41f1fd9-25d7-4c85-96e2-18d396386762', 'b87e3e64-b6cc-4f08-95a6-de593e031494'} is completed
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 2 addresses
Nov 23 10:04:21 np0005532585.localdomain podman[331032]: 2025-11-23 10:04:21.551086904 +0000 UTC m=+0.063243996 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:21 np0005532585.localdomain podman[331005]: 2025-11-23 10:04:21.592861372 +0000 UTC m=+0.218730070 container init 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:04:21 np0005532585.localdomain podman[331005]: 2025-11-23 10:04:21.601366478 +0000 UTC m=+0.227235176 container start 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[331054]: started, version 2.85 cachesize 150
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[331054]: DNS service limited to local subnets
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[331054]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[331054]: warning: no upstream servers configured
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[331054]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:04:21 np0005532585.localdomain dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 0 addresses
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host
Nov 23 10:04:21 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts
Nov 23 10:04:21 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3761541653' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:21 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3553475909' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:21 np0005532585.localdomain ceph-mon[300199]: pgmap v350: 177 pgs: 177 active+clean; 225 MiB data, 1017 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 166 KiB/s wr, 99 op/s
Nov 23 10:04:21 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e166 e166: 6 total, 6 up, 6 in
Nov 23 10:04:21 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:21.912 263258 INFO neutron.agent.dhcp.agent [None req-a529ca7a-60ec-4769-8137-40df9acc23c6 - - - - - -] DHCP configuration for ports {'4e58acd3-0ca1-4d41-96d8-20dc84b68999', '27ad7b49-380b-4823-9acc-84d37fa1a881'} is completed
Nov 23 10:04:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:22.006 263258 INFO neutron.agent.linux.ip_lib [None req-7dd5ffbc-8c5b-42f2-812d-a91f1617a370 - - - - - -] Device tap1ae8d8fe-51 cannot be used as it has no MAC address
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.071 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:22 np0005532585.localdomain kernel: device tap1ae8d8fe-51 entered promiscuous mode
Nov 23 10:04:22 np0005532585.localdomain systemd-udevd[330905]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:22 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892262.0800] manager: (tap1ae8d8fe-51): new Generic device (/org/freedesktop/NetworkManager/Devices/80)
Nov 23 10:04:22 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:22Z|00494|binding|INFO|Claiming lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 for this chassis.
Nov 23 10:04:22 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:22Z|00495|binding|INFO|1ae8d8fe-5156-4993-b482-c4b01b921e85: Claiming unknown
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:22 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:22.094 2 INFO neutron.agent.securitygroups_rpc [None req-71e1d61a-9581-46c3-850d-9298f6399521 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:22 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:22.097 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14e80905-5b14-4ffd-9b3d-73c1c6bb812f, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1ae8d8fe-5156-4993-b482-c4b01b921e85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:22 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:22.099 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1ae8d8fe-5156-4993-b482-c4b01b921e85 in datapath 2fa353bd-02c5-4044-adba-b918030b206e bound to our chassis
Nov 23 10:04:22 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:22.100 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fa353bd-02c5-4044-adba-b918030b206e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:22 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:22.101 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[55c22863-6c4d-4f0a-a928-78339ce4acc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:22.110 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:20Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4dee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4d460>], id=27ad7b49-380b-4823-9acc-84d37fa1a881, ip_allocation=immediate, mac_address=fa:16:3e:3f:cd:97, name=tempest-PortsTestJSON-278306433, network_id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['cadd5356-9a8d-419a-ac04-589c2522a695'], standard_attr_id=2545, status=DOWN, tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:21Z on network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e
Nov 23 10:04:22 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:22Z|00496|binding|INFO|Setting lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 ovn-installed in OVS
Nov 23 10:04:22 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:22Z|00497|binding|INFO|Setting lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 up in Southbound
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.168 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.205 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.221 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.239 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.241 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.242 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:04:22 np0005532585.localdomain dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 1 addresses
Nov 23 10:04:22 np0005532585.localdomain podman[331097]: 2025-11-23 10:04:22.347000519 +0000 UTC m=+0.051671017 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:04:22 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:22 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1006420118' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: osdmap e166: 6 total, 6 up, 6 in
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3320085940' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "format": "json"}]: dispatch
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/299102806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.691 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:04:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e167 e167: 6 total, 6 up, 6 in
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.757 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.757 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.992 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.994 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11083MB free_disk=41.70033645629883GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.995 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:04:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:22.995 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.053 263258 INFO neutron.agent.dhcp.agent [None req-a93a8cc3-d826-4dd1-86d8-7c1d7a7ec395 - - - - - -] DHCP configuration for ports {'27ad7b49-380b-4823-9acc-84d37fa1a881'} is completed
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.090 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.091 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.091 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.117 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.144 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.144 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 10:04:23 np0005532585.localdomain podman[331180]: 
Nov 23 10:04:23 np0005532585.localdomain podman[331180]: 2025-11-23 10:04:23.166687951 +0000 UTC m=+0.071549237 container create 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.178 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 10:04:23 np0005532585.localdomain systemd[1]: Started libpod-conmon-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614.scope.
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.205 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 10:04:23 np0005532585.localdomain systemd[1]: tmp-crun.eL97H9.mount: Deactivated successfully.
Nov 23 10:04:23 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:23 np0005532585.localdomain podman[331180]: 2025-11-23 10:04:23.124955324 +0000 UTC m=+0.029816640 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:23 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf0b02e359ac1721f19ba2c6fc86978494592aa5ddd27403653b646396ae9cc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:23 np0005532585.localdomain podman[331180]: 2025-11-23 10:04:23.235779502 +0000 UTC m=+0.140640788 container init 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 10:04:23 np0005532585.localdomain podman[331180]: 2025-11-23 10:04:23.245043961 +0000 UTC m=+0.149905257 container start 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: started, version 2.85 cachesize 150
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: DNS service limited to local subnets
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: warning: no upstream servers configured
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 0 addresses
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.281 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.287 263258 INFO neutron.agent.dhcp.agent [None req-7dd5ffbc-8c5b-42f2-812d-a91f1617a370 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:21Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3c910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3c0a0>], id=0fe884b5-722e-4d25-b56d-fbb9db7a0a8a, ip_allocation=immediate, mac_address=fa:16:3e:10:68:e0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:18Z, description=, dns_domain=, id=2fa353bd-02c5-4044-adba-b918030b206e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-781167405, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46422, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2539, status=ACTIVE, subnets=['4b70b84e-2158-4880-972f-dc112d8544d2'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:20Z, vlan_transparent=None, network_id=2fa353bd-02c5-4044-adba-b918030b206e, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2548, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:21Z on network 2fa353bd-02c5-4044-adba-b918030b206e
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.414 263258 INFO neutron.agent.dhcp.agent [None req-58d8e269-1697-4c21-82a2-1372a0442ded - - - - - -] DHCP configuration for ports {'2145a468-64d1-418f-ba24-1e61ff91c9e8'} is completed
Nov 23 10:04:23 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:23.449 2 INFO neutron.agent.securitygroups_rpc [None req-360628bd-6ea7-46e4-a35e-1747acc2d18c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.467 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:20Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c89aa9a0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c89aabb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6850>, <neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6280>], id=27ad7b49-380b-4823-9acc-84d37fa1a881, ip_allocation=immediate, mac_address=fa:16:3e:3f:cd:97, name=tempest-PortsTestJSON-278306433, network_id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['cadd5356-9a8d-419a-ac04-589c2522a695'], standard_attr_id=2545, status=DOWN, tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:23Z on network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.513 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 1 addresses
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host
Nov 23 10:04:23 np0005532585.localdomain podman[331233]: 2025-11-23 10:04:23.547040549 +0000 UTC m=+0.112213352 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/299102806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: osdmap e167: 6 total, 6 up, 6 in
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2622383035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: pgmap v353: 177 pgs: 177 active+clean; 225 MiB data, 1017 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 30 KiB/s wr, 60 op/s
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.723 263258 INFO neutron.agent.dhcp.agent [None req-7dd5ffbc-8c5b-42f2-812d-a91f1617a370 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:21Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8be6a60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8be6d60>], id=0fe884b5-722e-4d25-b56d-fbb9db7a0a8a, ip_allocation=immediate, mac_address=fa:16:3e:10:68:e0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:18Z, description=, dns_domain=, id=2fa353bd-02c5-4044-adba-b918030b206e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-781167405, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46422, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2539, status=ACTIVE, subnets=['4b70b84e-2158-4880-972f-dc112d8544d2'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:20Z, vlan_transparent=None, network_id=2fa353bd-02c5-4044-adba-b918030b206e, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2548, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:21Z on network 2fa353bd-02c5-4044-adba-b918030b206e
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 2 addresses
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e168 e168: 6 total, 6 up, 6 in
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:23 np0005532585.localdomain podman[331269]: 2025-11-23 10:04:23.730382421 +0000 UTC m=+0.060427722 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:04:23 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/313014762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.776 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.785 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.817 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.821 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:04:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:23.822 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.873 263258 INFO neutron.agent.dhcp.agent [None req-66d0d4ff-854f-48ce-9bee-0c9d10a9a941 - - - - - -] DHCP configuration for ports {'0fe884b5-722e-4d25-b56d-fbb9db7a0a8a'} is completed
Nov 23 10:04:23 np0005532585.localdomain dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 1 addresses
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host
Nov 23 10:04:23 np0005532585.localdomain podman[331307]: 2025-11-23 10:04:23.911312062 +0000 UTC m=+0.062976219 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:04:23 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts
Nov 23 10:04:23 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:23.984 2 INFO neutron.agent.securitygroups_rpc [None req-53980d87-428d-4527-8575-9963b178026f 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:23 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.988 263258 INFO neutron.agent.dhcp.agent [None req-be57a20f-d712-45c5-97f5-2a326ff96e22 - - - - - -] DHCP configuration for ports {'27ad7b49-380b-4823-9acc-84d37fa1a881'} is completed
Nov 23 10:04:24 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:24.161 263258 INFO neutron.agent.dhcp.agent [None req-0e7a1307-7a69-45b0-a8d7-fd2a09ad26d5 - - - - - -] DHCP configuration for ports {'0fe884b5-722e-4d25-b56d-fbb9db7a0a8a'} is completed
Nov 23 10:04:24 np0005532585.localdomain dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses
Nov 23 10:04:24 np0005532585.localdomain podman[331348]: 2025-11-23 10:04:24.259537081 +0000 UTC m=+0.056036589 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:24 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:24 np0005532585.localdomain dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:24 np0005532585.localdomain ceph-mon[300199]: osdmap e168: 6 total, 6 up, 6 in
Nov 23 10:04:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/313014762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:25 np0005532585.localdomain dnsmasq[330981]: exiting on receipt of SIGTERM
Nov 23 10:04:25 np0005532585.localdomain podman[331386]: 2025-11-23 10:04:25.129042833 +0000 UTC m=+0.076772123 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:04:25 np0005532585.localdomain systemd[1]: libpod-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498.scope: Deactivated successfully.
Nov 23 10:04:25 np0005532585.localdomain systemd[1]: tmp-crun.Rj1YSU.mount: Deactivated successfully.
Nov 23 10:04:25 np0005532585.localdomain podman[331399]: 2025-11-23 10:04:25.203844517 +0000 UTC m=+0.057723081 container died 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:04:25 np0005532585.localdomain podman[331399]: 2025-11-23 10:04:25.24314479 +0000 UTC m=+0.097023304 container cleanup 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:04:25 np0005532585.localdomain systemd[1]: libpod-conmon-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498.scope: Deactivated successfully.
Nov 23 10:04:25 np0005532585.localdomain podman[331400]: 2025-11-23 10:04:25.281556458 +0000 UTC m=+0.130829062 container remove 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e169 e169: 6 total, 6 up, 6 in
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "snap_name": "5aacd1b6-f8f5-4003-8630-0121025e58d0", "format": "json"}]: dispatch
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2125767633' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: pgmap v355: 177 pgs: 177 active+clean; 271 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 3.6 MiB/s wr, 208 op/s
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3854060828' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:04:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3854060828' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:26 np0005532585.localdomain podman[331475]: 
Nov 23 10:04:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-9e452c8f92d1256ad9f801b589d5525f70ce43f85a0d2fd79bc7391c3f9d88b0-merged.mount: Deactivated successfully.
Nov 23 10:04:26 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:26 np0005532585.localdomain podman[331475]: 2025-11-23 10:04:26.172922628 +0000 UTC m=+0.075158465 container create 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:04:26 np0005532585.localdomain systemd[1]: Started libpod-conmon-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91.scope.
Nov 23 10:04:26 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:26 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b39ba390b209ce21a98a569539df45ea77948795ea2108691055c1abc7e9596/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:26 np0005532585.localdomain podman[331475]: 2025-11-23 10:04:26.233980148 +0000 UTC m=+0.136215985 container init 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:26 np0005532585.localdomain podman[331475]: 2025-11-23 10:04:26.141130671 +0000 UTC m=+0.043366568 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:26 np0005532585.localdomain podman[331475]: 2025-11-23 10:04:26.241117332 +0000 UTC m=+0.143353169 container start 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331493]: started, version 2.85 cachesize 150
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331493]: DNS service limited to local subnets
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331493]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331493]: warning: no upstream servers configured
Nov 23 10:04:26 np0005532585.localdomain dnsmasq-dhcp[331493]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331493]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses
Nov 23 10:04:26 np0005532585.localdomain dnsmasq-dhcp[331493]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host
Nov 23 10:04:26 np0005532585.localdomain dnsmasq-dhcp[331493]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.431 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d51081f3-52bd-41f7-b871-b291aa0ee588 with type ""
Nov 23 10:04:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:26Z|00498|binding|INFO|Removing iface tapa41f1fd9-25 ovn-installed in OVS
Nov 23 10:04:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:26Z|00499|binding|INFO|Removing lport a41f1fd9-25d7-4c85-96e2-18d396386762 ovn-installed in OVS
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.433 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a81c63d1-c197-41eb-93f7-be983c9ed80d, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=a41f1fd9-25d7-4c85-96e2-18d396386762) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.434 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.436 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a41f1fd9-25d7-4c85-96e2-18d396386762 in datapath accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e unbound from our chassis
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.439 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.440 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[24903e6b-9ff5-4992-8265-fa3e6164b2b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.442 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:26.514 263258 INFO neutron.agent.dhcp.agent [None req-9bb08951-2c68-4617-9549-c8440aab8b5e - - - - - -] DHCP configuration for ports {'a41f1fd9-25d7-4c85-96e2-18d396386762', 'b87e3e64-b6cc-4f08-95a6-de593e031494'} is completed
Nov 23 10:04:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:26.613 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=bace1285-0dd7-4599-87e6-ae783a5add31, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aeb6d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aeb430>], id=1f8399a1-cba9-45be-90c1-08ef6a05cb9b, ip_allocation=immediate, mac_address=fa:16:3e:56:06:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:14Z, description=, dns_domain=, id=556c5c2e-c414-4271-8e77-d61a599ccbad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1079361768, port_security_enabled=True, project_id=d8633d61c76748a7a900f3c8cea84ef3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5645, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2519, status=ACTIVE, subnets=['4feaee24-e671-42a0-b0a6-e401ae11bc88'], tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:18Z, vlan_transparent=None, network_id=556c5c2e-c414-4271-8e77-d61a599ccbad, port_security_enabled=False, project_id=d8633d61c76748a7a900f3c8cea84ef3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2556, status=DOWN, tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:26Z on network 556c5c2e-c414-4271-8e77-d61a599ccbad
Nov 23 10:04:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:26.658 263258 INFO neutron.agent.linux.ip_lib [None req-807830b7-35e3-4070-8731-d3370c53b145 - - - - - -] Device tap8437b436-67 cannot be used as it has no MAC address
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain kernel: device tap8437b436-67 entered promiscuous mode
Nov 23 10:04:26 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892266.6931] manager: (tap8437b436-67): new Generic device (/org/freedesktop/NetworkManager/Devices/81)
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:26Z|00500|binding|INFO|Claiming lport 8437b436-673f-4dcd-8ddd-08df6599f4ab for this chassis.
Nov 23 10:04:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:26Z|00501|binding|INFO|8437b436-673f-4dcd-8ddd-08df6599f4ab: Claiming unknown
Nov 23 10:04:26 np0005532585.localdomain systemd-udevd[331541]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.705 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e6f3ebd-dc7a-479e-ba09-dcee6cc4d506, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8437b436-673f-4dcd-8ddd-08df6599f4ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.707 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8437b436-673f-4dcd-8ddd-08df6599f4ab in datapath 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 bound to our chassis
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.708 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:26.708 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c53f9f25-b45a-4384-86e0-2a283e88071e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:26Z|00502|binding|INFO|Setting lport 8437b436-673f-4dcd-8ddd-08df6599f4ab ovn-installed in OVS
Nov 23 10:04:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:26Z|00503|binding|INFO|Setting lport 8437b436-673f-4dcd-8ddd-08df6599f4ab up in Southbound
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.744 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331493]: exiting on receipt of SIGTERM
Nov 23 10:04:26 np0005532585.localdomain podman[331517]: 2025-11-23 10:04:26.753038763 +0000 UTC m=+0.106032425 container kill 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:04:26 np0005532585.localdomain systemd[1]: libpod-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91.scope: Deactivated successfully.
Nov 23 10:04:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e170 e170: 6 total, 6 up, 6 in
Nov 23 10:04:26 np0005532585.localdomain ceph-mon[300199]: osdmap e169: 6 total, 6 up, 6 in
Nov 23 10:04:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3854060828' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3854060828' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3931301299' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3931301299' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.799 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.822 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain podman[331555]: 2025-11-23 10:04:26.829030472 +0000 UTC m=+0.065570206 container died 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:04:26 np0005532585.localdomain podman[331555]: 2025-11-23 10:04:26.855380016 +0000 UTC m=+0.091919760 container cleanup 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:26 np0005532585.localdomain systemd[1]: libpod-conmon-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91.scope: Deactivated successfully.
Nov 23 10:04:26 np0005532585.localdomain podman[331557]: 2025-11-23 10:04:26.917530438 +0000 UTC m=+0.144204645 container remove 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:04:26 np0005532585.localdomain dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 1 addresses
Nov 23 10:04:26 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host
Nov 23 10:04:26 np0005532585.localdomain podman[331574]: 2025-11-23 10:04:26.971860375 +0000 UTC m=+0.168859688 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:04:26 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts
Nov 23 10:04:26 np0005532585.localdomain kernel: device tapa41f1fd9-25 left promiscuous mode
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:26.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.004 263258 INFO neutron.agent.dhcp.agent [None req-7a00c611-80b1-46ca-938b-33e21d680f4d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.005 263258 INFO neutron.agent.dhcp.agent [None req-7a00c611-80b1-46ca-938b-33e21d680f4d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0b39ba390b209ce21a98a569539df45ea77948795ea2108691055c1abc7e9596-merged.mount: Deactivated successfully.
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2daccb7a47\x2db6e0\x2d4b0c\x2d81bc\x2d2cbf0d4d0f4e.mount: Deactivated successfully.
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:04:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:27.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.271 263258 INFO neutron.agent.dhcp.agent [None req-7157e83e-185d-404d-8c1e-3c98a58b564b - - - - - -] DHCP configuration for ports {'1f8399a1-cba9-45be-90c1-08ef6a05cb9b'} is completed
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: tmp-crun.r3wCs0.mount: Deactivated successfully.
Nov 23 10:04:27 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:27Z|00504|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:04:27 np0005532585.localdomain podman[331623]: 2025-11-23 10:04:27.312187057 +0000 UTC m=+0.107555872 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:27 np0005532585.localdomain podman[331623]: 2025-11-23 10:04:27.328279681 +0000 UTC m=+0.123648486 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:27.333 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:04:27 np0005532585.localdomain podman[331625]: 2025-11-23 10:04:27.415786328 +0000 UTC m=+0.208277656 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:04:27 np0005532585.localdomain podman[331625]: 2025-11-23 10:04:27.423429367 +0000 UTC m=+0.215920665 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:04:27 np0005532585.localdomain podman[331696]: 2025-11-23 10:04:27.743294222 +0000 UTC m=+0.079949578 container create b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: Started libpod-conmon-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457.scope.
Nov 23 10:04:27 np0005532585.localdomain ceph-mon[300199]: osdmap e170: 6 total, 6 up, 6 in
Nov 23 10:04:27 np0005532585.localdomain ceph-mon[300199]: pgmap v358: 177 pgs: 177 active+clean; 271 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 174 KiB/s rd, 4.5 MiB/s wr, 256 op/s
Nov 23 10:04:27 np0005532585.localdomain podman[331696]: 2025-11-23 10:04:27.701017759 +0000 UTC m=+0.037673095 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:27 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:27 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b86b4f9fba21239c4ef1c266e92b520c83208906cd248f3a235bb0ae25f61f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e171 e171: 6 total, 6 up, 6 in
Nov 23 10:04:27 np0005532585.localdomain podman[331696]: 2025-11-23 10:04:27.81989319 +0000 UTC m=+0.156548506 container init b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:04:27 np0005532585.localdomain podman[331696]: 2025-11-23 10:04:27.829119348 +0000 UTC m=+0.165774674 container start b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:04:27 np0005532585.localdomain dnsmasq[331714]: started, version 2.85 cachesize 150
Nov 23 10:04:27 np0005532585.localdomain dnsmasq[331714]: DNS service limited to local subnets
Nov 23 10:04:27 np0005532585.localdomain dnsmasq[331714]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:27 np0005532585.localdomain dnsmasq[331714]: warning: no upstream servers configured
Nov 23 10:04:27 np0005532585.localdomain dnsmasq-dhcp[331714]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Nov 23 10:04:27 np0005532585.localdomain dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 0 addresses
Nov 23 10:04:27 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host
Nov 23 10:04:27 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts
Nov 23 10:04:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.904 263258 INFO neutron.agent.dhcp.agent [None req-807830b7-35e3-4070-8731-d3370c53b145 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3ce80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3c610>], id=4b22a564-08c0-4921-9bac-e5cb75632958, ip_allocation=immediate, mac_address=fa:16:3e:27:c4:53, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:23Z, description=, dns_domain=, id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-543864074, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10320, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2551, status=ACTIVE, subnets=['a5d8fd54-c4bc-4361-bb44-fb672867e441'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:24Z, vlan_transparent=None, network_id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2557, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:26Z on network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6
Nov 23 10:04:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:28.034 263258 INFO neutron.agent.dhcp.agent [None req-ff340251-4043-4f77-b2d9-5fef3b0067bb - - - - - -] DHCP configuration for ports {'bc8ae273-d15b-44d3-8796-f23802f38110'} is completed
Nov 23 10:04:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e172 e172: 6 total, 6 up, 6 in
Nov 23 10:04:28 np0005532585.localdomain dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 1 addresses
Nov 23 10:04:28 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host
Nov 23 10:04:28 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts
Nov 23 10:04:28 np0005532585.localdomain podman[331733]: 2025-11-23 10:04:28.082573243 +0000 UTC m=+0.054654198 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:28 np0005532585.localdomain systemd[1]: tmp-crun.T6F1Lh.mount: Deactivated successfully.
Nov 23 10:04:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:28.284 263258 INFO neutron.agent.dhcp.agent [None req-36073ca9-2b12-435f-810c-3fca4b7b4fe8 - - - - - -] DHCP configuration for ports {'4b22a564-08c0-4921-9bac-e5cb75632958'} is completed
Nov 23 10:04:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 10:04:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:28.722 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=bace1285-0dd7-4599-87e6-ae783a5add31, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd220>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd5b0>], id=1f8399a1-cba9-45be-90c1-08ef6a05cb9b, ip_allocation=immediate, mac_address=fa:16:3e:56:06:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:14Z, description=, dns_domain=, id=556c5c2e-c414-4271-8e77-d61a599ccbad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1079361768, port_security_enabled=True, project_id=d8633d61c76748a7a900f3c8cea84ef3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5645, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2519, status=ACTIVE, subnets=['4feaee24-e671-42a0-b0a6-e401ae11bc88'], tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:18Z, vlan_transparent=None, network_id=556c5c2e-c414-4271-8e77-d61a599ccbad, port_security_enabled=False, project_id=d8633d61c76748a7a900f3c8cea84ef3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2556, status=DOWN, tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:26Z on network 556c5c2e-c414-4271-8e77-d61a599ccbad
Nov 23 10:04:28 np0005532585.localdomain ceph-mon[300199]: osdmap e171: 6 total, 6 up, 6 in
Nov 23 10:04:28 np0005532585.localdomain ceph-mon[300199]: osdmap e172: 6 total, 6 up, 6 in
Nov 23 10:04:28 np0005532585.localdomain systemd[1]: tmp-crun.Q7r31W.mount: Deactivated successfully.
Nov 23 10:04:28 np0005532585.localdomain podman[331772]: 2025-11-23 10:04:28.96018903 +0000 UTC m=+0.078354742 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:28 np0005532585.localdomain dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 1 addresses
Nov 23 10:04:28 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host
Nov 23 10:04:28 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts
Nov 23 10:04:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:29.000 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa7a90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a0a820>], id=4b22a564-08c0-4921-9bac-e5cb75632958, ip_allocation=immediate, mac_address=fa:16:3e:27:c4:53, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:23Z, description=, dns_domain=, id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-543864074, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10320, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2551, status=ACTIVE, subnets=['a5d8fd54-c4bc-4361-bb44-fb672867e441'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:24Z, vlan_transparent=None, network_id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2557, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:26Z on network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6
Nov 23 10:04:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e173 e173: 6 total, 6 up, 6 in
Nov 23 10:04:29 np0005532585.localdomain dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 1 addresses
Nov 23 10:04:29 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host
Nov 23 10:04:29 np0005532585.localdomain systemd[1]: tmp-crun.NG8Xfl.mount: Deactivated successfully.
Nov 23 10:04:29 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts
Nov 23 10:04:29 np0005532585.localdomain podman[331811]: 2025-11-23 10:04:29.208386586 +0000 UTC m=+0.055402660 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:29.237 263258 INFO neutron.agent.dhcp.agent [None req-4d2e13da-829e-4417-998e-40b0ddf3acf8 - - - - - -] DHCP configuration for ports {'1f8399a1-cba9-45be-90c1-08ef6a05cb9b'} is completed
Nov 23 10:04:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:04:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1866303281' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:04:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1866303281' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:29.487 263258 INFO neutron.agent.dhcp.agent [None req-5f7c91b5-6561-4ae1-966b-5fd577b64a7a - - - - - -] DHCP configuration for ports {'4b22a564-08c0-4921-9bac-e5cb75632958'} is completed
Nov 23 10:04:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:04:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:04:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:04:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:04:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:04:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "snap_name": "5aacd1b6-f8f5-4003-8630-0121025e58d0", "target_sub_name": "b0d4ea08-4592-4af4-b78a-914919545708", "format": "json"}]: dispatch
Nov 23 10:04:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b0d4ea08-4592-4af4-b78a-914919545708", "format": "json"}]: dispatch
Nov 23 10:04:30 np0005532585.localdomain ceph-mon[300199]: osdmap e173: 6 total, 6 up, 6 in
Nov 23 10:04:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1866303281' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1866303281' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:30 np0005532585.localdomain ceph-mon[300199]: pgmap v362: 177 pgs: 177 active+clean; 225 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 234 KiB/s rd, 42 KiB/s wr, 324 op/s
Nov 23 10:04:31 np0005532585.localdomain ceph-mon[300199]: mgrmap e47: np0005532584.naxwxy(active, since 9m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:04:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 10:04:31 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3039705748' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e174 e174: 6 total, 6 up, 6 in
Nov 23 10:04:31 np0005532585.localdomain podman[331849]: 2025-11-23 10:04:31.502546223 +0000 UTC m=+0.066008889 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:04:31 np0005532585.localdomain dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 0 addresses
Nov 23 10:04:31 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host
Nov 23 10:04:31 np0005532585.localdomain dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts
Nov 23 10:04:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:31Z|00505|binding|INFO|Releasing lport 39f9520b-e67f-48de-9178-ad0c9c37f804 from this chassis (sb_readonly=0)
Nov 23 10:04:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:31Z|00506|binding|INFO|Setting lport 39f9520b-e67f-48de-9178-ad0c9c37f804 down in Southbound
Nov 23 10:04:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:31.701 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:31 np0005532585.localdomain kernel: device tap39f9520b-e6 left promiscuous mode
Nov 23 10:04:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:31.709 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8633d61c76748a7a900f3c8cea84ef3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de783514-b079-4988-843f-abee02f82863, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=39f9520b-e67f-48de-9178-ad0c9c37f804) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:31.711 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 39f9520b-e67f-48de-9178-ad0c9c37f804 in datapath 556c5c2e-c414-4271-8e77-d61a599ccbad unbound from our chassis
Nov 23 10:04:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:31.714 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 556c5c2e-c414-4271-8e77-d61a599ccbad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:04:31 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:31.715 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4b06c757-f6e2-4585-a9e9-8dc38d9ae988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:31.725 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e175 e175: 6 total, 6 up, 6 in
Nov 23 10:04:32 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3039705748' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:32 np0005532585.localdomain ceph-mon[300199]: osdmap e174: 6 total, 6 up, 6 in
Nov 23 10:04:32 np0005532585.localdomain ceph-mon[300199]: pgmap v364: 177 pgs: 177 active+clean; 225 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 217 KiB/s rd, 39 KiB/s wr, 301 op/s
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:32.615 2 INFO neutron.agent.securitygroups_rpc [None req-fa166c8d-7d85-4b6f-949c-c3bef6490854 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:32.633 263258 INFO neutron.agent.linux.ip_lib [None req-c1584d16-a8a6-41a5-babf-2c48d2c69e5c - - - - - -] Device tap2aa3f527-d9 cannot be used as it has no MAC address
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.658 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain kernel: device tap2aa3f527-d9 entered promiscuous mode
Nov 23 10:04:32 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892272.6652] manager: (tap2aa3f527-d9): new Generic device (/org/freedesktop/NetworkManager/Devices/82)
Nov 23 10:04:32 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:32Z|00507|binding|INFO|Claiming lport 2aa3f527-d9c7-4028-9d39-988134131b8f for this chassis.
Nov 23 10:04:32 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:32Z|00508|binding|INFO|2aa3f527-d9c7-4028-9d39-988134131b8f: Claiming unknown
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.666 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain systemd-udevd[331882]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:32 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:32.688 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:3::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db97bbb5-462a-4c5e-a728-28756b766887, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=2aa3f527-d9c7-4028-9d39-988134131b8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:32 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:32Z|00509|binding|INFO|Setting lport 2aa3f527-d9c7-4028-9d39-988134131b8f ovn-installed in OVS
Nov 23 10:04:32 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:32Z|00510|binding|INFO|Setting lport 2aa3f527-d9c7-4028-9d39-988134131b8f up in Southbound
Nov 23 10:04:32 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:32.690 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa3f527-d9c7-4028-9d39-988134131b8f in datapath 43058c00-cde7-48e1-8e0e-eba5f793c5ac bound to our chassis
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:32.692 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43058c00-cde7-48e1-8e0e-eba5f793c5ac or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:32 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:32.693 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a0bc61d4-4a57-4f22-892d-c3c4e251cc47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.706 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.713 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.757 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:32.785 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e176 e176: 6 total, 6 up, 6 in
Nov 23 10:04:33 np0005532585.localdomain ceph-mon[300199]: osdmap e175: 6 total, 6 up, 6 in
Nov 23 10:04:33 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1119145413' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:33 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1119145413' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:33 np0005532585.localdomain ceph-mon[300199]: osdmap e176: 6 total, 6 up, 6 in
Nov 23 10:04:33 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:33.350 2 INFO neutron.agent.securitygroups_rpc [None req-4ffb6c00-2d92-41b8-843d-89b3bf39eddb 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:04:33 np0005532585.localdomain podman[331937]: 2025-11-23 10:04:33.628528974 +0000 UTC m=+0.093011802 container create 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 10:04:33 np0005532585.localdomain systemd[1]: Started libpod-conmon-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1.scope.
Nov 23 10:04:33 np0005532585.localdomain podman[331937]: 2025-11-23 10:04:33.581345423 +0000 UTC m=+0.045828291 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:33 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:33 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6027bb6fd654c4a054428fb77e5d5ffcb12dbfa91fa97836ded173ca06b20034/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:33 np0005532585.localdomain podman[331937]: 2025-11-23 10:04:33.708595586 +0000 UTC m=+0.173078414 container init 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:04:33 np0005532585.localdomain podman[331937]: 2025-11-23 10:04:33.720452153 +0000 UTC m=+0.184934981 container start 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:04:33 np0005532585.localdomain dnsmasq[331955]: started, version 2.85 cachesize 150
Nov 23 10:04:33 np0005532585.localdomain dnsmasq[331955]: DNS service limited to local subnets
Nov 23 10:04:33 np0005532585.localdomain dnsmasq[331955]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:33 np0005532585.localdomain dnsmasq[331955]: warning: no upstream servers configured
Nov 23 10:04:33 np0005532585.localdomain dnsmasq-dhcp[331955]: DHCPv6, static leases only on 2001:db8:3::, lease time 1d
Nov 23 10:04:33 np0005532585.localdomain dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 0 addresses
Nov 23 10:04:33 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 10:04:33 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 10:04:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:33.782 263258 INFO neutron.agent.dhcp.agent [None req-c1584d16-a8a6-41a5-babf-2c48d2c69e5c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:32Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c64520>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad9d30>], id=b15dac7f-40d8-4298-9412-2390f2fb7bcd, ip_allocation=immediate, mac_address=fa:16:3e:d4:02:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:30Z, description=, dns_domain=, id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-123647688, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46466, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2568, status=ACTIVE, subnets=['6842a301-3e2f-49bd-9e15-3d3ecfcc3da6'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:31Z, vlan_transparent=None, network_id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2580, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:32Z on network 43058c00-cde7-48e1-8e0e-eba5f793c5ac
Nov 23 10:04:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:33.922 263258 INFO neutron.agent.dhcp.agent [None req-18278e1a-c29a-4847-acbc-f48cf80aab68 - - - - - -] DHCP configuration for ports {'8ddabb54-5eeb-4483-ac10-999923ac4961'} is completed
Nov 23 10:04:33 np0005532585.localdomain dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 1 addresses
Nov 23 10:04:33 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 10:04:33 np0005532585.localdomain podman[331972]: 2025-11-23 10:04:33.993974613 +0000 UTC m=+0.063753642 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:33 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 10:04:34 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:34.148 263258 INFO neutron.agent.dhcp.agent [None req-c1584d16-a8a6-41a5-babf-2c48d2c69e5c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:32Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3cd30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b3c430>], id=b15dac7f-40d8-4298-9412-2390f2fb7bcd, ip_allocation=immediate, mac_address=fa:16:3e:d4:02:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:30Z, description=, dns_domain=, id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-123647688, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46466, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2568, status=ACTIVE, subnets=['6842a301-3e2f-49bd-9e15-3d3ecfcc3da6'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:31Z, vlan_transparent=None, network_id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2580, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:32Z on network 43058c00-cde7-48e1-8e0e-eba5f793c5ac
Nov 23 10:04:34 np0005532585.localdomain ceph-mon[300199]: pgmap v367: 177 pgs: 177 active+clean; 225 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 197 KiB/s rd, 36 KiB/s wr, 272 op/s
Nov 23 10:04:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e177 e177: 6 total, 6 up, 6 in
Nov 23 10:04:34 np0005532585.localdomain sshd[332016]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:04:34 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:34.275 263258 INFO neutron.agent.dhcp.agent [None req-fe76be24-48c5-4d74-addb-987c5c113fdf - - - - - -] DHCP configuration for ports {'b15dac7f-40d8-4298-9412-2390f2fb7bcd'} is completed
Nov 23 10:04:34 np0005532585.localdomain dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 1 addresses
Nov 23 10:04:34 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 10:04:34 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 10:04:34 np0005532585.localdomain podman[332010]: 2025-11-23 10:04:34.318499919 +0000 UTC m=+0.057839864 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:04:34 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:34.394 2 INFO neutron.agent.securitygroups_rpc [None req-e699b8a4-5f06-487a-a300-fd9ee1a788a2 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:34 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:34.533 263258 INFO neutron.agent.dhcp.agent [None req-44400d95-ade1-4ca4-ae1f-a5a5c0e2be84 - - - - - -] DHCP configuration for ports {'b15dac7f-40d8-4298-9412-2390f2fb7bcd'} is completed
Nov 23 10:04:34 np0005532585.localdomain systemd[1]: tmp-crun.a905NT.mount: Deactivated successfully.
Nov 23 10:04:35 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:35.170 2 INFO neutron.agent.securitygroups_rpc [None req-519a4a04-72f9-40b4-96af-e4987b6dbb80 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:35 np0005532585.localdomain ceph-mon[300199]: osdmap e177: 6 total, 6 up, 6 in
Nov 23 10:04:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e178 e178: 6 total, 6 up, 6 in
Nov 23 10:04:35 np0005532585.localdomain sshd[332016]: Invalid user ubuntu from 107.172.15.139 port 59846
Nov 23 10:04:35 np0005532585.localdomain sshd[332016]: Received disconnect from 107.172.15.139 port 59846:11: Bye Bye [preauth]
Nov 23 10:04:35 np0005532585.localdomain sshd[332016]: Disconnected from invalid user ubuntu 107.172.15.139 port 59846 [preauth]
Nov 23 10:04:36 np0005532585.localdomain ceph-mon[300199]: osdmap e178: 6 total, 6 up, 6 in
Nov 23 10:04:36 np0005532585.localdomain ceph-mon[300199]: pgmap v370: 177 pgs: 177 active+clean; 225 MiB data, 976 MiB used, 41 GiB / 42 GiB avail; 152 KiB/s rd, 27 KiB/s wr, 207 op/s
Nov 23 10:04:36 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e179 e179: 6 total, 6 up, 6 in
Nov 23 10:04:37 np0005532585.localdomain ceph-mon[300199]: osdmap e179: 6 total, 6 up, 6 in
Nov 23 10:04:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:37.282 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:04:37 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:04:38 np0005532585.localdomain podman[332034]: 2025-11-23 10:04:38.024072141 +0000 UTC m=+0.078010670 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:04:38 np0005532585.localdomain podman[332034]: 2025-11-23 10:04:38.037063003 +0000 UTC m=+0.091001512 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:04:38 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e180 e180: 6 total, 6 up, 6 in
Nov 23 10:04:38 np0005532585.localdomain podman[332035]: 2025-11-23 10:04:38.091967957 +0000 UTC m=+0.143194965 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:04:38 np0005532585.localdomain podman[332035]: 2025-11-23 10:04:38.105308359 +0000 UTC m=+0.156535407 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:04:38 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "format": "json"}]: dispatch
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: pgmap v372: 177 pgs: 177 active+clean; 225 MiB data, 976 MiB used, 41 GiB / 42 GiB avail; 137 KiB/s rd, 24 KiB/s wr, 186 op/s
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: osdmap e180: 6 total, 6 up, 6 in
Nov 23 10:04:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:04:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e181 e181: 6 total, 6 up, 6 in
Nov 23 10:04:39 np0005532585.localdomain dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 0 addresses
Nov 23 10:04:39 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 10:04:39 np0005532585.localdomain podman[332094]: 2025-11-23 10:04:39.618621824 +0000 UTC m=+0.059579296 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:39 np0005532585.localdomain dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 10:04:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:39.846 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:39 np0005532585.localdomain kernel: device tap2aa3f527-d9 left promiscuous mode
Nov 23 10:04:39 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:39Z|00511|binding|INFO|Releasing lport 2aa3f527-d9c7-4028-9d39-988134131b8f from this chassis (sb_readonly=0)
Nov 23 10:04:39 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:39Z|00512|binding|INFO|Setting lport 2aa3f527-d9c7-4028-9d39-988134131b8f down in Southbound
Nov 23 10:04:39 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:39.858 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:3::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db97bbb5-462a-4c5e-a728-28756b766887, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=2aa3f527-d9c7-4028-9d39-988134131b8f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:39 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:39.860 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa3f527-d9c7-4028-9d39-988134131b8f in datapath 43058c00-cde7-48e1-8e0e-eba5f793c5ac unbound from our chassis
Nov 23 10:04:39 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:39.862 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43058c00-cde7-48e1-8e0e-eba5f793c5ac or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:39 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:39.863 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f33b1b69-7141-4c3c-9697-22f55d1f1685]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:39.878 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:39.879 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:40 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:40.216 2 INFO neutron.agent.securitygroups_rpc [req-dd0696f0-72c9-4ee8-8c44-5e6e5f5d0a7a req-8f09e20d-9c7b-422f-bb6f-02a4d95509a5 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group member updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']
Nov 23 10:04:40 np0005532585.localdomain sshd[332117]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:04:40 np0005532585.localdomain ceph-mon[300199]: osdmap e181: 6 total, 6 up, 6 in
Nov 23 10:04:40 np0005532585.localdomain ceph-mon[300199]: pgmap v375: 177 pgs: 177 active+clean; 146 MiB data, 914 MiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 24 KiB/s wr, 190 op/s
Nov 23 10:04:40 np0005532585.localdomain dnsmasq[331955]: exiting on receipt of SIGTERM
Nov 23 10:04:40 np0005532585.localdomain podman[332135]: 2025-11-23 10:04:40.673915553 +0000 UTC m=+0.062188675 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:04:40 np0005532585.localdomain systemd[1]: libpod-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1.scope: Deactivated successfully.
Nov 23 10:04:40 np0005532585.localdomain podman[332147]: 2025-11-23 10:04:40.745839469 +0000 UTC m=+0.059011188 container died 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:04:40 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:40 np0005532585.localdomain podman[332147]: 2025-11-23 10:04:40.785773832 +0000 UTC m=+0.098945501 container cleanup 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:04:40 np0005532585.localdomain systemd[1]: libpod-conmon-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1.scope: Deactivated successfully.
Nov 23 10:04:40 np0005532585.localdomain podman[332151]: 2025-11-23 10:04:40.827687825 +0000 UTC m=+0.131684968 container remove 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:04:41 np0005532585.localdomain sudo[332177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:04:41 np0005532585.localdomain sudo[332177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:04:41 np0005532585.localdomain sudo[332177]: pam_unix(sudo:session): session closed for user root
Nov 23 10:04:41 np0005532585.localdomain sshd[332117]: Received disconnect from 207.154.194.2 port 54192:11: Bye Bye [preauth]
Nov 23 10:04:41 np0005532585.localdomain sshd[332117]: Disconnected from authenticating user root 207.154.194.2 port 54192 [preauth]
Nov 23 10:04:41 np0005532585.localdomain sudo[332195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:04:41 np0005532585.localdomain sudo[332195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:04:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:41.267 263258 INFO neutron.agent.dhcp.agent [None req-109eb951-e910-417c-a956-ee0f2e889ebf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:41 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2673424660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:04:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e182 e182: 6 total, 6 up, 6 in
Nov 23 10:04:41 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-6027bb6fd654c4a054428fb77e5d5ffcb12dbfa91fa97836ded173ca06b20034-merged.mount: Deactivated successfully.
Nov 23 10:04:41 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d43058c00\x2dcde7\x2d48e1\x2d8e0e\x2deba5f793c5ac.mount: Deactivated successfully.
Nov 23 10:04:41 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:41.730 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:41 np0005532585.localdomain sudo[332195]: pam_unix(sudo:session): session closed for user root
Nov 23 10:04:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:04:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:04:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:04:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159324 "" "Go-http-client/1.1"
Nov 23 10:04:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:04:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1"
Nov 23 10:04:42 np0005532585.localdomain sudo[332246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:04:42 np0005532585.localdomain sudo[332246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:04:42 np0005532585.localdomain sudo[332246]: pam_unix(sudo:session): session closed for user root
Nov 23 10:04:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:42.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4c1787fb-1a00-4288-a97a-0f0da441edd9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4c1787fb-1a00-4288-a97a-0f0da441edd9", "format": "json"}]: dispatch
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: pgmap v376: 177 pgs: 177 active+clean; 146 MiB data, 914 MiB used, 41 GiB / 42 GiB avail; 95 KiB/s rd, 17 KiB/s wr, 135 op/s
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: osdmap e182: 6 total, 6 up, 6 in
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:04:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:04:42 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:42.655 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:43 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:43Z|00513|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:04:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e183 e183: 6 total, 6 up, 6 in
Nov 23 10:04:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:43.079 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:04:44 np0005532585.localdomain ceph-mon[300199]: osdmap e183: 6 total, 6 up, 6 in
Nov 23 10:04:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:04:44 np0005532585.localdomain podman[332280]: 2025-11-23 10:04:44.629623721 +0000 UTC m=+0.058419451 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:04:44 np0005532585.localdomain dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 0 addresses
Nov 23 10:04:44 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host
Nov 23 10:04:44 np0005532585.localdomain dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts
Nov 23 10:04:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:44Z|00514|binding|INFO|Releasing lport 8437b436-673f-4dcd-8ddd-08df6599f4ab from this chassis (sb_readonly=0)
Nov 23 10:04:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:44.808 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:44 np0005532585.localdomain kernel: device tap8437b436-67 left promiscuous mode
Nov 23 10:04:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:44Z|00515|binding|INFO|Setting lport 8437b436-673f-4dcd-8ddd-08df6599f4ab down in Southbound
Nov 23 10:04:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:44.828 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e6f3ebd-dc7a-479e-ba09-dcee6cc4d506, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8437b436-673f-4dcd-8ddd-08df6599f4ab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:44.831 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8437b436-673f-4dcd-8ddd-08df6599f4ab in datapath 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 unbound from our chassis
Nov 23 10:04:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:44.831 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:44.833 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:44.834 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[619c4494-baa6-47c7-be0b-420429eed29f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:44 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:44.989 2 INFO neutron.agent.securitygroups_rpc [None req-f2af9951-cbc0-4ee3-8964-70390f4dbee5 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:45 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e184 e184: 6 total, 6 up, 6 in
Nov 23 10:04:45 np0005532585.localdomain ceph-mon[300199]: pgmap v379: 177 pgs: 177 active+clean; 146 MiB data, 914 MiB used, 41 GiB / 42 GiB avail; 105 KiB/s rd, 19 KiB/s wr, 149 op/s
Nov 23 10:04:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "4c1787fb-1a00-4288-a97a-0f0da441edd9", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 23 10:04:45 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/548781773' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:45 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/548781773' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:45 np0005532585.localdomain ceph-mon[300199]: osdmap e184: 6 total, 6 up, 6 in
Nov 23 10:04:45 np0005532585.localdomain podman[332321]: 2025-11-23 10:04:45.746147994 +0000 UTC m=+0.056703819 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:04:45 np0005532585.localdomain dnsmasq[331714]: exiting on receipt of SIGTERM
Nov 23 10:04:45 np0005532585.localdomain systemd[1]: libpod-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457.scope: Deactivated successfully.
Nov 23 10:04:45 np0005532585.localdomain podman[332335]: 2025-11-23 10:04:45.813629567 +0000 UTC m=+0.051940566 container died b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:04:45 np0005532585.localdomain podman[332335]: 2025-11-23 10:04:45.847569959 +0000 UTC m=+0.085880958 container cleanup b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:04:45 np0005532585.localdomain systemd[1]: libpod-conmon-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457.scope: Deactivated successfully.
Nov 23 10:04:45 np0005532585.localdomain podman[332337]: 2025-11-23 10:04:45.898930076 +0000 UTC m=+0.128425919 container remove b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 10:04:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:45.926 263258 INFO neutron.agent.dhcp.agent [None req-293b986b-4bca-460f-b9c6-aacb82472f33 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:45.927 263258 INFO neutron.agent.dhcp.agent [None req-293b986b-4bca-460f-b9c6-aacb82472f33 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:45.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:45.989 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:45 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:45Z|00516|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:04:45 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:45.991 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:04:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:46.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:46 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e185 e185: 6 total, 6 up, 6 in
Nov 23 10:04:46 np0005532585.localdomain dnsmasq[331054]: exiting on receipt of SIGTERM
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: libpod-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72.scope: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain podman[332382]: 2025-11-23 10:04:46.241523276 +0000 UTC m=+0.060298918 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 10:04:46 np0005532585.localdomain podman[332395]: 2025-11-23 10:04:46.309844554 +0000 UTC m=+0.055837633 container died 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:46 np0005532585.localdomain podman[332395]: 2025-11-23 10:04:46.33861463 +0000 UTC m=+0.084607679 container cleanup 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: libpod-conmon-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72.scope: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain podman[332397]: 2025-11-23 10:04:46.390735301 +0000 UTC m=+0.128272275 container remove 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:46.678 263258 INFO neutron.agent.dhcp.agent [None req-2312486c-d399-4809-92a6-da17acbf6088 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-4b86b4f9fba21239c4ef1c266e92b520c83208906cd248f3a235bb0ae25f61f0-merged.mount: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d1afb3f8c\x2de188\x2d49e0\x2db864\x2d15c6a95b0fb6.mount: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-ddc73133ee97a18b5d676ae71aebdcc91b4415d41e32db3c006d1af969d02b28-merged.mount: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:46 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d556c5c2e\x2dc414\x2d4271\x2d8e77\x2dd61a599ccbad.mount: Deactivated successfully.
Nov 23 10:04:47 np0005532585.localdomain ceph-mon[300199]: pgmap v381: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 10 KiB/s wr, 72 op/s
Nov 23 10:04:47 np0005532585.localdomain ceph-mon[300199]: osdmap e185: 6 total, 6 up, 6 in
Nov 23 10:04:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e186 e186: 6 total, 6 up, 6 in
Nov 23 10:04:47 np0005532585.localdomain podman[332441]: 2025-11-23 10:04:47.161113407 +0000 UTC m=+0.056097201 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:47 np0005532585.localdomain dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 0 addresses
Nov 23 10:04:47 np0005532585.localdomain systemd[1]: tmp-crun.vhYg3V.mount: Deactivated successfully.
Nov 23 10:04:47 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host
Nov 23 10:04:47 np0005532585.localdomain dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts
Nov 23 10:04:47 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:47Z|00517|binding|INFO|Releasing lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 from this chassis (sb_readonly=0)
Nov 23 10:04:47 np0005532585.localdomain kernel: device tap1ae8d8fe-51 left promiscuous mode
Nov 23 10:04:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:47.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:47 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:47Z|00518|binding|INFO|Setting lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 down in Southbound
Nov 23 10:04:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:47.331 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:47 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:47.334 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14e80905-5b14-4ffd-9b3d-73c1c6bb812f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=1ae8d8fe-5156-4993-b482-c4b01b921e85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:47 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:47.336 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1ae8d8fe-5156-4993-b482-c4b01b921e85 in datapath 2fa353bd-02c5-4044-adba-b918030b206e unbound from our chassis
Nov 23 10:04:47 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:47.337 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fa353bd-02c5-4044-adba-b918030b206e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:47 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:47.338 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[08e29a15-c222-4202-b15c-d7acbc4ff542]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:47.345 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:04:47 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3161485816' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e187 e187: 6 total, 6 up, 6 in
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3161485816' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: osdmap e186: 6 total, 6 up, 6 in
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4c1787fb-1a00-4288-a97a-0f0da441edd9", "format": "json"}]: dispatch
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4c1787fb-1a00-4288-a97a-0f0da441edd9", "force": true, "format": "json"}]: dispatch
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3161485816' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: osdmap e187: 6 total, 6 up, 6 in
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3161485816' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:04:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:04:48 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:48.561 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: tmp-crun.9iONME.mount: Deactivated successfully.
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:04:49 np0005532585.localdomain podman[332465]: 2025-11-23 10:04:49.039350005 +0000 UTC m=+0.093812726 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350)
Nov 23 10:04:49 np0005532585.localdomain podman[332499]: 2025-11-23 10:04:49.126261923 +0000 UTC m=+0.067928117 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:04:49 np0005532585.localdomain dnsmasq[331198]: exiting on receipt of SIGTERM
Nov 23 10:04:49 np0005532585.localdomain podman[332502]: 2025-11-23 10:04:49.170007771 +0000 UTC m=+0.105594302 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: libpod-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614.scope: Deactivated successfully.
Nov 23 10:04:49 np0005532585.localdomain ceph-mon[300199]: pgmap v384: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 14 KiB/s wr, 98 op/s
Nov 23 10:04:49 np0005532585.localdomain podman[332465]: 2025-11-23 10:04:49.177729514 +0000 UTC m=+0.232192195 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:04:49 np0005532585.localdomain podman[332499]: 2025-11-23 10:04:49.21246537 +0000 UTC m=+0.154131554 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:04:49 np0005532585.localdomain podman[332543]: 2025-11-23 10:04:49.246117444 +0000 UTC m=+0.055330407 container died 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:04:49 np0005532585.localdomain podman[332543]: 2025-11-23 10:04:49.281590663 +0000 UTC m=+0.090803606 container remove 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:04:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:49.312 263258 INFO neutron.agent.dhcp.agent [None req-e41a1c8b-d447-4a66-8a16-b3359df5dcf5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:49 np0005532585.localdomain podman[332501]: 2025-11-23 10:04:49.346065145 +0000 UTC m=+0.281137059 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: libpod-conmon-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614.scope: Deactivated successfully.
Nov 23 10:04:49 np0005532585.localdomain podman[332501]: 2025-11-23 10:04:49.417260519 +0000 UTC m=+0.352332433 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 10:04:49 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:49.418 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:04:49 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:04:49 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:49Z|00519|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:04:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:49.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cf0b02e359ac1721f19ba2c6fc86978494592aa5ddd27403653b646396ae9cc9-merged.mount: Deactivated successfully.
Nov 23 10:04:50 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614-userdata-shm.mount: Deactivated successfully.
Nov 23 10:04:50 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d2fa353bd\x2d02c5\x2d4044\x2dadba\x2db918030b206e.mount: Deactivated successfully.
Nov 23 10:04:51 np0005532585.localdomain ceph-mon[300199]: pgmap v386: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 15 KiB/s wr, 73 op/s
Nov 23 10:04:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:51 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:51.249 2 INFO neutron.agent.securitygroups_rpc [None req-9f10832a-94d7-4e63-bc11-33234c92ec82 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:04:51 np0005532585.localdomain sshd[332585]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:04:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6afacc00-ca40-4edb-aefc-a9b0a3580b7a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:04:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6afacc00-ca40-4edb-aefc-a9b0a3580b7a", "format": "json"}]: dispatch
Nov 23 10:04:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:52.374 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:52.375 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:52 np0005532585.localdomain sshd[332585]: Invalid user bitwarden from 175.126.166.172 port 54510
Nov 23 10:04:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 e188: 6 total, 6 up, 6 in
Nov 23 10:04:53 np0005532585.localdomain sshd[332585]: Received disconnect from 175.126.166.172 port 54510:11: Bye Bye [preauth]
Nov 23 10:04:53 np0005532585.localdomain sshd[332585]: Disconnected from invalid user bitwarden 175.126.166.172 port 54510 [preauth]
Nov 23 10:04:53 np0005532585.localdomain ceph-mon[300199]: pgmap v387: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 11 KiB/s wr, 53 op/s
Nov 23 10:04:53 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:04:53 np0005532585.localdomain ceph-mon[300199]: osdmap e188: 6 total, 6 up, 6 in
Nov 23 10:04:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:04:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:53Z|00520|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:04:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:53.976 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8cc6eb2d-54d1-40d5-92a9-2068282def74", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:04:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8cc6eb2d-54d1-40d5-92a9-2068282def74", "format": "json"}]: dispatch
Nov 23 10:04:55 np0005532585.localdomain ceph-mon[300199]: pgmap v389: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 11 KiB/s wr, 50 op/s
Nov 23 10:04:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "6afacc00-ca40-4edb-aefc-a9b0a3580b7a", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 23 10:04:55 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:55.993 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:04:56 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:56.159 263258 INFO neutron.agent.linux.ip_lib [None req-722cfbd4-d097-45d7-9db4-019a7a1127ef - - - - - -] Device tapb2609925-51 cannot be used as it has no MAC address
Nov 23 10:04:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:56.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:56 np0005532585.localdomain kernel: device tapb2609925-51 entered promiscuous mode
Nov 23 10:04:56 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892296.1956] manager: (tapb2609925-51): new Generic device (/org/freedesktop/NetworkManager/Devices/83)
Nov 23 10:04:56 np0005532585.localdomain systemd-udevd[332597]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:04:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:56Z|00521|binding|INFO|Claiming lport b2609925-5134-4a13-ba79-45e02839b8f7 for this chassis.
Nov 23 10:04:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:56Z|00522|binding|INFO|b2609925-5134-4a13-ba79-45e02839b8f7: Claiming unknown
Nov 23 10:04:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:56.235 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:56.251 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba426e81cfe149da986575955289d04b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e76005b-d8d7-445f-b11a-34d1e82ffc8b, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b2609925-5134-4a13-ba79-45e02839b8f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:56.252 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b2609925-5134-4a13-ba79-45e02839b8f7 in datapath 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece bound to our chassis
Nov 23 10:04:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:56.253 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:04:56 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:56.254 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7695fb0d-b711-46f1-ae6b-018c6507e391]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:56Z|00523|binding|INFO|Setting lport b2609925-5134-4a13-ba79-45e02839b8f7 ovn-installed in OVS
Nov 23 10:04:56 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:04:56Z|00524|binding|INFO|Setting lport b2609925-5134-4a13-ba79-45e02839b8f7 up in Southbound
Nov 23 10:04:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:56.282 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:56.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:56 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:56.342 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:56 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:56.565 2 INFO neutron.agent.securitygroups_rpc [None req-b2e8c328-efaf-49d2-9816-397d8d6e979c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['213f9d65-3629-4053-acee-7e99a128b417']
Nov 23 10:04:57 np0005532585.localdomain podman[332652]: 
Nov 23 10:04:57 np0005532585.localdomain podman[332652]: 2025-11-23 10:04:57.206565867 +0000 UTC m=+0.086242129 container create 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:04:57 np0005532585.localdomain systemd[1]: Started libpod-conmon-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf.scope.
Nov 23 10:04:57 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:04:57 np0005532585.localdomain podman[332652]: 2025-11-23 10:04:57.163808539 +0000 UTC m=+0.043484801 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:04:57 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/deefc43655ec5efae4438c1bba8af5217f7f3c949c030f1c1d0f69134d512516/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:04:57 np0005532585.localdomain ceph-mon[300199]: pgmap v390: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 19 KiB/s wr, 43 op/s
Nov 23 10:04:57 np0005532585.localdomain podman[332652]: 2025-11-23 10:04:57.273651528 +0000 UTC m=+0.153327780 container init 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:04:57 np0005532585.localdomain podman[332652]: 2025-11-23 10:04:57.284149414 +0000 UTC m=+0.163825666 container start 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:04:57 np0005532585.localdomain dnsmasq[332670]: started, version 2.85 cachesize 150
Nov 23 10:04:57 np0005532585.localdomain dnsmasq[332670]: DNS service limited to local subnets
Nov 23 10:04:57 np0005532585.localdomain dnsmasq[332670]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:04:57 np0005532585.localdomain dnsmasq[332670]: warning: no upstream servers configured
Nov 23 10:04:57 np0005532585.localdomain dnsmasq-dhcp[332670]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:04:57 np0005532585.localdomain dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 0 addresses
Nov 23 10:04:57 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host
Nov 23 10:04:57 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts
Nov 23 10:04:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:57.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:57 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:04:57.483 263258 INFO neutron.agent.dhcp.agent [None req-3f05d6d1-f56a-4bd9-9f0f-33675e6c048e - - - - - -] DHCP configuration for ports {'d0dc5460-4915-4418-9406-8337e0482cf3'} is completed
Nov 23 10:04:57 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:57.775 2 INFO neutron.agent.securitygroups_rpc [None req-d7ea7df2-10a0-4360-bfff-447d012be880 6f11688a49fb4deba83327b1cf6539b4 02d402d01a514bbd8ec5543d8bb9b97c - - default default] Security group rule updated ['76c5df30-fcbd-4316-84a0-0d549c3af78d']
Nov 23 10:04:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:04:57 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:04:58 np0005532585.localdomain podman[332671]: 2025-11-23 10:04:58.034243089 +0000 UTC m=+0.090432845 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 10:04:58 np0005532585.localdomain podman[332671]: 2025-11-23 10:04:58.050372325 +0000 UTC m=+0.106562071 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd)
Nov 23 10:04:58 np0005532585.localdomain podman[332672]: 2025-11-23 10:04:58.007980678 +0000 UTC m=+0.062771471 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:04:58 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:04:58 np0005532585.localdomain podman[332672]: 2025-11-23 10:04:58.09234853 +0000 UTC m=+0.147139303 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:04:58 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:04:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6afacc00-ca40-4edb-aefc-a9b0a3580b7a", "format": "json"}]: dispatch
Nov 23 10:04:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6afacc00-ca40-4edb-aefc-a9b0a3580b7a", "force": true, "format": "json"}]: dispatch
Nov 23 10:04:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:04:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:58.586 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5143a1d3-f63b-452c-a57f-85c07d0974c0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d5a5bf10-b94f-4270-9b8b-f5b33fff78ea) old=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:04:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:58.588 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d5a5bf10-b94f-4270-9b8b-f5b33fff78ea in datapath 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe updated
Nov 23 10:04:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:58.591 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:04:58 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:04:58.592 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[07c62593-333d-4f21-bb26-28acd215318a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:04:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:04:58.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:04:59 np0005532585.localdomain ceph-mon[300199]: pgmap v391: 177 pgs: 177 active+clean; 146 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 16 KiB/s wr, 37 op/s
Nov 23 10:04:59 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:04:59.310 2 INFO neutron.agent.securitygroups_rpc [None req-2900c02d-4bae-4668-a3d0-a31f6942bf81 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['213f9d65-3629-4053-acee-7e99a128b417', '05c9de82-0c74-49cb-8524-43dd3dd47f37']
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:04:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:04:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:05:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:00.246 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:59Z, description=, device_id=e37909af-961f-4dfd-8d68-199ed54a6cf8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b45160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b45dc0>], id=29773ea2-bb98-402d-9f2a-492d309a53db, ip_allocation=immediate, mac_address=fa:16:3e:71:7a:d7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:54Z, description=, dns_domain=, id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-2072411749-network, port_security_enabled=True, project_id=ba426e81cfe149da986575955289d04b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42798, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2635, status=ACTIVE, subnets=['82fc8536-8876-439c-9d2b-ecbdd502ad8a'], tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:55Z, vlan_transparent=None, network_id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, port_security_enabled=False, project_id=ba426e81cfe149da986575955289d04b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2673, status=DOWN, tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:59Z on network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece
Nov 23 10:05:00 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:00.283 2 INFO neutron.agent.securitygroups_rpc [None req-31700e8d-00a6-42b0-834e-1388eab5f28c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['05c9de82-0c74-49cb-8524-43dd3dd47f37']
Nov 23 10:05:00 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8cc6eb2d-54d1-40d5-92a9-2068282def74", "format": "json"}]: dispatch
Nov 23 10:05:00 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8cc6eb2d-54d1-40d5-92a9-2068282def74", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/48377154' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/48377154' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:00 np0005532585.localdomain systemd[1]: tmp-crun.MN209f.mount: Deactivated successfully.
Nov 23 10:05:00 np0005532585.localdomain dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 1 addresses
Nov 23 10:05:00 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host
Nov 23 10:05:00 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts
Nov 23 10:05:00 np0005532585.localdomain podman[332731]: 2025-11-23 10:05:00.449721361 +0000 UTC m=+0.047567243 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 10:05:00 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:00.648 263258 INFO neutron.agent.dhcp.agent [None req-edf68da7-5c23-4761-8233-c8fbc7f02102 - - - - - -] DHCP configuration for ports {'29773ea2-bb98-402d-9f2a-492d309a53db'} is completed
Nov 23 10:05:01 np0005532585.localdomain ceph-mon[300199]: pgmap v392: 177 pgs: 177 active+clean; 146 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 17 KiB/s wr, 12 op/s
Nov 23 10:05:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:01 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:01.821 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:59Z, description=, device_id=e37909af-961f-4dfd-8d68-199ed54a6cf8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8be6250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8be6310>], id=29773ea2-bb98-402d-9f2a-492d309a53db, ip_allocation=immediate, mac_address=fa:16:3e:71:7a:d7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:54Z, description=, dns_domain=, id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-2072411749-network, port_security_enabled=True, project_id=ba426e81cfe149da986575955289d04b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42798, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2635, status=ACTIVE, subnets=['82fc8536-8876-439c-9d2b-ecbdd502ad8a'], tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:55Z, vlan_transparent=None, network_id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, port_security_enabled=False, project_id=ba426e81cfe149da986575955289d04b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2673, status=DOWN, tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:59Z on network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece
Nov 23 10:05:02 np0005532585.localdomain dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 1 addresses
Nov 23 10:05:02 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host
Nov 23 10:05:02 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts
Nov 23 10:05:02 np0005532585.localdomain podman[332769]: 2025-11-23 10:05:02.041001446 +0000 UTC m=+0.056225605 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:05:02 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:02.310 263258 INFO neutron.agent.dhcp.agent [None req-55b5b2f1-1ff1-4f38-8fd8-1388baa78d74 - - - - - -] DHCP configuration for ports {'29773ea2-bb98-402d-9f2a-492d309a53db'} is completed
Nov 23 10:05:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9f8bde9a-0660-4024-80ab-8799aae7b4e4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9f8bde9a-0660-4024-80ab-8799aae7b4e4", "format": "json"}]: dispatch
Nov 23 10:05:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:02.472 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:02 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:02.807 2 INFO neutron.agent.securitygroups_rpc [None req-2a25adbb-406f-4488-9290-d86f8fa25b90 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['83b9eb37-6d54-417f-b8aa-c3bd6525a15a']
Nov 23 10:05:03 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:03.116 2 INFO neutron.agent.securitygroups_rpc [None req-48af5a2d-b1ea-400e-a467-c239dff497de 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['83b9eb37-6d54-417f-b8aa-c3bd6525a15a']
Nov 23 10:05:03 np0005532585.localdomain ceph-mon[300199]: pgmap v393: 177 pgs: 177 active+clean; 146 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 17 KiB/s wr, 12 op/s
Nov 23 10:05:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:04 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:04.276 2 INFO neutron.agent.securitygroups_rpc [None req-068f4094-be4e-499c-ac72-326e6af4f870 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:04 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:04.542 2 INFO neutron.agent.securitygroups_rpc [None req-5e059f36-e08b-42c4-9c06-8e7d2c8a7a35 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:04 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:04.743 2 INFO neutron.agent.securitygroups_rpc [None req-4232445f-3fdb-4ab4-af76-a3c636901057 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:04 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:04.978 2 INFO neutron.agent.securitygroups_rpc [None req-bdf55031-0050-45a1-bc2b-6230ff544fa3 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:05 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:05.151 2 INFO neutron.agent.securitygroups_rpc [None req-efdb9984-7352-4eb4-bfb4-32226524bf47 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:05 np0005532585.localdomain ceph-mon[300199]: pgmap v394: 177 pgs: 177 active+clean; 146 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 16 KiB/s wr, 12 op/s
Nov 23 10:05:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9f8bde9a-0660-4024-80ab-8799aae7b4e4", "format": "json"}]: dispatch
Nov 23 10:05:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9f8bde9a-0660-4024-80ab-8799aae7b4e4", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:05 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:05.488 2 INFO neutron.agent.securitygroups_rpc [None req-2d5ed31f-c175-4e95-8ecb-2dfb6c38fae5 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:05 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:05.546 2 INFO neutron.agent.securitygroups_rpc [None req-f12b045a-e1e3-435c-8190-d496fdcf5f2d 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['d1cc26af-765b-45fa-b447-8d13d7399069']
Nov 23 10:05:06 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:06.329 2 INFO neutron.agent.securitygroups_rpc [None req-b7a801b4-8c85-482c-af67-8b6642b94666 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:06 np0005532585.localdomain ceph-mon[300199]: pgmap v395: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 46 op/s
Nov 23 10:05:06 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e189 e189: 6 total, 6 up, 6 in
Nov 23 10:05:06 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:06.867 2 INFO neutron.agent.securitygroups_rpc [None req-0eca1fa9-2850-4efd-aa96-731301b95192 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:07.207 2 INFO neutron.agent.securitygroups_rpc [None req-b76074b6-0a53-43ed-86f9-a06ad1dd7bfb 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:07 np0005532585.localdomain ceph-mon[300199]: osdmap e189: 6 total, 6 up, 6 in
Nov 23 10:05:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:07.508 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:07.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:07.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:05:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:07.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:07.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:07.512 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:07 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:07.585 2 INFO neutron.agent.securitygroups_rpc [None req-5025f2b5-3c8e-4a04-9825-a17ba040ab51 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']
Nov 23 10:05:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:07.789 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5143a1d3-f63b-452c-a57f-85c07d0974c0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d5a5bf10-b94f-4270-9b8b-f5b33fff78ea) old=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:07.791 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d5a5bf10-b94f-4270-9b8b-f5b33fff78ea in datapath 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe updated
Nov 23 10:05:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:07.793 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:05:07 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:07.794 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d7a8f6-9a5c-4947-b3b3-c22484e248a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:05:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e662a31b-8c47-43e3-92e1-6d96aeb393b7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e662a31b-8c47-43e3-92e1-6d96aeb393b7", "format": "json"}]: dispatch
Nov 23 10:05:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:08 np0005532585.localdomain ceph-mon[300199]: pgmap v397: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 53 op/s
Nov 23 10:05:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:08.555 2 INFO neutron.agent.securitygroups_rpc [None req-b418740f-9100-4b44-8438-3f54c0de85da 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['ec91c804-f6c3-4a65-9ba5-93d7c528c909']
Nov 23 10:05:08 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:08.603 2 INFO neutron.agent.securitygroups_rpc [None req-d6be62a7-1074-4184-82e5-b6c7eb9c713f 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['d1cc26af-765b-45fa-b447-8d13d7399069', 'b62406ca-1ad0-471f-83b3-a7b86cb40552', '5041c083-f562-4221-8bac-acacd7a21e13']
Nov 23 10:05:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:05:08 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:05:08 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:08.997 263258 INFO neutron.agent.linux.ip_lib [None req-cd5cdb69-c35a-49c1-ac10-1c8ed85d09c5 - - - - - -] Device tapea48030b-3e cannot be used as it has no MAC address
Nov 23 10:05:09 np0005532585.localdomain systemd[1]: tmp-crun.zTdsre.mount: Deactivated successfully.
Nov 23 10:05:09 np0005532585.localdomain podman[332791]: 2025-11-23 10:05:09.014111269 +0000 UTC m=+0.090818117 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:05:09 np0005532585.localdomain podman[332791]: 2025-11-23 10:05:09.023620565 +0000 UTC m=+0.100327443 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:05:09 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:05:09 np0005532585.localdomain podman[332792]: 2025-11-23 10:05:09.075848089 +0000 UTC m=+0.144856035 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:05:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:09.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:09 np0005532585.localdomain kernel: device tapea48030b-3e entered promiscuous mode
Nov 23 10:05:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:09Z|00525|binding|INFO|Claiming lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 for this chassis.
Nov 23 10:05:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:09Z|00526|binding|INFO|ea48030b-3e14-4f0f-ad85-6ef79695a3f3: Claiming unknown
Nov 23 10:05:09 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892309.0855] manager: (tapea48030b-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/84)
Nov 23 10:05:09 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:09.088 2 INFO neutron.agent.securitygroups_rpc [None req-7eb231ba-0d20-4837-8fea-3273f5df7e61 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['b62406ca-1ad0-471f-83b3-a7b86cb40552', '5041c083-f562-4221-8bac-acacd7a21e13']
Nov 23 10:05:09 np0005532585.localdomain podman[332792]: 2025-11-23 10:05:09.087588062 +0000 UTC m=+0.156596008 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Nov 23 10:05:09 np0005532585.localdomain systemd-udevd[332837]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:05:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:09.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.096 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '472899094c04472c806243e76f122a0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4b884e9-74a6-4227-b3c7-e09f7c6545b9, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ea48030b-3e14-4f0f-ad85-6ef79695a3f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.098 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ea48030b-3e14-4f0f-ad85-6ef79695a3f3 in datapath e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a bound to our chassis
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.100 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 545ac66f-28c5-4366-9ba8-8d6a25ddbb6f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.101 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.101 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[33dcdbfe-21f8-4bbd-ba42-69e916271f87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:05:09 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:05:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:09Z|00527|binding|INFO|Setting lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 ovn-installed in OVS
Nov 23 10:05:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:09Z|00528|binding|INFO|Setting lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 up in Southbound
Nov 23 10:05:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:09.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tapea48030b-3e: No such device
Nov 23 10:05:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:09.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:09.177 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:05:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:05:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1108105406' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1108105406' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:10 np0005532585.localdomain podman[332910]: 
Nov 23 10:05:10 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:10.055 2 INFO neutron.agent.securitygroups_rpc [None req-dc86cddf-8096-4f55-8994-fc155127f219 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['5975f3c0-fffa-4893-9c5f-a50728456ba3']
Nov 23 10:05:10 np0005532585.localdomain podman[332910]: 2025-11-23 10:05:10.059277363 +0000 UTC m=+0.091788836 container create dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:05:10 np0005532585.localdomain podman[332910]: 2025-11-23 10:05:10.015587797 +0000 UTC m=+0.048099300 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:05:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:10.142 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f.scope.
Nov 23 10:05:10 np0005532585.localdomain systemd[1]: tmp-crun.UEmnac.mount: Deactivated successfully.
Nov 23 10:05:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:05:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c01c854a331124ebc60335ecf0975a3084a9f216dced94c1e2f7c47b6d63d45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:05:10 np0005532585.localdomain podman[332910]: 2025-11-23 10:05:10.196794255 +0000 UTC m=+0.229305688 container init dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:05:10 np0005532585.localdomain podman[332910]: 2025-11-23 10:05:10.206423065 +0000 UTC m=+0.238934508 container start dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:05:10 np0005532585.localdomain dnsmasq[332928]: started, version 2.85 cachesize 150
Nov 23 10:05:10 np0005532585.localdomain dnsmasq[332928]: DNS service limited to local subnets
Nov 23 10:05:10 np0005532585.localdomain dnsmasq[332928]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:05:10 np0005532585.localdomain dnsmasq[332928]: warning: no upstream servers configured
Nov 23 10:05:10 np0005532585.localdomain dnsmasq-dhcp[332928]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:05:10 np0005532585.localdomain dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 0 addresses
Nov 23 10:05:10 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host
Nov 23 10:05:10 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts
Nov 23 10:05:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:10.353 263258 INFO neutron.agent.dhcp.agent [None req-4cdcfaab-9ae5-4d66-ae48-3a70e3221ed3 - - - - - -] DHCP configuration for ports {'1e0fd39a-d46f-4ee1-a861-10a0e76d26c2'} is completed
Nov 23 10:05:10 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:10.394 2 INFO neutron.agent.securitygroups_rpc [None req-6f895a1f-dcef-4309-9299-e6c0da113106 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['5975f3c0-fffa-4893-9c5f-a50728456ba3']
Nov 23 10:05:10 np0005532585.localdomain ceph-mon[300199]: pgmap v398: 177 pgs: 177 active+clean; 192 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.1 MiB/s wr, 93 op/s
Nov 23 10:05:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/531532703' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/531532703' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.809 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.840 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.841 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30e318a7-c8cd-48b0-a2e9-22580d2cb513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.810605', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59a21f0-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '8e4916ee45204aa83830e5347fc65add68b211b43456cd6d623acc5a1cea4fed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.810605', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59a34a6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'd3f46f8669f602fb6facec1b7b74986064d4fa0c879170bac3cb277def027e90'}]}, 'timestamp': '2025-11-23 10:05:10.841680', '_unique_id': 'f7740006a9fa48a09832577ff3102c7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.844 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b486a1bf-4393-4619-9d4c-bbd3cf593875', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.844551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59ab5e8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '8cfb74ed7113bacfe5692134079bfb81b40cdb2b5ad869c292e6e69e56d4fcc0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.844551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59aca06-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'd0d21f175a127e6f180b8bf52e3a9a7f0074c3852d93b22742e6f877aaed5d84'}]}, 'timestamp': '2025-11-23 10:05:10.845496', '_unique_id': '345ea9358e194c8ebc2c7d66e996a5d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.847 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd497f729-cf12-4389-a025-c74a6eaa2674', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.847771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59b355e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'd765b8d41424bd32b1caa2713dbe2378c8845ae46bfdf8a6bd71591b39cda49f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.847771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59b45da-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'ffed5527062941412b8804800b9e588742392ff27782a261b852c207dc758654'}]}, 'timestamp': '2025-11-23 10:05:10.848665', '_unique_id': '5277f760838c4c30a0ea3fcda88a4eb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.862 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '815b8162-15a3-4178-b9b9-48069b720baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.850919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59d7904-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '53828373298b08615ae89c21efd1a51a108be8f6451299beb347a87133e01d91'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.850919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59d8af2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '41c93ae05624d78e2f56eb60ef41abb9a0b63f5071ee914a0297baa290cf3194'}]}, 'timestamp': '2025-11-23 10:05:10.863536', '_unique_id': '85afb868c8584e419fdb1f6008a1310f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.865 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b52ae5c3-0d12-453d-9fa7-9df382eae2b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.865801', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e59e7020-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '635ea59849ac537a2d512cc0ef5e7aa98bc8370787ccd8dceed7028bcd5f9bb7'}]}, 'timestamp': '2025-11-23 10:05:10.869438', '_unique_id': '11a749c872b244e29467858ad59e1752'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76066f1b-5986-4ad8-93a7-bd81c08d44ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.872293', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e59ef1e4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': 'e48c472844f980739305b21fa2a8be2fc8d81c39756de029618958104b9fd2db'}]}, 'timestamp': '2025-11-23 10:05:10.872755', '_unique_id': 'f4761d0f6b3b44e18cb39a637c390c6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.874 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.875 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '907ce259-1c3e-4008-b56c-e9a72406f153', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.874934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59f58dc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '5e3da032c6790ef716b267fb5373fb78562358ac3f015fdd778b9315568a6e5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.874934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59f6ade-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '8e53b4340c02f10917cc9a00014256ff844a182c555747e29af32a3a487a057c'}]}, 'timestamp': '2025-11-23 10:05:10.875825', '_unique_id': 'ee124783ac9847c484ba2a2c3e99f64e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.878 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.878 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1733a183-22c6-4786-923d-2bdb572a2011', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.878009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59fd226-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '5a186de4c51c80447379ac433e4defd6aeb298c0c2694843cd6acbd69108b0f3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.878009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59fe522-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '55aadd9d516e13323e3c17822088baa4e9db7610b942debd40b9260f71803b46'}]}, 'timestamp': '2025-11-23 10:05:10.878988', '_unique_id': '1002f5924bfc46feb8ef55eb6d91afa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5acf63cb-808d-4613-8a79-62129796f565', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.881183', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a04d5a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '6028cfad1aaf2276c4a80a23fa7de7b753715605fa5b710cf28f9f4559b24827'}]}, 'timestamp': '2025-11-23 10:05:10.881650', '_unique_id': 'c5fdcc54b3a240fb9c55ef4c447f9873'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfd2b5e2-f654-4971-8532-acfd334c5eae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.883820', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a0b52e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '783477a4d46bfa91949cb5563812cf7ba743427df333961db9b4a341a8b77892'}]}, 'timestamp': '2025-11-23 10:05:10.884304', '_unique_id': '6cdf89c0cccd474dbbfc0dafab9ceeb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4109ca3-55c4-415d-96a1-76524af328fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.886509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5a11cc6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'c53f1cfb2c9c6a4fc65216f3430325c0f2157dc4b059f65e47a6b5455569e231'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.886509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5a12e3c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'eeed5ae09b790780c86a0fdfe3f22cb5e1a947dfa27cbbe3e23fab94f3fcda9f'}]}, 'timestamp': '2025-11-23 10:05:10.887377', '_unique_id': 'f1ded88487004b37800e70f514c90965'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:10.898 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:10Z, description=, device_id=da40d6c5-a255-43a6-9fbf-e2238a7bac71, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afc160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afc850>], id=d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e, ip_allocation=immediate, mac_address=fa:16:3e:95:5c:c7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:06Z, description=, dns_domain=, id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1312470054-network, port_security_enabled=True, project_id=472899094c04472c806243e76f122a0f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11483, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2738, status=ACTIVE, subnets=['ca34f827-438d-47f2-925f-0d6f76807026'], tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:07Z, vlan_transparent=None, network_id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, port_security_enabled=False, project_id=472899094c04472c806243e76f122a0f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2758, status=DOWN, tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:10Z on network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 17880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96487d20-24b4-4e20-a458-3f6e75513dc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17880000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:05:10.889553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e5a46016-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.085355213, 'message_signature': 'c7ac373053712cd9c881ad1864f9ce2a45e0f3227c247bac4e5a99ad17478213'}]}, 'timestamp': '2025-11-23 10:05:10.908333', '_unique_id': '324a4cb4f28c42e6860b08d6d78015a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f77bca34-9131-44d3-8660-9492fe6f1fc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.911148', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a4e0b8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '81a2f2c8e5af16473a36059e0132272857365929619ad093989eccd5973ed07b'}]}, 'timestamp': '2025-11-23 10:05:10.911639', '_unique_id': '9f7f89b68e494a52a0f6fe112a5746f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.913 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b297c1d-8569-4d40-963e-a328e0cb95db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.913730', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a5454e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': 'd80b2a004fc439eb7323471bbbfea8746dc4f14283e7152884e9046cae44d513'}]}, 'timestamp': '2025-11-23 10:05:10.914247', '_unique_id': '2bbe8d04bf6040ec9412f03abf6ca941'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.916 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68e4c23c-5ec5-447e-aa4a-d9e0411ce1d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:05:10.916486', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e5a5b150-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.085355213, 'message_signature': 'abe35384c83561b87efb893158f5fa6f6c3954c7eb50146ba241b5f66b8d7c45'}]}, 'timestamp': '2025-11-23 10:05:10.916994', '_unique_id': '5b6031355864442081664de2919a573f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e233b881-d6b4-4e0d-be44-449461b117f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.919278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5a61ce4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '52ce43ca2012de31c02264bf8ba9e9ef3b12d8a0befe54dd328428cb053d8c1a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.919278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5a62e1e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': 'e8142501eb5c0dc1ecad5a04af7ffdd4a32a7213ef9495dca755fe1727873854'}]}, 'timestamp': '2025-11-23 10:05:10.920145', '_unique_id': '1fd719e9277f4c308d926ed40a51761b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35a1ca82-4183-4115-bd57-790c0efcfb82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.922453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5a6994e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '94564295510375d450d797d8842800642a1d6b95e987250a303991f8e4b2fbf7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.922453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5a6ab6e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'e46265e0ea4c8d8efa42e969fa6320b27557e6903769d9b6e62e063d9ebbea48'}]}, 'timestamp': '2025-11-23 10:05:10.923351', '_unique_id': '2745ad5979c2445b810a05a74afc8d4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2f80e2-5f2c-4608-a62b-96d0c143d022', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.925813', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a71eb4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '85d8059cc3984fe45d4063a8029983559a7d42d738a29cda85ebeccf431019ef'}]}, 'timestamp': '2025-11-23 10:05:10.926396', '_unique_id': 'b9a7e5b522ed45f8ae52a2308cf20aef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd12baa51-a95a-4d09-8523-73e08b16aba2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.928261', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a77986-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '33311baa2b38ddea0d59ed3c3d4f062efa19bdcc0e731e52fba88820b6d930c4'}]}, 'timestamp': '2025-11-23 10:05:10.928582', '_unique_id': 'a2ed77f110f84b0a832ceabc173eef68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0fe9eb5-7130-4b34-8c98-103b71658426', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.930012', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a7be28-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': 'd573b08041567423bb22d5b4db0b9a9b88ae20b301af91f5d383c41d18dfac09'}]}, 'timestamp': '2025-11-23 10:05:10.930324', '_unique_id': 'ad06fdf9ffc346b08866dfc14068a3d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aa6766e-9ab5-4760-9f27-47664cd0d488', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.931599', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a7fb18-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '0fd80ea625ed5ccd7ee42daa5783ff868044575fd9494cab3af34c3cf6513a82'}]}, 'timestamp': '2025-11-23 10:05:10.931920', '_unique_id': '4bd1b3ba958844f18ffffde106563e0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:05:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:05:11 np0005532585.localdomain dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 1 addresses
Nov 23 10:05:11 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host
Nov 23 10:05:11 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts
Nov 23 10:05:11 np0005532585.localdomain podman[332945]: 2025-11-23 10:05:11.074070952 +0000 UTC m=+0.049993537 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 10:05:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:11.338 263258 INFO neutron.agent.dhcp.agent [None req-a5227715-408f-4966-8187-45db71ac9c74 - - - - - -] DHCP configuration for ports {'d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e'} is completed
Nov 23 10:05:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e662a31b-8c47-43e3-92e1-6d96aeb393b7", "format": "json"}]: dispatch
Nov 23 10:05:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e662a31b-8c47-43e3-92e1-6d96aeb393b7", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:11 np0005532585.localdomain ceph-mon[300199]: pgmap v399: 177 pgs: 177 active+clean; 192 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.1 MiB/s wr, 93 op/s
Nov 23 10:05:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:05:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:05:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:05:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157510 "" "Go-http-client/1.1"
Nov 23 10:05:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:05:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19723 "" "Go-http-client/1.1"
Nov 23 10:05:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:12.279 2 INFO neutron.agent.securitygroups_rpc [None req-9a1758e4-5d44-475c-9640-9981332a110e 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['2786fa44-4779-49f0-84bb-2a9d4bed5cef']
Nov 23 10:05:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:12.357 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:10Z, description=, device_id=da40d6c5-a255-43a6-9fbf-e2238a7bac71, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a53cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8a532b0>], id=d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e, ip_allocation=immediate, mac_address=fa:16:3e:95:5c:c7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:06Z, description=, dns_domain=, id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1312470054-network, port_security_enabled=True, project_id=472899094c04472c806243e76f122a0f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11483, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2738, status=ACTIVE, subnets=['ca34f827-438d-47f2-925f-0d6f76807026'], tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:07Z, vlan_transparent=None, network_id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, port_security_enabled=False, project_id=472899094c04472c806243e76f122a0f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2758, status=DOWN, tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:10Z on network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a
Nov 23 10:05:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:12.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:12 np0005532585.localdomain dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 1 addresses
Nov 23 10:05:12 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host
Nov 23 10:05:12 np0005532585.localdomain podman[332984]: 2025-11-23 10:05:12.562122926 +0000 UTC m=+0.068744521 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:05:12 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts
Nov 23 10:05:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 e190: 6 total, 6 up, 6 in
Nov 23 10:05:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:12.606 2 INFO neutron.agent.securitygroups_rpc [None req-52408a28-3173-4dbe-afae-862affbfdc2f 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 10:05:12 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:12.799 2 INFO neutron.agent.securitygroups_rpc [None req-4dab38c7-a234-4065-b71c-d9440ee4c0cc 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['2786fa44-4779-49f0-84bb-2a9d4bed5cef']
Nov 23 10:05:12 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:12.866 263258 INFO neutron.agent.dhcp.agent [None req-1ee21828-7257-45e6-bf87-5d39c00e455d - - - - - -] DHCP configuration for ports {'d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e'} is completed
Nov 23 10:05:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:13 np0005532585.localdomain ceph-mon[300199]: osdmap e190: 6 total, 6 up, 6 in
Nov 23 10:05:13 np0005532585.localdomain ceph-mon[300199]: pgmap v401: 177 pgs: 177 active+clean; 192 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 9.7 KiB/s wr, 62 op/s
Nov 23 10:05:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:14.059 2 INFO neutron.agent.securitygroups_rpc [None req-01487171-b358-4961-aa9c-b003fa4396a5 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']
Nov 23 10:05:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:14.300 2 INFO neutron.agent.securitygroups_rpc [None req-9dfbfcf8-cd5f-4ab2-ad68-710c1a723a6d 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']
Nov 23 10:05:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:14.500 2 INFO neutron.agent.securitygroups_rpc [None req-85d9c75a-2523-4c6c-82af-38821b506d6b 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']
Nov 23 10:05:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:14.745 2 INFO neutron.agent.securitygroups_rpc [None req-82a95b4d-7b43-4234-8627-52e2e708ade0 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']
Nov 23 10:05:14 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:14.943 2 INFO neutron.agent.securitygroups_rpc [None req-b4af07d0-a11f-4847-a586-dfc78999259e 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "f1a4d19f-7421-4f15-9ed3-2d6971368042", "format": "json"}]: dispatch
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/62602639' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/62602639' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:15 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:15.147 2 INFO neutron.agent.securitygroups_rpc [None req-5c022b4a-ac2f-4704-91ea-0edb1d6cec16 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1156921035' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:15 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1156921035' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:15 np0005532585.localdomain neutron_sriov_agent[256124]: 2025-11-23 10:05:15.781 2 INFO neutron.agent.securitygroups_rpc [None req-f781264c-3a54-454a-a5fe-8867df4ebfe6 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['acd8c1db-c86a-40f9-91ab-30bd6f26d43e']
Nov 23 10:05:16 np0005532585.localdomain ceph-mon[300199]: pgmap v402: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 17 KiB/s wr, 115 op/s
Nov 23 10:05:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1156921035' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:16 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1156921035' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b0d4ea08-4592-4af4-b78a-914919545708", "format": "json"}]: dispatch
Nov 23 10:05:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2336678244' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2336678244' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:17.586 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:17.587 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:17.587 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:05:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:17.587 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:17.588 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:17.589 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:18 np0005532585.localdomain ceph-mon[300199]: pgmap v403: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 15 KiB/s wr, 104 op/s
Nov 23 10:05:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b0d4ea08-4592-4af4-b78a-914919545708", "format": "json"}]: dispatch
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "f1a4d19f-7421-4f15-9ed3-2d6971368042_6a97bbfd-a02a-43cc-a083-440db87e5f59", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2599463425' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:19 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2599463425' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:05:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:05:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:05:20 np0005532585.localdomain podman[333008]: 2025-11-23 10:05:20.034028413 +0000 UTC m=+0.077377542 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm)
Nov 23 10:05:20 np0005532585.localdomain podman[333008]: 2025-11-23 10:05:20.051363246 +0000 UTC m=+0.094712445 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc.)
Nov 23 10:05:20 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:05:20 np0005532585.localdomain podman[333007]: 2025-11-23 10:05:20.13618416 +0000 UTC m=+0.184643633 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 23 10:05:20 np0005532585.localdomain podman[333007]: 2025-11-23 10:05:20.174579597 +0000 UTC m=+0.223039160 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:05:20 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:05:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "f1a4d19f-7421-4f15-9ed3-2d6971368042", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:20 np0005532585.localdomain ceph-mon[300199]: pgmap v404: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 16 KiB/s wr, 96 op/s
Nov 23 10:05:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/436952743' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2599463425' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:20 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2599463425' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:20 np0005532585.localdomain podman[333006]: 2025-11-23 10:05:20.248331439 +0000 UTC m=+0.300142983 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:05:20 np0005532585.localdomain podman[333006]: 2025-11-23 10:05:20.290309153 +0000 UTC m=+0.342120667 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:05:20 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.819 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.820 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.820 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.821 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:05:20 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:20.895 263258 INFO neutron.agent.linux.ip_lib [None req-59dba442-d17d-4a0b-984d-3b50180a6fbc - - - - - -] Device tap16ecd221-18 cannot be used as it has no MAC address
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.909 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.909 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.910 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.910 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:20 np0005532585.localdomain kernel: device tap16ecd221-18 entered promiscuous mode
Nov 23 10:05:20 np0005532585.localdomain systemd-udevd[333077]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:05:20 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892320.9297] manager: (tap16ecd221-18): new Generic device (/org/freedesktop/NetworkManager/Devices/85)
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.929 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:20Z|00529|binding|INFO|Claiming lport 16ecd221-1860-4329-907f-7a09499e197a for this chassis.
Nov 23 10:05:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:20Z|00530|binding|INFO|16ecd221-1860-4329-907f-7a09499e197a: Claiming unknown
Nov 23 10:05:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:20.946 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a088503b43e94251822e3c0e9006a74e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d787041c-8601-4ac7-8f35-9655cbd443c2, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=16ecd221-1860-4329-907f-7a09499e197a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:20.948 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 16ecd221-1860-4329-907f-7a09499e197a in datapath ce1d6d57-d515-4264-bb5e-663446a4e7d2 bound to our chassis
Nov 23 10:05:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:20.950 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ce1d6d57-d515-4264-bb5e-663446a4e7d2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:05:20 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:20.952 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd9da97-4c93-496f-bb7a-b811220a5cf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:05:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:20Z|00531|binding|INFO|Setting lport 16ecd221-1860-4329-907f-7a09499e197a ovn-installed in OVS
Nov 23 10:05:20 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:20Z|00532|binding|INFO|Setting lport 16ecd221-1860-4329-907f-7a09499e197a up in Southbound
Nov 23 10:05:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:20.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.046 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:21 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1820956695' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.381 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.399 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.399 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.400 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.400 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.401 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:21.401 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:05:21 np0005532585.localdomain podman[333132]: 
Nov 23 10:05:21 np0005532585.localdomain podman[333132]: 2025-11-23 10:05:21.88002896 +0000 UTC m=+0.078582738 container create 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:05:21 np0005532585.localdomain systemd[1]: Started libpod-conmon-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce.scope.
Nov 23 10:05:21 np0005532585.localdomain podman[333132]: 2025-11-23 10:05:21.834276712 +0000 UTC m=+0.032830450 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:05:21 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:05:21 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abd93c1c89432cbaa2ac778bf0b96cd2b95c4cd061e47c6333d9a0aa2bd7ba03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:05:21 np0005532585.localdomain podman[333132]: 2025-11-23 10:05:21.951521904 +0000 UTC m=+0.150075602 container init 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:05:21 np0005532585.localdomain podman[333132]: 2025-11-23 10:05:21.957827064 +0000 UTC m=+0.156380752 container start 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:05:21 np0005532585.localdomain dnsmasq[333150]: started, version 2.85 cachesize 150
Nov 23 10:05:21 np0005532585.localdomain dnsmasq[333150]: DNS service limited to local subnets
Nov 23 10:05:21 np0005532585.localdomain dnsmasq[333150]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:05:21 np0005532585.localdomain dnsmasq[333150]: warning: no upstream servers configured
Nov 23 10:05:21 np0005532585.localdomain dnsmasq-dhcp[333150]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 10:05:21 np0005532585.localdomain dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 0 addresses
Nov 23 10:05:21 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host
Nov 23 10:05:21 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts
Nov 23 10:05:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.010 263258 INFO neutron.agent.dhcp.agent [None req-59dba442-d17d-4a0b-984d-3b50180a6fbc - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:20Z, description=, device_id=292abd12-0f11-4ce1-b8f1-44a94fd7bb57, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad9970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad9ca0>], id=aeaf694d-46ce-4fce-8629-1cf86abe6f8c, ip_allocation=immediate, mac_address=fa:16:3e:db:08:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:18Z, description=, dns_domain=, id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1513913109, port_security_enabled=True, project_id=a088503b43e94251822e3c0e9006a74e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2605, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2814, status=ACTIVE, subnets=['60594d6c-cb2d-4d1f-a4ca-7034647d3116'], tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:19Z, vlan_transparent=None, network_id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, port_security_enabled=False, project_id=a088503b43e94251822e3c0e9006a74e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2829, status=DOWN, tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:20Z on network ce1d6d57-d515-4264-bb5e-663446a4e7d2
Nov 23 10:05:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.064 263258 INFO neutron.agent.dhcp.agent [None req-667050e8-d47e-4830-885d-d7c9660deeb5 - - - - - -] DHCP configuration for ports {'b062f273-d4d6-4ada-a8e8-ed6a5f68566e'} is completed
Nov 23 10:05:22 np0005532585.localdomain systemd[1]: tmp-crun.yM89WP.mount: Deactivated successfully.
Nov 23 10:05:22 np0005532585.localdomain podman[333168]: 2025-11-23 10:05:22.187367878 +0000 UTC m=+0.068015529 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:05:22 np0005532585.localdomain dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 1 addresses
Nov 23 10:05:22 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host
Nov 23 10:05:22 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts
Nov 23 10:05:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b0d4ea08-4592-4af4-b78a-914919545708", "format": "json"}]: dispatch
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b0d4ea08-4592-4af4-b78a-914919545708", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: pgmap v405: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 16 KiB/s wr, 96 op/s
Nov 23 10:05:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.338 263258 INFO neutron.agent.dhcp.agent [None req-59dba442-d17d-4a0b-984d-3b50180a6fbc - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:20Z, description=, device_id=292abd12-0f11-4ce1-b8f1-44a94fd7bb57, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cddb50>], id=aeaf694d-46ce-4fce-8629-1cf86abe6f8c, ip_allocation=immediate, mac_address=fa:16:3e:db:08:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:18Z, description=, dns_domain=, id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1513913109, port_security_enabled=True, project_id=a088503b43e94251822e3c0e9006a74e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2605, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2814, status=ACTIVE, subnets=['60594d6c-cb2d-4d1f-a4ca-7034647d3116'], tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:19Z, vlan_transparent=None, network_id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, port_security_enabled=False, project_id=a088503b43e94251822e3c0e9006a74e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2829, status=DOWN, tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:20Z on network ce1d6d57-d515-4264-bb5e-663446a4e7d2
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2974038667' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:22 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2974038667' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.448 263258 INFO neutron.agent.dhcp.agent [None req-1dc986a0-ecf1-47e1-bc88-f4b04ec83302 - - - - - -] DHCP configuration for ports {'aeaf694d-46ce-4fce-8629-1cf86abe6f8c'} is completed
Nov 23 10:05:22 np0005532585.localdomain dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 1 addresses
Nov 23 10:05:22 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host
Nov 23 10:05:22 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts
Nov 23 10:05:22 np0005532585.localdomain podman[333204]: 2025-11-23 10:05:22.509282416 +0000 UTC m=+0.049744790 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:05:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:22.627 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:22 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.841 263258 INFO neutron.agent.dhcp.agent [None req-0a07625c-9932-4fdb-99aa-828a4cfb891b - - - - - -] DHCP configuration for ports {'aeaf694d-46ce-4fce-8629-1cf86abe6f8c'} is completed
Nov 23 10:05:23 np0005532585.localdomain systemd[1]: tmp-crun.g3OrDg.mount: Deactivated successfully.
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.235 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.236 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:05:23 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2974038667' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:23 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2974038667' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:05:23 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/597883095' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.700 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:05:23 np0005532585.localdomain dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 0 addresses
Nov 23 10:05:23 np0005532585.localdomain podman[333261]: 2025-11-23 10:05:23.709066737 +0000 UTC m=+0.055533803 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:05:23 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host
Nov 23 10:05:23 np0005532585.localdomain dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.772 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.773 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:05:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:23Z|00533|binding|INFO|Releasing lport 16ecd221-1860-4329-907f-7a09499e197a from this chassis (sb_readonly=0)
Nov 23 10:05:23 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:23Z|00534|binding|INFO|Setting lport 16ecd221-1860-4329-907f-7a09499e197a down in Southbound
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.934 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:23 np0005532585.localdomain kernel: device tap16ecd221-18 left promiscuous mode
Nov 23 10:05:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:23.944 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a088503b43e94251822e3c0e9006a74e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d787041c-8601-4ac7-8f35-9655cbd443c2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=16ecd221-1860-4329-907f-7a09499e197a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:23.946 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 16ecd221-1860-4329-907f-7a09499e197a in datapath ce1d6d57-d515-4264-bb5e-663446a4e7d2 unbound from our chassis
Nov 23 10:05:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:23.947 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ce1d6d57-d515-4264-bb5e-663446a4e7d2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 10:05:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:23.948 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[757e03c1-cef9-48e7-afca-497e276fefc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:05:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:23.955 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.023 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.024 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11109MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.024 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.024 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.100 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.100 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.100 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.145 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: pgmap v406: 177 pgs: 177 active+clean; 192 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 15 KiB/s wr, 88 op/s
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2942325306' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/597883095' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e191 e191: 6 total, 6 up, 6 in
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2996125607' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.650 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.657 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.671 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.673 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:05:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:24.674 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/620567410' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:24 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/620567410' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "89bb6ad8-e75c-4c65-a495-a2d153e513e0", "format": "json"}]: dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "snap_name": "5aacd1b6-f8f5-4003-8630-0121025e58d0_0a0bb169-42ba-4ed1-8452-a7671d059061", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "snap_name": "5aacd1b6-f8f5-4003-8630-0121025e58d0", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: osdmap e191: 6 total, 6 up, 6 in
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2092633703' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2996125607' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/620567410' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/620567410' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:26 np0005532585.localdomain ceph-mon[300199]: pgmap v408: 177 pgs: 4 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 169 active+clean; 193 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 22 KiB/s wr, 77 op/s
Nov 23 10:05:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e192 e192: 6 total, 6 up, 6 in
Nov 23 10:05:27 np0005532585.localdomain ceph-mon[300199]: osdmap e192: 6 total, 6 up, 6 in
Nov 23 10:05:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:27.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:27 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3318661139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:27 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3318661139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e193 e193: 6 total, 6 up, 6 in
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "format": "json"}]: dispatch
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7e0a27d8-81b8-442f-a3db-fa38a09d28ae", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "b6b84a30-76d0-4c7a-ba18-7c9a32f848c3", "format": "json"}]: dispatch
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: pgmap v410: 177 pgs: 4 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 169 active+clean; 193 MiB data, 959 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 17 KiB/s wr, 44 op/s
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3318661139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3318661139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: osdmap e193: 6 total, 6 up, 6 in
Nov 23 10:05:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:05:28 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:05:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e194 e194: 6 total, 6 up, 6 in
Nov 23 10:05:29 np0005532585.localdomain systemd[1]: tmp-crun.0crp4V.mount: Deactivated successfully.
Nov 23 10:05:29 np0005532585.localdomain dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 0 addresses
Nov 23 10:05:29 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host
Nov 23 10:05:29 np0005532585.localdomain dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts
Nov 23 10:05:29 np0005532585.localdomain podman[333348]: 2025-11-23 10:05:29.157498659 +0000 UTC m=+0.074797303 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:05:29 np0005532585.localdomain podman[333312]: 2025-11-23 10:05:29.158584792 +0000 UTC m=+0.164274199 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:05:29 np0005532585.localdomain podman[333312]: 2025-11-23 10:05:29.196274417 +0000 UTC m=+0.201963784 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 23 10:05:29 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:05:29 np0005532585.localdomain podman[333313]: 2025-11-23 10:05:29.290323751 +0000 UTC m=+0.292743700 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:05:29 np0005532585.localdomain podman[333313]: 2025-11-23 10:05:29.298706093 +0000 UTC m=+0.301126052 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:05:29 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:05:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:29Z|00535|binding|INFO|Releasing lport b2609925-5134-4a13-ba79-45e02839b8f7 from this chassis (sb_readonly=0)
Nov 23 10:05:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:29Z|00536|binding|INFO|Setting lport b2609925-5134-4a13-ba79-45e02839b8f7 down in Southbound
Nov 23 10:05:29 np0005532585.localdomain kernel: device tapb2609925-51 left promiscuous mode
Nov 23 10:05:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:29.344 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:29.354 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba426e81cfe149da986575955289d04b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e76005b-d8d7-445f-b11a-34d1e82ffc8b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=b2609925-5134-4a13-ba79-45e02839b8f7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:29.356 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b2609925-5134-4a13-ba79-45e02839b8f7 in datapath 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece unbound from our chassis
Nov 23 10:05:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:29.358 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:05:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:29.359 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a5fb5d14-41a3-42e7-94e8-7c05e9f79f0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:05:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:29.365 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:29 np0005532585.localdomain dnsmasq[333150]: exiting on receipt of SIGTERM
Nov 23 10:05:29 np0005532585.localdomain podman[333405]: 2025-11-23 10:05:29.480596462 +0000 UTC m=+0.061417101 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:05:29 np0005532585.localdomain systemd[1]: libpod-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce.scope: Deactivated successfully.
Nov 23 10:05:29 np0005532585.localdomain podman[333420]: 2025-11-23 10:05:29.550970632 +0000 UTC m=+0.054402020 container died 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 10:05:29 np0005532585.localdomain podman[333420]: 2025-11-23 10:05:29.583057108 +0000 UTC m=+0.086488456 container cleanup 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 10:05:29 np0005532585.localdomain systemd[1]: libpod-conmon-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce.scope: Deactivated successfully.
Nov 23 10:05:29 np0005532585.localdomain podman[333421]: 2025-11-23 10:05:29.628019043 +0000 UTC m=+0.126423059 container remove 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:05:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:29.669 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:05:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:29.875 263258 INFO neutron.agent.dhcp.agent [None req-10282890-f679-4351-a9be-ff391504968e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:05:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:05:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e195 e195: 6 total, 6 up, 6 in
Nov 23 10:05:30 np0005532585.localdomain ceph-mon[300199]: osdmap e194: 6 total, 6 up, 6 in
Nov 23 10:05:30 np0005532585.localdomain ceph-mon[300199]: pgmap v413: 177 pgs: 177 active+clean; 193 MiB data, 985 MiB used, 41 GiB / 42 GiB avail; 130 KiB/s rd, 51 KiB/s wr, 187 op/s
Nov 23 10:05:30 np0005532585.localdomain systemd[1]: tmp-crun.xy8Mmi.mount: Deactivated successfully.
Nov 23 10:05:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-abd93c1c89432cbaa2ac778bf0b96cd2b95c4cd061e47c6333d9a0aa2bd7ba03-merged.mount: Deactivated successfully.
Nov 23 10:05:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce-userdata-shm.mount: Deactivated successfully.
Nov 23 10:05:30 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2dce1d6d57\x2dd515\x2d4264\x2dbb5e\x2d663446a4e7d2.mount: Deactivated successfully.
Nov 23 10:05:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:30.539 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:05:30 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:30Z|00537|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:05:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:30.847 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:31 np0005532585.localdomain ceph-mon[300199]: osdmap e195: 6 total, 6 up, 6 in
Nov 23 10:05:31 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:31Z|00538|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:05:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:31.246 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:31 np0005532585.localdomain dnsmasq[332670]: exiting on receipt of SIGTERM
Nov 23 10:05:31 np0005532585.localdomain systemd[1]: libpod-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf.scope: Deactivated successfully.
Nov 23 10:05:31 np0005532585.localdomain podman[333466]: 2025-11-23 10:05:31.887400313 +0000 UTC m=+0.061603986 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:05:31 np0005532585.localdomain podman[333480]: 2025-11-23 10:05:31.956514525 +0000 UTC m=+0.058750001 container died 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 10:05:31 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf-userdata-shm.mount: Deactivated successfully.
Nov 23 10:05:31 np0005532585.localdomain podman[333480]: 2025-11-23 10:05:31.987764647 +0000 UTC m=+0.090000083 container cleanup 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:05:31 np0005532585.localdomain systemd[1]: libpod-conmon-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf.scope: Deactivated successfully.
Nov 23 10:05:32 np0005532585.localdomain podman[333487]: 2025-11-23 10:05:32.038377542 +0000 UTC m=+0.126826073 container remove 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 10:05:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:32.142 263258 INFO neutron.agent.dhcp.agent [None req-25b19362-8d07-456c-bea7-27fcff6e65aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:05:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "faed27e0-c4c3-4299-836c-04ce6fc7a80d", "format": "json"}]: dispatch
Nov 23 10:05:32 np0005532585.localdomain ceph-mon[300199]: pgmap v415: 177 pgs: 177 active+clean; 193 MiB data, 985 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 25 KiB/s wr, 119 op/s
Nov 23 10:05:32 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e196 e196: 6 total, 6 up, 6 in
Nov 23 10:05:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:32.295 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:05:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:32.719 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:32.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:32 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-deefc43655ec5efae4438c1bba8af5217f7f3c949c030f1c1d0f69134d512516-merged.mount: Deactivated successfully.
Nov 23 10:05:32 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d7b3a7ba3\x2de63c\x2d4e55\x2da6fc\x2d444dc25aaece.mount: Deactivated successfully.
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e197 e197: 6 total, 6 up, 6 in
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: osdmap e196: 6 total, 6 up, 6 in
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1626370391' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1626370391' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: osdmap e197: 6 total, 6 up, 6 in
Nov 23 10:05:33 np0005532585.localdomain dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 0 addresses
Nov 23 10:05:33 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host
Nov 23 10:05:33 np0005532585.localdomain dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts
Nov 23 10:05:33 np0005532585.localdomain podman[333527]: 2025-11-23 10:05:33.283066015 +0000 UTC m=+0.060570696 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:05:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:33 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:33Z|00539|binding|INFO|Releasing lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 from this chassis (sb_readonly=0)
Nov 23 10:05:33 np0005532585.localdomain kernel: device tapea48030b-3e left promiscuous mode
Nov 23 10:05:33 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:33Z|00540|binding|INFO|Setting lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 down in Southbound
Nov 23 10:05:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:33.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:33.479 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '472899094c04472c806243e76f122a0f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4b884e9-74a6-4227-b3c7-e09f7c6545b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=ea48030b-3e14-4f0f-ad85-6ef79695a3f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:33.481 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ea48030b-3e14-4f0f-ad85-6ef79695a3f3 in datapath e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a unbound from our chassis
Nov 23 10:05:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:33.482 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:05:33 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:33.483 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[49c4bdc5-f2e9-409e-b9d6-f039fee9c6d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:05:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:33.494 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:34 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f9268685-6943-4f1d-b505-be8ba848d173", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:34 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f9268685-6943-4f1d-b505-be8ba848d173", "format": "json"}]: dispatch
Nov 23 10:05:34 np0005532585.localdomain ceph-mon[300199]: pgmap v418: 177 pgs: 177 active+clean; 193 MiB data, 985 MiB used, 41 GiB / 42 GiB avail; 100 KiB/s rd, 30 KiB/s wr, 140 op/s
Nov 23 10:05:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1150942741' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1150942741' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e198 e198: 6 total, 6 up, 6 in
Nov 23 10:05:35 np0005532585.localdomain ceph-mon[300199]: osdmap e198: 6 total, 6 up, 6 in
Nov 23 10:05:35 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "bdd7b00d-cad7-4915-8d2a-92e849730947", "format": "json"}]: dispatch
Nov 23 10:05:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e199 e199: 6 total, 6 up, 6 in
Nov 23 10:05:35 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:05:35Z|00541|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:05:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:35.322 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:36 np0005532585.localdomain dnsmasq[332928]: exiting on receipt of SIGTERM
Nov 23 10:05:36 np0005532585.localdomain systemd[1]: tmp-crun.fRByvl.mount: Deactivated successfully.
Nov 23 10:05:36 np0005532585.localdomain systemd[1]: libpod-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f.scope: Deactivated successfully.
Nov 23 10:05:36 np0005532585.localdomain podman[333564]: 2025-11-23 10:05:36.200740943 +0000 UTC m=+0.059749611 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:05:36 np0005532585.localdomain ceph-mon[300199]: osdmap e199: 6 total, 6 up, 6 in
Nov 23 10:05:36 np0005532585.localdomain ceph-mon[300199]: pgmap v421: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 81 KiB/s rd, 23 KiB/s wr, 114 op/s
Nov 23 10:05:36 np0005532585.localdomain podman[333578]: 2025-11-23 10:05:36.266850185 +0000 UTC m=+0.059189404 container died dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:05:36 np0005532585.localdomain podman[333578]: 2025-11-23 10:05:36.295003103 +0000 UTC m=+0.087342252 container cleanup dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:05:36 np0005532585.localdomain systemd[1]: libpod-conmon-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f.scope: Deactivated successfully.
Nov 23 10:05:36 np0005532585.localdomain podman[333585]: 2025-11-23 10:05:36.352701751 +0000 UTC m=+0.127084860 container remove dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:05:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:36.376 263258 INFO neutron.agent.dhcp.agent [None req-d327e57b-6666-40b3-b749-83f50dc9721d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:05:36 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:05:36.426 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:05:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-0c01c854a331124ebc60335ecf0975a3084a9f216dced94c1e2f7c47b6d63d45-merged.mount: Deactivated successfully.
Nov 23 10:05:37 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f-userdata-shm.mount: Deactivated successfully.
Nov 23 10:05:37 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2de8c74d81\x2d0df4\x2d43eb\x2d8b7e\x2df7ad7da5cf4a.mount: Deactivated successfully.
Nov 23 10:05:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e200 e200: 6 total, 6 up, 6 in
Nov 23 10:05:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f9268685-6943-4f1d-b505-be8ba848d173", "snap_name": "d4feed24-fd18-43f4-8007-428f7bbb28da", "format": "json"}]: dispatch
Nov 23 10:05:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:37.756 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e201 e201: 6 total, 6 up, 6 in
Nov 23 10:05:38 np0005532585.localdomain ceph-mon[300199]: osdmap e200: 6 total, 6 up, 6 in
Nov 23 10:05:38 np0005532585.localdomain ceph-mon[300199]: pgmap v423: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 21 KiB/s wr, 103 op/s
Nov 23 10:05:38 np0005532585.localdomain ceph-mon[300199]: osdmap e201: 6 total, 6 up, 6 in
Nov 23 10:05:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e202 e202: 6 total, 6 up, 6 in
Nov 23 10:05:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "48839c19-f8e5-4cca-b5e6-993e217ef0c0", "format": "json"}]: dispatch
Nov 23 10:05:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:05:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:05:40 np0005532585.localdomain podman[333607]: 2025-11-23 10:05:40.027996902 +0000 UTC m=+0.081156426 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:05:40 np0005532585.localdomain podman[333608]: 2025-11-23 10:05:40.096141044 +0000 UTC m=+0.148149414 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118)
Nov 23 10:05:40 np0005532585.localdomain podman[333608]: 2025-11-23 10:05:40.111366562 +0000 UTC m=+0.163374962 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:05:40 np0005532585.localdomain podman[333607]: 2025-11-23 10:05:40.111682412 +0000 UTC m=+0.164841936 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:05:40 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:05:40 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:05:40 np0005532585.localdomain ceph-mon[300199]: osdmap e202: 6 total, 6 up, 6 in
Nov 23 10:05:40 np0005532585.localdomain ceph-mon[300199]: pgmap v426: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 26 KiB/s wr, 88 op/s
Nov 23 10:05:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f9268685-6943-4f1d-b505-be8ba848d173", "snap_name": "d4feed24-fd18-43f4-8007-428f7bbb28da_9b2a4eed-44cc-4e60-96ac-820b9b4186d6", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f9268685-6943-4f1d-b505-be8ba848d173", "snap_name": "d4feed24-fd18-43f4-8007-428f7bbb28da", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:41 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e203 e203: 6 total, 6 up, 6 in
Nov 23 10:05:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:05:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:05:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:05:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:05:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:05:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18782 "" "Go-http-client/1.1"
Nov 23 10:05:42 np0005532585.localdomain sudo[333651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:05:42 np0005532585.localdomain sudo[333651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:05:42 np0005532585.localdomain sudo[333651]: pam_unix(sudo:session): session closed for user root
Nov 23 10:05:42 np0005532585.localdomain sudo[333669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 23 10:05:42 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e204 e204: 6 total, 6 up, 6 in
Nov 23 10:05:42 np0005532585.localdomain sudo[333669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:05:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "format": "json"}]: dispatch
Nov 23 10:05:42 np0005532585.localdomain ceph-mon[300199]: osdmap e203: 6 total, 6 up, 6 in
Nov 23 10:05:42 np0005532585.localdomain ceph-mon[300199]: pgmap v428: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 26 KiB/s wr, 88 op/s
Nov 23 10:05:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:42.758 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:42.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:42.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:05:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:42.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:42.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:42 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:42.794 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e205 e205: 6 total, 6 up, 6 in
Nov 23 10:05:43 np0005532585.localdomain podman[333760]: 2025-11-23 10:05:43.26366251 +0000 UTC m=+0.102132578 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12)
Nov 23 10:05:43 np0005532585.localdomain podman[333760]: 2025-11-23 10:05:43.410437681 +0000 UTC m=+0.248907749 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, ceph=True, architecture=x86_64, release=553)
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "48839c19-f8e5-4cca-b5e6-993e217ef0c0_6b0cc57c-2629-485f-bd4c-419d37ac81ed", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "48839c19-f8e5-4cca-b5e6-993e217ef0c0", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: osdmap e204: 6 total, 6 up, 6 in
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1406713589' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1406713589' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: osdmap e205: 6 total, 6 up, 6 in
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/149050981' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/149050981' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2425458825' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:43 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2425458825' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:44 np0005532585.localdomain sudo[333669]: pam_unix(sudo:session): session closed for user root
Nov 23 10:05:44 np0005532585.localdomain sudo[333882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:05:44 np0005532585.localdomain sudo[333882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:05:44 np0005532585.localdomain sudo[333882]: pam_unix(sudo:session): session closed for user root
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e206 e206: 6 total, 6 up, 6 in
Nov 23 10:05:44 np0005532585.localdomain sudo[333900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:05:44 np0005532585.localdomain sudo[333900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f9268685-6943-4f1d-b505-be8ba848d173", "format": "json"}]: dispatch
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f9268685-6943-4f1d-b505-be8ba848d173", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/149050981' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/149050981' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: pgmap v431: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 26 KiB/s wr, 89 op/s
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2425458825' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2425458825' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:44 np0005532585.localdomain ceph-mon[300199]: osdmap e206: 6 total, 6 up, 6 in
Nov 23 10:05:44 np0005532585.localdomain sshd[333935]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:05:44 np0005532585.localdomain sudo[333900]: pam_unix(sudo:session): session closed for user root
Nov 23 10:05:45 np0005532585.localdomain sudo[333951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:05:45 np0005532585.localdomain sudo[333951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:05:45 np0005532585.localdomain sudo[333951]: pam_unix(sudo:session): session closed for user root
Nov 23 10:05:45 np0005532585.localdomain sshd[333935]: Received disconnect from 207.154.194.2 port 52818:11: Bye Bye [preauth]
Nov 23 10:05:45 np0005532585.localdomain sshd[333935]: Disconnected from authenticating user root 207.154.194.2 port 52818 [preauth]
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve49", "tenant_id": "0dcc58766ad44287910595c5c8f14397", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:05:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:46.115 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:05:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:46.119 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:05:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:46.159 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e207 e207: 6 total, 6 up, 6 in
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "bdd7b00d-cad7-4915-8d2a-92e849730947_0d9276b9-0c8a-49a7-bc1f-6fadf5287cf8", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "bdd7b00d-cad7-4915-8d2a-92e849730947", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 10:05:46 np0005532585.localdomain ceph-mon[300199]: pgmap v433: 177 pgs: 177 active+clean; 193 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 139 KiB/s rd, 80 KiB/s wr, 196 op/s
Nov 23 10:05:47 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e208 e208: 6 total, 6 up, 6 in
Nov 23 10:05:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ef1efc74-972d-4b02-871d-8dd0c1290f0a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:47 np0005532585.localdomain ceph-mon[300199]: osdmap e207: 6 total, 6 up, 6 in
Nov 23 10:05:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:47.795 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:47 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:47.800 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e209 e209: 6 total, 6 up, 6 in
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ef1efc74-972d-4b02-871d-8dd0c1290f0a", "format": "json"}]: dispatch
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: osdmap e208: 6 total, 6 up, 6 in
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: pgmap v436: 177 pgs: 177 active+clean; 193 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 131 KiB/s rd, 76 KiB/s wr, 184 op/s
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: osdmap e209: 6 total, 6 up, 6 in
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/438476046' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/438476046' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:49 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve48", "tenant_id": "0dcc58766ad44287910595c5c8f14397", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:05:49 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "faed27e0-c4c3-4299-836c-04ce6fc7a80d_d597c998-f063-4969-a445-a54fe8b5c401", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:49 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "faed27e0-c4c3-4299-836c-04ce6fc7a80d", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:05:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3122516944' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3122516944' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:50 np0005532585.localdomain ceph-mon[300199]: pgmap v438: 177 pgs: 177 active+clean; 194 MiB data, 1003 MiB used, 41 GiB / 42 GiB avail; 181 KiB/s rd, 160 KiB/s wr, 262 op/s
Nov 23 10:05:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ef1efc74-972d-4b02-871d-8dd0c1290f0a", "format": "json"}]: dispatch
Nov 23 10:05:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ef1efc74-972d-4b02-871d-8dd0c1290f0a", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:05:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:05:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:05:51 np0005532585.localdomain podman[333971]: 2025-11-23 10:05:51.033928055 +0000 UTC m=+0.082410823 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Nov 23 10:05:51 np0005532585.localdomain podman[333970]: 2025-11-23 10:05:51.076063775 +0000 UTC m=+0.127208373 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 10:05:51 np0005532585.localdomain podman[333971]: 2025-11-23 10:05:51.101797681 +0000 UTC m=+0.150280499 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 23 10:05:51 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:05:51 np0005532585.localdomain podman[333970]: 2025-11-23 10:05:51.157311422 +0000 UTC m=+0.208456090 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 10:05:51 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:05:51 np0005532585.localdomain podman[333969]: 2025-11-23 10:05:51.245133468 +0000 UTC m=+0.298140362 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 10:05:51 np0005532585.localdomain podman[333969]: 2025-11-23 10:05:51.275460932 +0000 UTC m=+0.328467836 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 10:05:51 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "23a1953a-4602-4081-b659-88394c7eeb71", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "23a1953a-4602-4081-b659-88394c7eeb71", "format": "json"}]: dispatch
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "b6b84a30-76d0-4c7a-ba18-7c9a32f848c3_9e4afb1c-09af-4f97-babc-10dde86a6298", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "b6b84a30-76d0-4c7a-ba18-7c9a32f848c3", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1438458364' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:05:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1438458364' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: pgmap v439: 177 pgs: 177 active+clean; 194 MiB data, 1003 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 88 KiB/s wr, 100 op/s
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1438458364' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1438458364' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Nov 23 10:05:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 23 10:05:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:52.802 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:52.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:52.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:05:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:52.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:52.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:52 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:52.836 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e210 e210: 6 total, 6 up, 6 in
Nov 23 10:05:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:54 np0005532585.localdomain ceph-mon[300199]: osdmap e210: 6 total, 6 up, 6 in
Nov 23 10:05:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "format": "json"}]: dispatch
Nov 23 10:05:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e211 e211: 6 total, 6 up, 6 in
Nov 23 10:05:55 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:05:55.121 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:05:55 np0005532585.localdomain ceph-mon[300199]: pgmap v441: 177 pgs: 177 active+clean; 194 MiB data, 1003 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 88 KiB/s wr, 99 op/s
Nov 23 10:05:55 np0005532585.localdomain ceph-mon[300199]: osdmap e211: 6 total, 6 up, 6 in
Nov 23 10:05:55 np0005532585.localdomain ceph-mon[300199]: mgrmap e48: np0005532584.naxwxy(active, since 10m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:05:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "73f8c29f-f829-4d6f-b0a1-a020eee73bb5", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:05:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "73f8c29f-f829-4d6f-b0a1-a020eee73bb5", "format": "json"}]: dispatch
Nov 23 10:05:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:05:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "89bb6ad8-e75c-4c65-a495-a2d153e513e0_72e16638-ea70-4008-a6f5-5d7b1af051d5", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "snap_name": "89bb6ad8-e75c-4c65-a495-a2d153e513e0", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve47", "tenant_id": "0dcc58766ad44287910595c5c8f14397", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:05:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 23 10:05:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:05:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:05:57 np0005532585.localdomain ceph-mon[300199]: pgmap v443: 177 pgs: 2 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 173 active+clean; 194 MiB data, 1004 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 131 KiB/s wr, 90 op/s
Nov 23 10:05:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:57.837 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:57.839 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:05:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:57.839 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:05:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:57.839 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:57.882 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:05:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:05:57.883 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:05:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:05:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "snap_name": "b2c90996-bd7d-4844-bb02-0209ec15d89a", "format": "json"}]: dispatch
Nov 23 10:05:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/561193803' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:05:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/561193803' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e212 e212: 6 total, 6 up, 6 in
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: pgmap v444: 177 pgs: 2 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 173 active+clean; 194 MiB data, 1004 MiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 55 KiB/s wr, 8 op/s
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "23a1953a-4602-4081-b659-88394c7eeb71", "format": "json"}]: dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "23a1953a-4602-4081-b659-88394c7eeb71", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "format": "json"}]: dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "307a0389-800d-4c9e-95ec-9b17f1b7da68", "force": true, "format": "json"}]: dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Nov 23 10:05:59 np0005532585.localdomain ceph-mon[300199]: osdmap e212: 6 total, 6 up, 6 in
Nov 23 10:05:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:05:59 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:05:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:05:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:06:00 np0005532585.localdomain podman[334033]: 2025-11-23 10:06:00.047675767 +0000 UTC m=+0.098481078 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:06:00 np0005532585.localdomain podman[334033]: 2025-11-23 10:06:00.05976449 +0000 UTC m=+0.110569811 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:06:00 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:06:00 np0005532585.localdomain systemd[1]: tmp-crun.N0meMN.mount: Deactivated successfully.
Nov 23 10:06:00 np0005532585.localdomain podman[334032]: 2025-11-23 10:06:00.146529784 +0000 UTC m=+0.200973395 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:06:00 np0005532585.localdomain podman[334032]: 2025-11-23 10:06:00.159255988 +0000 UTC m=+0.213699639 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 10:06:00 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:06:00 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 23 10:06:00 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 23 10:06:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1171159827' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1171159827' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:01 np0005532585.localdomain ceph-mon[300199]: pgmap v446: 177 pgs: 177 active+clean; 195 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 172 KiB/s wr, 49 op/s
Nov 23 10:06:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "snap_name": "b2c90996-bd7d-4844-bb02-0209ec15d89a", "target_sub_name": "b2bf9dde-7e11-4be3-ba03-328acaaba40c", "format": "json"}]: dispatch
Nov 23 10:06:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b2bf9dde-7e11-4be3-ba03-328acaaba40c", "format": "json"}]: dispatch
Nov 23 10:06:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "73f8c29f-f829-4d6f-b0a1-a020eee73bb5", "format": "json"}]: dispatch
Nov 23 10:06:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:06:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1789904746' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:06:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1789904746' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:02.886 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:02.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:02.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:02.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:02.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:02 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:02.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e213 e213: 6 total, 6 up, 6 in
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: pgmap v447: 177 pgs: 177 active+clean; 195 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 137 KiB/s wr, 39 op/s
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: mgrmap e49: np0005532584.naxwxy(active, since 10m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1789904746' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1789904746' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: osdmap e213: 6 total, 6 up, 6 in
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "73f8c29f-f829-4d6f-b0a1-a020eee73bb5", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.547517) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363547588, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2879, "num_deletes": 276, "total_data_size": 5425586, "memory_usage": 5595936, "flush_reason": "Manual Compaction"}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363576276, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3552406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26982, "largest_seqno": 29856, "table_properties": {"data_size": 3540546, "index_size": 7725, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27806, "raw_average_key_size": 22, "raw_value_size": 3515998, "raw_average_value_size": 2879, "num_data_blocks": 323, "num_entries": 1221, "num_filter_entries": 1221, "num_deletions": 276, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892236, "oldest_key_time": 1763892236, "file_creation_time": 1763892363, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 28824 microseconds, and 9925 cpu microseconds.
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.576342) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3552406 bytes OK
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.576372) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.579462) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.579487) EVENT_LOG_v1 {"time_micros": 1763892363579480, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.579513) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5412117, prev total WAL file size 5412117, number of live WAL files 2.
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.580631) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3469KB)], [48(14MB)]
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363580686, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 19054683, "oldest_snapshot_seqno": -1}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 13245 keys, 17876505 bytes, temperature: kUnknown
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363670047, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 17876505, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17800781, "index_size": 41511, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33157, "raw_key_size": 356686, "raw_average_key_size": 26, "raw_value_size": 17574981, "raw_average_value_size": 1326, "num_data_blocks": 1553, "num_entries": 13245, "num_filter_entries": 13245, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892363, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.670358) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 17876505 bytes
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.672351) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.0 rd, 199.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 14.8 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(10.4) write-amplify(5.0) OK, records in: 13810, records dropped: 565 output_compression: NoCompression
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.672386) EVENT_LOG_v1 {"time_micros": 1763892363672372, "job": 28, "event": "compaction_finished", "compaction_time_micros": 89459, "compaction_time_cpu_micros": 52678, "output_level": 6, "num_output_files": 1, "total_output_size": 17876505, "num_input_records": 13810, "num_output_records": 13245, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363673155, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363675721, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.580544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:06:03 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: pgmap v449: 177 pgs: 177 active+clean; 195 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 83 KiB/s wr, 30 op/s
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "format": "json"}]: dispatch
Nov 23 10:06:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "23db4718-314d-42ce-b54b-c4702b2fa362", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:05 np0005532585.localdomain sshd[334073]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:06:05 np0005532585.localdomain sshd[334073]: Invalid user odoo from 107.172.15.139 port 48004
Nov 23 10:06:05 np0005532585.localdomain sshd[334073]: Received disconnect from 107.172.15.139 port 48004:11: Bye Bye [preauth]
Nov 23 10:06:05 np0005532585.localdomain sshd[334073]: Disconnected from invalid user odoo 107.172.15.139 port 48004 [preauth]
Nov 23 10:06:06 np0005532585.localdomain ceph-mon[300199]: pgmap v450: 177 pgs: 177 active+clean; 195 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 155 KiB/s wr, 62 op/s
Nov 23 10:06:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1289491411' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1289491411' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:06 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:06:06Z|00542|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.919 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.920 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.920 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:07 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:07.957 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 e214: 6 total, 6 up, 6 in
Nov 23 10:06:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:06:08 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1601258234' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:06:08 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1601258234' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:09 np0005532585.localdomain ceph-mon[300199]: pgmap v451: 177 pgs: 177 active+clean; 195 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 148 KiB/s wr, 60 op/s
Nov 23 10:06:09 np0005532585.localdomain ceph-mon[300199]: osdmap e214: 6 total, 6 up, 6 in
Nov 23 10:06:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1601258234' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1601258234' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/725219692' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:09 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/725219692' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:06:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:06:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:06:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "dfbe99c1-496c-4355-8746-6391072f4d2c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "dfbe99c1-496c-4355-8746-6391072f4d2c", "format": "json"}]: dispatch
Nov 23 10:06:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:10 np0005532585.localdomain sshd[334075]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:06:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:06:10 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:06:11 np0005532585.localdomain podman[334077]: 2025-11-23 10:06:11.040135561 +0000 UTC m=+0.087039923 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:06:11 np0005532585.localdomain podman[334077]: 2025-11-23 10:06:11.076569368 +0000 UTC m=+0.123473770 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:06:11 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:06:11 np0005532585.localdomain podman[334076]: 2025-11-23 10:06:11.081638751 +0000 UTC m=+0.132142741 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:06:11 np0005532585.localdomain podman[334076]: 2025-11-23 10:06:11.165307202 +0000 UTC m=+0.215811162 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:06:11 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:06:11 np0005532585.localdomain ceph-mon[300199]: pgmap v453: 177 pgs: 177 active+clean; 195 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 98 KiB/s wr, 99 op/s
Nov 23 10:06:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:06:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:06:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:06:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:06:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:06:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18787 "" "Go-http-client/1.1"
Nov 23 10:06:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:12.984 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:12.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:12.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:12.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:12.987 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:12 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:12.990 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:13 np0005532585.localdomain ceph-mon[300199]: pgmap v454: 177 pgs: 177 active+clean; 195 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 93 KiB/s wr, 95 op/s
Nov 23 10:06:13 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:13 np0005532585.localdomain sshd[334118]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:06:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "format": "json"}]: dispatch
Nov 23 10:06:14 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 23 10:06:14 np0005532585.localdomain sshd[334118]: Received disconnect from 175.126.166.172 port 46206:11: Bye Bye [preauth]
Nov 23 10:06:14 np0005532585.localdomain sshd[334118]: Disconnected from authenticating user root 175.126.166.172 port 46206 [preauth]
Nov 23 10:06:15 np0005532585.localdomain ceph-mon[300199]: pgmap v455: 177 pgs: 177 active+clean; 195 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 78 KiB/s wr, 79 op/s
Nov 23 10:06:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "format": "json"}]: dispatch
Nov 23 10:06:17 np0005532585.localdomain ceph-mon[300199]: pgmap v456: 177 pgs: 177 active+clean; 195 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 29 KiB/s wr, 55 op/s
Nov 23 10:06:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b2bf9dde-7e11-4be3-ba03-328acaaba40c", "format": "json"}]: dispatch
Nov 23 10:06:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/439873091' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:17 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/439873091' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:17 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:17.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 10:06:18 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1442410396' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.324 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.324 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.325 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.325 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: pgmap v457: 177 pgs: 177 active+clean; 195 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 29 KiB/s wr, 55 op/s
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b2bf9dde-7e11-4be3-ba03-328acaaba40c", "format": "json"}]: dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "snap_name": "fa219110-ac84-4caf-a8ed-b4a2487e0ae3", "format": "json"}]: dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-710186636", "format": "json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-710186636", "caps": ["mds", "allow rw path=/volumes/_nogroup/b0303be3-5e23-424d-935b-a23f10085dfe/5ae76225-892a-444c-90ca-4662bf6a5b64", "osd", "allow rw pool=manila_data namespace=fsvolumens_b0303be3-5e23-424d-935b-a23f10085dfe", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-710186636", "caps": ["mds", "allow rw path=/volumes/_nogroup/b0303be3-5e23-424d-935b-a23f10085dfe/5ae76225-892a-444c-90ca-4662bf6a5b64", "osd", "allow rw pool=manila_data namespace=fsvolumens_b0303be3-5e23-424d-935b-a23f10085dfe", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1442410396' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/697156999' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/697156999' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-710186636", "format": "json"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-710186636"} : dispatch
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-710186636"}]': finished
Nov 23 10:06:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e215 e215: 6 total, 6 up, 6 in
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.804 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.826 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:06:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:19.827 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:06:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:20.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "auth_id": "tempest-cephx-id-710186636", "tenant_id": "ac0f52e853194f9ea9d27590165c645d", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:06:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "auth_id": "tempest-cephx-id-710186636", "format": "json"}]: dispatch
Nov 23 10:06:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "auth_id": "tempest-cephx-id-710186636", "format": "json"}]: dispatch
Nov 23 10:06:20 np0005532585.localdomain ceph-mon[300199]: osdmap e215: 6 total, 6 up, 6 in
Nov 23 10:06:20 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e216 e216: 6 total, 6 up, 6 in
Nov 23 10:06:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:21.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:21.212 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:21 np0005532585.localdomain ceph-mon[300199]: pgmap v458: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 62 KiB/s wr, 87 op/s
Nov 23 10:06:21 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "format": "json"}]: dispatch
Nov 23 10:06:21 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b0303be3-5e23-424d-935b-a23f10085dfe", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:21 np0005532585.localdomain ceph-mon[300199]: osdmap e216: 6 total, 6 up, 6 in
Nov 23 10:06:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:06:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:06:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:06:22 np0005532585.localdomain podman[334122]: 2025-11-23 10:06:22.043588218 +0000 UTC m=+0.091646292 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Nov 23 10:06:22 np0005532585.localdomain podman[334122]: 2025-11-23 10:06:22.057326202 +0000 UTC m=+0.105384266 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 10:06:22 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:06:22 np0005532585.localdomain podman[334120]: 2025-11-23 10:06:22.077569542 +0000 UTC m=+0.133389259 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 23 10:06:22 np0005532585.localdomain podman[334120]: 2025-11-23 10:06:22.11733794 +0000 UTC m=+0.173157697 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 10:06:22 np0005532585.localdomain podman[334121]: 2025-11-23 10:06:22.133378453 +0000 UTC m=+0.185915202 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:06:22 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:06:22 np0005532585.localdomain podman[334121]: 2025-11-23 10:06:22.142134677 +0000 UTC m=+0.194671416 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:06:22 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:06:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:22.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:06:22 np0005532585.localdomain sshd[334075]: error: kex_exchange_identification: read: Connection timed out
Nov 23 10:06:22 np0005532585.localdomain sshd[334075]: banner exchange: Connection from 182.61.18.212 port 56820: Connection timed out
Nov 23 10:06:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/276648477' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2737219998' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2737219998' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4196145382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:22.991 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:22.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:22.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:22.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:23.025 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:23.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e217 e217: 6 total, 6 up, 6 in
Nov 23 10:06:23 np0005532585.localdomain ceph-mon[300199]: pgmap v461: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 63 KiB/s wr, 57 op/s
Nov 23 10:06:23 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b2bf9dde-7e11-4be3-ba03-328acaaba40c", "format": "json"}]: dispatch
Nov 23 10:06:23 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b2bf9dde-7e11-4be3-ba03-328acaaba40c", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e218 e218: 6 total, 6 up, 6 in
Nov 23 10:06:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "snap_name": "fa219110-ac84-4caf-a8ed-b4a2487e0ae3_1633366c-cbc2-4ad1-8aff-e07adea22fbc", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "snap_name": "fa219110-ac84-4caf-a8ed-b4a2487e0ae3", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:24 np0005532585.localdomain ceph-mon[300199]: osdmap e217: 6 total, 6 up, 6 in
Nov 23 10:06:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.231 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.232 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.232 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.232 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.233 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: pgmap v463: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 70 KiB/s wr, 74 op/s
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d956f44c-8366-4a9f-aa0b-f72cc390985d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d956f44c-8366-4a9f-aa0b-f72cc390985d", "format": "json"}]: dispatch
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: osdmap e218: 6 total, 6 up, 6 in
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/351949762' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e219 e219: 6 total, 6 up, 6 in
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:06:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/932944577' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.729 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.812 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.812 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.987 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.988 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11104MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.989 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:06:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:25.989 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.063 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.064 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.064 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.114 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "snap_name": "b2c90996-bd7d-4844-bb02-0209ec15d89a_14ee2b3e-3caa-46fb-9254-578591f72a32", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "snap_name": "b2c90996-bd7d-4844-bb02-0209ec15d89a", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: osdmap e219: 6 total, 6 up, 6 in
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3780591128' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/932944577' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2201396073' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:06:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2425650944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.567 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.574 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.589 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.592 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:06:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:26.592 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:06:27 np0005532585.localdomain ceph-mon[300199]: pgmap v465: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 57 KiB/s wr, 77 op/s
Nov 23 10:06:27 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "format": "json"}]: dispatch
Nov 23 10:06:27 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a415431f-f058-44aa-a0b3-ca2c8202775f", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:27 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2425650944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:06:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e220 e220: 6 total, 6 up, 6 in
Nov 23 10:06:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:28.027 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:28.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:28.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:28.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:28.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:28.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1242299826' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1242299826' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e221 e221: 6 total, 6 up, 6 in
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: pgmap v467: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 55 KiB/s wr, 74 op/s
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: osdmap e220: 6 total, 6 up, 6 in
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "format": "json"}]: dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c801417a-e8bf-4ab6-8efa-c611d6d1d111", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1242299826' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1242299826' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d956f44c-8366-4a9f-aa0b-f72cc390985d", "format": "json"}]: dispatch
Nov 23 10:06:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d956f44c-8366-4a9f-aa0b-f72cc390985d", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:29 np0005532585.localdomain ceph-mon[300199]: osdmap e221: 6 total, 6 up, 6 in
Nov 23 10:06:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/280235995' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/280235995' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:29.801 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:06:29 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:29.802 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:06:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:29.802 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:06:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:06:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:06:30.152 263258 INFO neutron.agent.linux.ip_lib [None req-9b2a92cf-79b1-4b60-8412-cb888fcbdc3e - - - - - -] Device tap6d2dff40-04 cannot be used as it has no MAC address
Nov 23 10:06:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:30.173 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:30 np0005532585.localdomain kernel: device tap6d2dff40-04 entered promiscuous mode
Nov 23 10:06:30 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892390.1823] manager: (tap6d2dff40-04): new Generic device (/org/freedesktop/NetworkManager/Devices/86)
Nov 23 10:06:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:30.181 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:30 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:06:30Z|00543|binding|INFO|Claiming lport 6d2dff40-048d-4175-8e87-c4c88e21141f for this chassis.
Nov 23 10:06:30 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:06:30Z|00544|binding|INFO|6d2dff40-048d-4175-8e87-c4c88e21141f: Claiming unknown
Nov 23 10:06:30 np0005532585.localdomain systemd-udevd[334236]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:06:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:30.193 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33c4eecf43aa413a9f282206f9e9a55b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e2548aa-bcaa-4071-886c-d49df70c86b7, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=6d2dff40-048d-4175-8e87-c4c88e21141f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:06:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:30.195 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6d2dff40-048d-4175-8e87-c4c88e21141f in datapath 1ebb6643-dd69-425a-84e7-f74c46a69f9f bound to our chassis
Nov 23 10:06:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:30.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7ef8df20-b914-4e91-b454-846b497693d6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:06:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:30.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1ebb6643-dd69-425a-84e7-f74c46a69f9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:06:30 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:30.198 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[939ffe9c-a6de-49d8-82e1-ea2e3976c71f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:06:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:06:30 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:06:30Z|00545|binding|INFO|Setting lport 6d2dff40-048d-4175-8e87-c4c88e21141f ovn-installed in OVS
Nov 23 10:06:30 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:06:30Z|00546|binding|INFO|Setting lport 6d2dff40-048d-4175-8e87-c4c88e21141f up in Southbound
Nov 23 10:06:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:30.229 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap6d2dff40-04: No such device
Nov 23 10:06:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:30.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:30.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:30 np0005532585.localdomain systemd[1]: tmp-crun.Cy72aH.mount: Deactivated successfully.
Nov 23 10:06:30 np0005532585.localdomain podman[334238]: 2025-11-23 10:06:30.323738333 +0000 UTC m=+0.109886941 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:06:30 np0005532585.localdomain podman[334239]: 2025-11-23 10:06:30.356117518 +0000 UTC m=+0.139627057 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:06:30 np0005532585.localdomain podman[334238]: 2025-11-23 10:06:30.389311719 +0000 UTC m=+0.175460377 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:06:30 np0005532585.localdomain podman[334239]: 2025-11-23 10:06:30.392500194 +0000 UTC m=+0.176009733 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:06:30 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:06:30 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:06:30 np0005532585.localdomain ceph-mon[300199]: pgmap v470: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 184 KiB/s rd, 93 KiB/s wr, 251 op/s
Nov 23 10:06:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dfbe99c1-496c-4355-8746-6391072f4d2c", "format": "json"}]: dispatch
Nov 23 10:06:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "dfbe99c1-496c-4355-8746-6391072f4d2c", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e222 e222: 6 total, 6 up, 6 in
Nov 23 10:06:31 np0005532585.localdomain podman[334349]: 
Nov 23 10:06:31 np0005532585.localdomain podman[334349]: 2025-11-23 10:06:31.176514771 +0000 UTC m=+0.079547188 container create 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 10:06:31 np0005532585.localdomain systemd[1]: Started libpod-conmon-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5.scope.
Nov 23 10:06:31 np0005532585.localdomain podman[334349]: 2025-11-23 10:06:31.134959759 +0000 UTC m=+0.037992206 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:06:31 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:06:31 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bbb7d3b4428d8f687dacc74829ec1d2b43a77a0b9743c4105e027e24960cc16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:06:31 np0005532585.localdomain podman[334349]: 2025-11-23 10:06:31.258043327 +0000 UTC m=+0.161075744 container init 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:06:31 np0005532585.localdomain podman[334349]: 2025-11-23 10:06:31.265627906 +0000 UTC m=+0.168660323 container start 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:06:31 np0005532585.localdomain dnsmasq[334367]: started, version 2.85 cachesize 150
Nov 23 10:06:31 np0005532585.localdomain dnsmasq[334367]: DNS service limited to local subnets
Nov 23 10:06:31 np0005532585.localdomain dnsmasq[334367]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:06:31 np0005532585.localdomain dnsmasq[334367]: warning: no upstream servers configured
Nov 23 10:06:31 np0005532585.localdomain dnsmasq-dhcp[334367]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:06:31 np0005532585.localdomain dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 0 addresses
Nov 23 10:06:31 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 10:06:31 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 10:06:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:31.349 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:06:31.513 263258 INFO neutron.agent.dhcp.agent [None req-76e000ab-2e09-48cd-82d9-c82e540c898d - - - - - -] DHCP configuration for ports {'38e3101c-d0ed-4881-8ac3-7f252b0ff879'} is completed
Nov 23 10:06:31 np0005532585.localdomain ceph-mon[300199]: osdmap e222: 6 total, 6 up, 6 in
Nov 23 10:06:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3055210354' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3055210354' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e223 e223: 6 total, 6 up, 6 in
Nov 23 10:06:31 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:06:31.952 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:06:31Z, description=, device_id=48f6cfc5-8b79-494b-95ec-92da5372a95b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa7a90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8aa7f10>], id=ceb30ad9-170b-4e72-94b3-c6a64398b0a8, ip_allocation=immediate, mac_address=fa:16:3e:a7:c9:b7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:06:27Z, description=, dns_domain=, id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1404220004-network, port_security_enabled=True, project_id=33c4eecf43aa413a9f282206f9e9a55b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3191, status=ACTIVE, subnets=['eee79f80-643c-4af9-96bb-e3244dbe6f93'], tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:28Z, vlan_transparent=None, network_id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, port_security_enabled=False, project_id=33c4eecf43aa413a9f282206f9e9a55b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3221, status=DOWN, tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:31Z on network 1ebb6643-dd69-425a-84e7-f74c46a69f9f
Nov 23 10:06:32 np0005532585.localdomain dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 1 addresses
Nov 23 10:06:32 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 10:06:32 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 10:06:32 np0005532585.localdomain podman[334385]: 2025-11-23 10:06:32.166010728 +0000 UTC m=+0.056546054 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:06:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:06:32.449 263258 INFO neutron.agent.dhcp.agent [None req-4223f2e0-8720-4d35-b49f-5ca212290d91 - - - - - -] DHCP configuration for ports {'ceb30ad9-170b-4e72-94b3-c6a64398b0a8'} is completed
Nov 23 10:06:32 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:06:32.558 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:06:31Z, description=, device_id=48f6cfc5-8b79-494b-95ec-92da5372a95b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c1b8b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8c1b3a0>], id=ceb30ad9-170b-4e72-94b3-c6a64398b0a8, ip_allocation=immediate, mac_address=fa:16:3e:a7:c9:b7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:06:27Z, description=, dns_domain=, id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1404220004-network, port_security_enabled=True, project_id=33c4eecf43aa413a9f282206f9e9a55b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3191, status=ACTIVE, subnets=['eee79f80-643c-4af9-96bb-e3244dbe6f93'], tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:28Z, vlan_transparent=None, network_id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, port_security_enabled=False, project_id=33c4eecf43aa413a9f282206f9e9a55b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3221, status=DOWN, tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:31Z on network 1ebb6643-dd69-425a-84e7-f74c46a69f9f
Nov 23 10:06:32 np0005532585.localdomain ceph-mon[300199]: pgmap v472: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 155 KiB/s rd, 78 KiB/s wr, 211 op/s
Nov 23 10:06:32 np0005532585.localdomain ceph-mon[300199]: osdmap e223: 6 total, 6 up, 6 in
Nov 23 10:06:32 np0005532585.localdomain dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 1 addresses
Nov 23 10:06:32 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 10:06:32 np0005532585.localdomain podman[334422]: 2025-11-23 10:06:32.769201767 +0000 UTC m=+0.063526173 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 10:06:32 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 10:06:33 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:06:33.001 263258 INFO neutron.agent.dhcp.agent [None req-f1c471c9-e3d3-465f-be83-59dc2ba0ec87 - - - - - -] DHCP configuration for ports {'ceb30ad9-170b-4e72-94b3-c6a64398b0a8'} is completed
Nov 23 10:06:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:33.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e224 e224: 6 total, 6 up, 6 in
Nov 23 10:06:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:34 np0005532585.localdomain ceph-mon[300199]: osdmap e224: 6 total, 6 up, 6 in
Nov 23 10:06:34 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3749090719' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e225 e225: 6 total, 6 up, 6 in
Nov 23 10:06:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:34.547 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:35 np0005532585.localdomain ceph-mon[300199]: pgmap v475: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 188 KiB/s rd, 95 KiB/s wr, 256 op/s
Nov 23 10:06:35 np0005532585.localdomain ceph-mon[300199]: osdmap e225: 6 total, 6 up, 6 in
Nov 23 10:06:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e226 e226: 6 total, 6 up, 6 in
Nov 23 10:06:35 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:06:35.804 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:06:36 np0005532585.localdomain ceph-mon[300199]: osdmap e226: 6 total, 6 up, 6 in
Nov 23 10:06:37 np0005532585.localdomain ceph-mon[300199]: pgmap v478: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 148 KiB/s rd, 33 KiB/s wr, 202 op/s
Nov 23 10:06:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e227 e227: 6 total, 6 up, 6 in
Nov 23 10:06:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:38.126 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e228 e228: 6 total, 6 up, 6 in
Nov 23 10:06:38 np0005532585.localdomain ceph-mon[300199]: osdmap e227: 6 total, 6 up, 6 in
Nov 23 10:06:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:38 np0005532585.localdomain ceph-mon[300199]: osdmap e228: 6 total, 6 up, 6 in
Nov 23 10:06:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:38.695 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e229 e229: 6 total, 6 up, 6 in
Nov 23 10:06:39 np0005532585.localdomain ceph-mon[300199]: pgmap v480: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 30 KiB/s wr, 185 op/s
Nov 23 10:06:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3c893617-cb25-4de9-8f72-b90c923bf86c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3c893617-cb25-4de9-8f72-b90c923bf86c", "format": "json"}]: dispatch
Nov 23 10:06:40 np0005532585.localdomain ceph-mon[300199]: osdmap e229: 6 total, 6 up, 6 in
Nov 23 10:06:40 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e230 e230: 6 total, 6 up, 6 in
Nov 23 10:06:41 np0005532585.localdomain ceph-mon[300199]: pgmap v483: 177 pgs: 3 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 172 active+clean; 237 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 9.3 MiB/s wr, 128 op/s
Nov 23 10:06:41 np0005532585.localdomain ceph-mon[300199]: osdmap e230: 6 total, 6 up, 6 in
Nov 23 10:06:41 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3196558149' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:41 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3196558149' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:06:41 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:06:41 np0005532585.localdomain podman[334444]: 2025-11-23 10:06:41.783115501 +0000 UTC m=+0.087585629 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:06:41 np0005532585.localdomain podman[334444]: 2025-11-23 10:06:41.795080401 +0000 UTC m=+0.099550539 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:06:41 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:06:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:06:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:06:41 np0005532585.localdomain podman[334445]: 2025-11-23 10:06:41.905001523 +0000 UTC m=+0.208267655 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:06:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:06:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 10:06:42 np0005532585.localdomain podman[334445]: 2025-11-23 10:06:42.037729031 +0000 UTC m=+0.340995193 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:06:42 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:06:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19263 "" "Go-http-client/1.1"
Nov 23 10:06:42 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:06:42 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/395089354' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:42 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/395089354' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:43.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e231 e231: 6 total, 6 up, 6 in
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: pgmap v485: 177 pgs: 3 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 172 active+clean; 237 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 90 KiB/s rd, 9.5 MiB/s wr, 131 op/s
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "format": "json"}]: dispatch
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3c893617-cb25-4de9-8f72-b90c923bf86c", "format": "json"}]: dispatch
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3c893617-cb25-4de9-8f72-b90c923bf86c", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: osdmap e231: 6 total, 6 up, 6 in
Nov 23 10:06:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/620203384' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/620203384' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:45 np0005532585.localdomain sshd[334501]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:06:45 np0005532585.localdomain sudo[334487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:06:45 np0005532585.localdomain sudo[334487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:06:45 np0005532585.localdomain sudo[334487]: pam_unix(sudo:session): session closed for user root
Nov 23 10:06:45 np0005532585.localdomain sudo[334507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 23 10:06:45 np0005532585.localdomain sudo[334507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:06:45 np0005532585.localdomain ceph-mon[300199]: pgmap v487: 177 pgs: 3 active+clean+snaptrim, 2 active+clean+snaptrim_wait, 172 active+clean; 237 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 7.5 MiB/s wr, 103 op/s
Nov 23 10:06:45 np0005532585.localdomain sudo[334507]: pam_unix(sudo:session): session closed for user root
Nov 23 10:06:46 np0005532585.localdomain sudo[334546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:06:46 np0005532585.localdomain sudo[334546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:06:46 np0005532585.localdomain sudo[334546]: pam_unix(sudo:session): session closed for user root
Nov 23 10:06:46 np0005532585.localdomain sudo[334564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:06:46 np0005532585.localdomain sudo[334564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:06:46 np0005532585.localdomain sshd[334501]: Received disconnect from 207.154.194.2 port 52072:11: Bye Bye [preauth]
Nov 23 10:06:46 np0005532585.localdomain sshd[334501]: Disconnected from authenticating user root 207.154.194.2 port 52072 [preauth]
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "snap_name": "85fdd710-c011-4c18-ae03-4b7d2fcb945e", "format": "json"}]: dispatch
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:46 np0005532585.localdomain sudo[334564]: pam_unix(sudo:session): session closed for user root
Nov 23 10:06:47 np0005532585.localdomain sudo[334614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:06:47 np0005532585.localdomain sudo[334614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:06:47 np0005532585.localdomain sudo[334614]: pam_unix(sudo:session): session closed for user root
Nov 23 10:06:47 np0005532585.localdomain ceph-mon[300199]: pgmap v488: 177 pgs: 177 active+clean; 365 MiB data, 1.5 GiB used, 41 GiB / 42 GiB avail; 152 KiB/s rd, 27 MiB/s wr, 216 op/s
Nov 23 10:06:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:06:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:06:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:06:48 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:06:48Z|00547|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:06:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:48.107 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:48.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e232 e232: 6 total, 6 up, 6 in
Nov 23 10:06:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:49 np0005532585.localdomain ceph-mon[300199]: pgmap v489: 177 pgs: 177 active+clean; 365 MiB data, 1.5 GiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 16 MiB/s wr, 98 op/s
Nov 23 10:06:49 np0005532585.localdomain ceph-mon[300199]: osdmap e232: 6 total, 6 up, 6 in
Nov 23 10:06:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:06:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "snap_name": "85fdd710-c011-4c18-ae03-4b7d2fcb945e_dda399c2-11f7-4703-ab1c-2bfe2cc11a4a", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "snap_name": "85fdd710-c011-4c18-ae03-4b7d2fcb945e", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4230430634' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e233 e233: 6 total, 6 up, 6 in
Nov 23 10:06:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e234 e234: 6 total, 6 up, 6 in
Nov 23 10:06:51 np0005532585.localdomain ceph-mon[300199]: pgmap v491: 177 pgs: 177 active+clean; 493 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 83 KiB/s rd, 32 MiB/s wr, 123 op/s
Nov 23 10:06:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "format": "json"}]: dispatch
Nov 23 10:06:51 np0005532585.localdomain ceph-mon[300199]: osdmap e233: 6 total, 6 up, 6 in
Nov 23 10:06:52 np0005532585.localdomain ceph-mon[300199]: osdmap e234: 6 total, 6 up, 6 in
Nov 23 10:06:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:06:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:06:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:06:53 np0005532585.localdomain systemd[1]: tmp-crun.VilIgc.mount: Deactivated successfully.
Nov 23 10:06:53 np0005532585.localdomain podman[334634]: 2025-11-23 10:06:53.06781045 +0000 UTC m=+0.104726835 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible)
Nov 23 10:06:53 np0005532585.localdomain podman[334633]: 2025-11-23 10:06:53.086319718 +0000 UTC m=+0.128012167 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 23 10:06:53 np0005532585.localdomain podman[334634]: 2025-11-23 10:06:53.106329471 +0000 UTC m=+0.143245826 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 10:06:53 np0005532585.localdomain podman[334633]: 2025-11-23 10:06:53.119154868 +0000 UTC m=+0.160847307 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 23 10:06:53 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:06:53 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:06:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:53.166 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:53.168 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:53.168 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:53.169 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:53.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:53.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:53 np0005532585.localdomain podman[334632]: 2025-11-23 10:06:53.213988823 +0000 UTC m=+0.256881479 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 23 10:06:53 np0005532585.localdomain podman[334632]: 2025-11-23 10:06:53.25931939 +0000 UTC m=+0.302212005 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:06:53 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e235 e235: 6 total, 6 up, 6 in
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: pgmap v494: 177 pgs: 177 active+clean; 493 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 21 MiB/s wr, 34 op/s
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "format": "json"}]: dispatch
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8d530d74-a4c8-47dd-aed4-d2152c10ac50", "force": true, "format": "json"}]: dispatch
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:06:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:54 np0005532585.localdomain systemd[1]: tmp-crun.3H9WOk.mount: Deactivated successfully.
Nov 23 10:06:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:06:54 np0005532585.localdomain ceph-mon[300199]: osdmap e235: 6 total, 6 up, 6 in
Nov 23 10:06:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e236 e236: 6 total, 6 up, 6 in
Nov 23 10:06:55 np0005532585.localdomain ceph-mon[300199]: pgmap v496: 177 pgs: 177 active+clean; 493 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 20 KiB/s rd, 24 MiB/s wr, 38 op/s
Nov 23 10:06:55 np0005532585.localdomain ceph-mon[300199]: osdmap e236: 6 total, 6 up, 6 in
Nov 23 10:06:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:06:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e237 e237: 6 total, 6 up, 6 in
Nov 23 10:06:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "87b8129a-7de1-44bc-958e-150b793b403a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:06:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "87b8129a-7de1-44bc-958e-150b793b403a", "format": "json"}]: dispatch
Nov 23 10:06:56 np0005532585.localdomain ceph-mon[300199]: osdmap e237: 6 total, 6 up, 6 in
Nov 23 10:06:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/926987801' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/926987801' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: pgmap v499: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 621 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 119 KiB/s rd, 30 MiB/s wr, 178 op/s
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1384345740' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1384345740' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:06:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:06:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e238 e238: 6 total, 6 up, 6 in
Nov 23 10:06:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:58.210 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:58.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:06:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:58.213 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:06:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:58.213 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:58.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:06:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:06:58.251 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:06:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:06:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:06:58 np0005532585.localdomain ceph-mon[300199]: osdmap e238: 6 total, 6 up, 6 in
Nov 23 10:06:59 np0005532585.localdomain ceph-mon[300199]: pgmap v500: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 621 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 85 KiB/s rd, 21 MiB/s wr, 127 op/s
Nov 23 10:06:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "87b8129a-7de1-44bc-958e-150b793b403a", "snap_name": "9a00f796-c1f9-4801-901b-d1066a695eff", "format": "json"}]: dispatch
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:06:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:06:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:07:00 np0005532585.localdomain sshd[334694]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:07:00 np0005532585.localdomain sshd[334694]: Connection closed by 182.61.18.212 port 58012 [preauth]
Nov 23 10:07:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:07:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/451286883' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/451286883' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:07:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:07:01 np0005532585.localdomain podman[334696]: 2025-11-23 10:07:01.03739777 +0000 UTC m=+0.088435415 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 10:07:01 np0005532585.localdomain podman[334696]: 2025-11-23 10:07:01.054423093 +0000 UTC m=+0.105460728 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:07:01 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:07:01 np0005532585.localdomain podman[334697]: 2025-11-23 10:07:01.138194166 +0000 UTC m=+0.186787988 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:07:01 np0005532585.localdomain podman[334697]: 2025-11-23 10:07:01.152502677 +0000 UTC m=+0.201096519 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:07:01 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:07:01 np0005532585.localdomain ceph-mon[300199]: pgmap v502: 177 pgs: 177 active+clean; 742 MiB data, 2.6 GiB used, 39 GiB / 42 GiB avail; 151 KiB/s rd, 41 MiB/s wr, 228 op/s
Nov 23 10:07:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:07:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:07:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2643180548' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:07:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2643180548' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e239 e239: 6 total, 6 up, 6 in
Nov 23 10:07:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:03.252 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:03.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:03.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:03.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:03.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:03.292 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: pgmap v503: 177 pgs: 177 active+clean; 742 MiB data, 2.6 GiB used, 39 GiB / 42 GiB avail; 103 KiB/s rd, 30 MiB/s wr, 161 op/s
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "87b8129a-7de1-44bc-958e-150b793b403a", "snap_name": "9a00f796-c1f9-4801-901b-d1066a695eff_fbd2935e-439f-4401-9aa6-78d8ed8fda87", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "87b8129a-7de1-44bc-958e-150b793b403a", "snap_name": "9a00f796-c1f9-4801-901b-d1066a695eff", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2643180548' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2643180548' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:03 np0005532585.localdomain ceph-mon[300199]: osdmap e239: 6 total, 6 up, 6 in
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: pgmap v505: 177 pgs: 177 active+clean; 742 MiB data, 2.6 GiB used, 39 GiB / 42 GiB avail; 50 KiB/s rd, 15 MiB/s wr, 75 op/s
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4047953706' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4047953706' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3e36dbf7-3485-45bc-a6bc-a043d0884a88", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:06 np0005532585.localdomain ceph-mon[300199]: pgmap v506: 177 pgs: 177 active+clean; 870 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 91 KiB/s rd, 31 MiB/s wr, 142 op/s
Nov 23 10:07:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3e36dbf7-3485-45bc-a6bc-a043d0884a88", "format": "json"}]: dispatch
Nov 23 10:07:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "87b8129a-7de1-44bc-958e-150b793b403a", "format": "json"}]: dispatch
Nov 23 10:07:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "87b8129a-7de1-44bc-958e-150b793b403a", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:06 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e240 e240: 6 total, 6 up, 6 in
Nov 23 10:07:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e241 e241: 6 total, 6 up, 6 in
Nov 23 10:07:07 np0005532585.localdomain ceph-mon[300199]: osdmap e240: 6 total, 6 up, 6 in
Nov 23 10:07:07 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:07:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.191681) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428191732, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1726, "num_deletes": 271, "total_data_size": 2337193, "memory_usage": 2372272, "flush_reason": "Manual Compaction"}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428201419, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1529660, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29861, "largest_seqno": 31582, "table_properties": {"data_size": 1522091, "index_size": 4398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18108, "raw_average_key_size": 21, "raw_value_size": 1506165, "raw_average_value_size": 1821, "num_data_blocks": 185, "num_entries": 827, "num_filter_entries": 827, "num_deletions": 271, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892363, "oldest_key_time": 1763892363, "file_creation_time": 1763892428, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 9787 microseconds, and 4425 cpu microseconds.
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.201467) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1529660 bytes OK
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.201489) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.203335) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.203356) EVENT_LOG_v1 {"time_micros": 1763892428203349, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.203375) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 2328704, prev total WAL file size 2329028, number of live WAL files 2.
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.204177) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323736' seq:72057594037927935, type:22 .. '6C6F676D0034353330' seq:0, type:0; will stop at (end)
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(1493KB)], [51(17MB)]
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428204363, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 19406165, "oldest_snapshot_seqno": -1}
Nov 23 10:07:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:08.293 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:08.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:08.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:08.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 13509 keys, 18794195 bytes, temperature: kUnknown
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428316716, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 18794195, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18715615, "index_size": 43727, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33797, "raw_key_size": 364104, "raw_average_key_size": 26, "raw_value_size": 18483977, "raw_average_value_size": 1368, "num_data_blocks": 1631, "num_entries": 13509, "num_filter_entries": 13509, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892428, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.317096) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 18794195 bytes
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.318665) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.6 rd, 167.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 17.0 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(25.0) write-amplify(12.3) OK, records in: 14072, records dropped: 563 output_compression: NoCompression
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.318699) EVENT_LOG_v1 {"time_micros": 1763892428318684, "job": 30, "event": "compaction_finished", "compaction_time_micros": 112423, "compaction_time_cpu_micros": 54497, "output_level": 6, "num_output_files": 1, "total_output_size": 18794195, "num_input_records": 14072, "num_output_records": 13509, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428319080, "job": 30, "event": "table_file_deletion", "file_number": 53}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428322152, "job": 30, "event": "table_file_deletion", "file_number": 51}
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.204046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322271) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:07:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:08.325 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:08.325 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: pgmap v508: 177 pgs: 177 active+clean; 870 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 16 MiB/s wr, 66 op/s
Nov 23 10:07:08 np0005532585.localdomain ceph-mon[300199]: osdmap e241: 6 total, 6 up, 6 in
Nov 23 10:07:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:07:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:09.305 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:07:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:09.305 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:07:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "39ff48c9-2d4a-44e8-8b2a-3315c482323f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "39ff48c9-2d4a-44e8-8b2a-3315c482323f", "format": "json"}]: dispatch
Nov 23 10:07:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: pgmap v510: 177 pgs: 177 active+clean; 982 MiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 92 KiB/s rd, 38 MiB/s wr, 151 op/s
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1905485083' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:10 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1905485083' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.809 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.811 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.834 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 18510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '702ebf00-67e7-4667-b41c-b1bfe98b23c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18510000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:07:10.811212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2d1fc278-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.011661606, 'message_signature': '6fd29692fa780f32ad13bc269a42be5b9cc44c12359b2f3af77a261b399354cf'}]}, 'timestamp': '2025-11-23 10:07:10.835244', '_unique_id': 'ea68ad94b3964feebe712461762ee582'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.838 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '910cdb68-81be-4af0-985c-3547252c47e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.838151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d24e24e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'cc82390c85ff8115669df78cc4a28826c0ef0f1ebb9aca20e7c8a57bcb0cbec8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.838151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d24f69e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '3a6e80cf3caec83700c8307a71372773639739e60455c6a7f3a26fc06e051733'}]}, 'timestamp': '2025-11-23 10:07:10.869299', '_unique_id': '18ef625ddb704309b7681e70f1991b15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1944a8c-7b1c-4ebb-ba6f-735d2c71c82b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.872469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2587ee-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'c2c5533fb367025416b885fce924189890239146f91dd77344ac75081f97fa2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.872469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d25a38c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '9ea18120944125e828210bf579eb1f6cd4e912697c01b766c1abce5361422e7b'}]}, 'timestamp': '2025-11-23 10:07:10.873808', '_unique_id': '348a85024f1e4d87bc46d6a5c5cbc952'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37713d6b-58c9-4d97-bd7b-e507c0bf326b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.876230', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2699b8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'd4999ae9c3a1b73ebfd803a147b86f813f0b0f3872cfed562f94573231cfe609'}]}, 'timestamp': '2025-11-23 10:07:10.880053', '_unique_id': '89407cbdff294aa196340eefc534feea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd659421a-2eb8-4054-9bf0-3634abc4db0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.882544', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d270e84-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '326568e96ba3ddf239dcfbe78c573fb81bd5d76b7ad564eefdd255d7a927a507'}]}, 'timestamp': '2025-11-23 10:07:10.883095', '_unique_id': '6f041623dfcc4d9fb6536a1e987d86a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '836d5f87-8bfd-478d-84e0-cbbcd4245b4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.885342', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d277d38-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'cca6dc8c1f1bfac62f7089d95e65a3c6ad62102fb288c1cf758f7e80b0c870b9'}]}, 'timestamp': '2025-11-23 10:07:10.885869', '_unique_id': '79938a5c2d584720a8b6fc6d3cb7d3ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdb3af6b-e771-483e-89b9-c910da77379c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.888219', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d27ee12-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'c29253a0a1f37f78ab961042fd733126f6a9186cea07ede5fd719cfd433492ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.888219', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2800f0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'd8ffde8b82395dc1efc63967d1772b4da0e47a7e8491ceaa9550a345f412fcf9'}]}, 'timestamp': '2025-11-23 10:07:10.889205', '_unique_id': '137489f9b15244da902aa3d1dd066cfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff88234c-74c7-44d4-978d-ab8a1e9488c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.891863', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d287c1a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '1803a46c35ce7cd8167258a6cf0160675df80da67db7dac6fe6f64104a768bf4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.891863', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d28938a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '17328311758274f0ccb1c7d24335d3e50b7325d7709fc94105e340dd03bb2429'}]}, 'timestamp': '2025-11-23 10:07:10.893048', '_unique_id': '37531b99d83043f1b8f7389d63db4211'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d87fde-9430-498c-978e-8151fd66e6c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.895318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2af40e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '799dcfc4040890a05d24f9575006b444c9dd61b51988e5bf69565a98062807a8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.895318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2b0fe8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': 'f2a5484b774d529d25085012e6b8ccf0e6f7cb9e8f54fab1c1f62e0d541cd778'}]}, 'timestamp': '2025-11-23 10:07:10.909258', '_unique_id': '58a340c9dbb247c4a4c8218410ca8aa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8cdfebb-0aae-47cc-bbd4-7165dfa454c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.911755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2b8446-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '8cd5327536e1cdbb24e1fc94e11432c19ece66307cb0fbf0ede1700037f1b349'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.911755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2b9490-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'c868d0227d31e5c8230f5110a5ff712cb0fd210820c9a7a9036b2b11481636ae'}]}, 'timestamp': '2025-11-23 10:07:10.912677', '_unique_id': 'fa06a6803f9244699b00e94c6036f89a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1abd3eab-8554-4d28-9aa5-aef99a232b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.914867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2bfdea-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '0baa8921a4a50c16ea1f131a4606edbe7a045d8c6a924da7caeddc66b1330ef9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.914867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2c0dee-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'f869ccb597f5a9e81187f5245150f89677356378882cba1c65b8629e29f1e961'}]}, 'timestamp': '2025-11-23 10:07:10.915732', '_unique_id': '8df1a57963584ff195d243ecbc82a7b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c246764-65c4-4ce8-80ae-26d6346bb8ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.918274', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2c82b0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '15ae8ca5a53c61d8eb179e5ddcfdd78c3b511287e966e69629ab75bf1b1144f2'}]}, 'timestamp': '2025-11-23 10:07:10.918752', '_unique_id': '5235d88f1b794a0d951a9a349a693b04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bd6003a-fde6-416b-968e-b96587292186', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.921574', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2d03f2-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '665fa84ff51cf2059a6eee7e698bfb2ef370dc5986fad9587a48f5901655563e'}]}, 'timestamp': '2025-11-23 10:07:10.922162', '_unique_id': 'c31fbbb92ebd40279b8188f6c3c78dd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c62e230c-dc70-4b79-8952-f057f0a94d62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.923779', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2d5834-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '6724ef73740848bcdd8172b20eb246307ff130ee0f823ed2f1957d636c9ea65a'}]}, 'timestamp': '2025-11-23 10:07:10.924180', '_unique_id': 'aa2541e71c20446f9ad3d8e4ccebe222'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d25186a-47d9-4d71-a994-9eba4d134104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.925635', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2d9e3e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'cdcbe87e64549613eaf91b45a41fc11a8bb15f3be3d8486055d70cbdb549c891'}]}, 'timestamp': '2025-11-23 10:07:10.925945', '_unique_id': '7ec1f9eaa1ca4173b6e39453f6410da2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15f88fea-9dfb-4381-a82a-5413fa23e2be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.927450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2de51a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '49a6b9ce1a787d9ae5dc5f945d061bf25ace37eb456d58eb2b9b1b006471e349'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.927450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2def2e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '3dfcdbcaf280193bf5a1a6ceb0a8f80fc6602c59ea9992fd82989600efdad4b9'}]}, 'timestamp': '2025-11-23 10:07:10.928001', '_unique_id': '9dfcce2296024a72905adb94bfd2df40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb7e998d-1fdc-429d-b548-914528c9bd59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.929367', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2e2fde-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'c6d9cec669f13139885dd38d49866edbad00b5f915fb3721a3bd5317f28ea52f'}]}, 'timestamp': '2025-11-23 10:07:10.929672', '_unique_id': 'd83d542fcac046299064dbc0fa5369f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dc709d0-d6f1-49ec-a733-9069dd02599e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.931135', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2e7638-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'bc2c59b51f11d7c7cdad482c996cbcad033530d6f44f65d54bc5343ac6780487'}]}, 'timestamp': '2025-11-23 10:07:10.931467', '_unique_id': 'd44260105f9949069dae2d64690f3ffe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e321060-4e55-4221-8e42-cb2211aacbcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.933011', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2ebe40-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '4c735549a849b89e608c274814dcd845eb397d2dde795a09f6a66b388366304a'}]}, 'timestamp': '2025-11-23 10:07:10.933297', '_unique_id': '0a32465bec49487388b1b146d9cd8e6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dae0b80-dec5-4c6b-b326-afc1d0d2f4b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.934770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2f03aa-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '60c37a78afba2c2267fc8858a21a81c8408ea2dd991e983d8f13322a8829cbd5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.934770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2f0db4-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '95d6cbae914e000e25b46ea751a6d73fa3f3346a3f3408eb8320c901948227a2'}]}, 'timestamp': '2025-11-23 10:07:10.935361', '_unique_id': 'fdb7e5b9cd9546c38c75bb6392c90255'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '629130f0-a4a2-4821-9179-9b552bbbbac7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:07:10.936853', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2d2f5562-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.011661606, 'message_signature': '9b9aa5fe46d45c3dcdea27c85801d419ed3d01ec5793793ce52bd67874add10e'}]}, 'timestamp': '2025-11-23 10:07:10.937156', '_unique_id': '1a2bbfe107544d51ae8ef7281a8e3e98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:07:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:07:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:07:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:07:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:07:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 10:07:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:07:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:07:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19274 "" "Go-http-client/1.1"
Nov 23 10:07:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6f7ac182-1c2a-4635-8587-b5ecb5678c06", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:12 np0005532585.localdomain systemd[1]: tmp-crun.UmYHJv.mount: Deactivated successfully.
Nov 23 10:07:12 np0005532585.localdomain podman[334738]: 2025-11-23 10:07:12.039869767 +0000 UTC m=+0.095380964 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:07:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:07:12 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/217994253' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:12 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:07:12 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/217994253' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:12 np0005532585.localdomain podman[334738]: 2025-11-23 10:07:12.077421808 +0000 UTC m=+0.132932965 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:07:12 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:07:12 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:07:12 np0005532585.localdomain podman[334759]: 2025-11-23 10:07:12.196439853 +0000 UTC m=+0.080046362 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:07:12 np0005532585.localdomain podman[334759]: 2025-11-23 10:07:12.232809749 +0000 UTC m=+0.116416278 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm)
Nov 23 10:07:12 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: pgmap v511: 177 pgs: 177 active+clean; 982 MiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 73 KiB/s rd, 30 MiB/s wr, 120 op/s
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6f7ac182-1c2a-4635-8587-b5ecb5678c06", "format": "json"}]: dispatch
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/217994253' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/217994253' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "39ff48c9-2d4a-44e8-8b2a-3315c482323f", "format": "json"}]: dispatch
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "39ff48c9-2d4a-44e8-8b2a-3315c482323f", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e242 e242: 6 total, 6 up, 6 in
Nov 23 10:07:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:13.326 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:13.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:13.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:13.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:13.350 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:13.351 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: osdmap e242: 6 total, 6 up, 6 in
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3008702058' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3008702058' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:15 np0005532585.localdomain ceph-mon[300199]: pgmap v513: 177 pgs: 177 active+clean; 982 MiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 37 KiB/s rd, 16 MiB/s wr, 62 op/s
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: pgmap v514: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 83 KiB/s rd, 29 MiB/s wr, 133 op/s
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3e36dbf7-3485-45bc-a6bc-a043d0884a88", "format": "json"}]: dispatch
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3e36dbf7-3485-45bc-a6bc-a043d0884a88", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6f7ac182-1c2a-4635-8587-b5ecb5678c06", "format": "json"}]: dispatch
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6f7ac182-1c2a-4635-8587-b5ecb5678c06", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:07:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.352 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.353 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.354 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.354 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.383 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.384 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:18.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: pgmap v515: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 67 KiB/s rd, 23 MiB/s wr, 107 op/s
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "22bdd1a7-cbe9-40dd-ac66-a0446867da56", "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: pgmap v516: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 2.1 MiB/s rd, 22 MiB/s wr, 89 op/s
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:20 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e243 e243: 6 total, 6 up, 6 in
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.265 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.594 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.594 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.595 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:07:21 np0005532585.localdomain ceph-mon[300199]: osdmap e243: 6 total, 6 up, 6 in
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.669 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.670 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.670 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:07:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:21.671 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:07:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:22.131 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:07:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:22.149 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:07:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:22.150 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:07:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:22.151 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:22.151 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:22 np0005532585.localdomain ceph-mon[300199]: pgmap v518: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 2.5 MiB/s rd, 27 MiB/s wr, 106 op/s
Nov 23 10:07:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "22bdd1a7-cbe9-40dd-ac66-a0446867da56", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:22 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3208999179' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:22 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e244 e244: 6 total, 6 up, 6 in
Nov 23 10:07:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:23.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:23.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:23.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:23 np0005532585.localdomain ceph-mon[300199]: osdmap e244: 6 total, 6 up, 6 in
Nov 23 10:07:23 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2705449898' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:07:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:07:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:07:24 np0005532585.localdomain podman[334781]: 2025-11-23 10:07:24.049105489 +0000 UTC m=+0.095504229 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64)
Nov 23 10:07:24 np0005532585.localdomain podman[334779]: 2025-11-23 10:07:24.09631773 +0000 UTC m=+0.146746581 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 10:07:24 np0005532585.localdomain podman[334779]: 2025-11-23 10:07:24.144353018 +0000 UTC m=+0.194781929 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:07:24 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:07:24 np0005532585.localdomain podman[334781]: 2025-11-23 10:07:24.166727422 +0000 UTC m=+0.213126192 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 10:07:24 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:07:24 np0005532585.localdomain podman[334780]: 2025-11-23 10:07:24.148364298 +0000 UTC m=+0.197013445 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:07:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:24.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:24.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:24.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:07:24 np0005532585.localdomain podman[334780]: 2025-11-23 10:07:24.232477892 +0000 UTC m=+0.281126939 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:07:24 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: pgmap v520: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 13 MiB/s wr, 31 op/s
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/563644150' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/563644150' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/987695658' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:24 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/987695658' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:25 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:25.078 263258 INFO neutron.agent.linux.ip_lib [None req-f341419d-469a-487c-8bc9-f2728a88ac44 - - - - - -] Device tap93604dfc-74 cannot be used as it has no MAC address
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.136 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:25 np0005532585.localdomain kernel: device tap93604dfc-74 entered promiscuous mode
Nov 23 10:07:25 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892445.1451] manager: (tap93604dfc-74): new Generic device (/org/freedesktop/NetworkManager/Devices/87)
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.144 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:25 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:25Z|00548|binding|INFO|Claiming lport 93604dfc-7461-4299-8236-45aa7b97320e for this chassis.
Nov 23 10:07:25 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:25Z|00549|binding|INFO|93604dfc-7461-4299-8236-45aa7b97320e: Claiming unknown
Nov 23 10:07:25 np0005532585.localdomain systemd-udevd[334850]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:07:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:25.157 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f48fa865c4047a080902678e51be06e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5f5ef48e-aaab-411a-9f8a-ffb4deb9829f, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=93604dfc-7461-4299-8236-45aa7b97320e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:07:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:25.160 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 93604dfc-7461-4299-8236-45aa7b97320e in datapath 47f35c3f-de9a-4c96-9b45-f5782c8e0808 bound to our chassis
Nov 23 10:07:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:25.163 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port eece88c9-9692-4eda-92a8-4540577b0aac IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:07:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:25.164 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47f35c3f-de9a-4c96-9b45-f5782c8e0808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:07:25 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:25.165 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f9aa14-17fa-49e2-abdb-26c8af7fa6ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:25Z|00550|binding|INFO|Setting lport 93604dfc-7461-4299-8236-45aa7b97320e ovn-installed in OVS
Nov 23 10:07:25 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:25Z|00551|binding|INFO|Setting lport 93604dfc-7461-4299-8236-45aa7b97320e up in Southbound
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.186 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:25 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap93604dfc-74: No such device
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.227 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.235 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/872234812' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.690 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:07:25 np0005532585.localdomain dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 0 addresses
Nov 23 10:07:25 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 10:07:25 np0005532585.localdomain dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 10:07:25 np0005532585.localdomain systemd[1]: tmp-crun.NCFrQw.mount: Deactivated successfully.
Nov 23 10:07:25 np0005532585.localdomain podman[334930]: 2025-11-23 10:07:25.735326372 +0000 UTC m=+0.072980579 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: mgrmap e50: np0005532584.naxwxy(active, since 12m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2281623741' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2281623741' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "33decdd7-a730-4b1d-9649-96ef50bb9878", "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1381072346' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/872234812' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.781 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:07:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:25.782 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.013 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.015 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11061MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.015 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.016 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.086 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.086 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.086 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.147 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:26Z|00552|binding|INFO|Releasing lport 6d2dff40-048d-4175-8e87-c4c88e21141f from this chassis (sb_readonly=0)
Nov 23 10:07:26 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:26Z|00553|binding|INFO|Setting lport 6d2dff40-048d-4175-8e87-c4c88e21141f down in Southbound
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.227 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:26 np0005532585.localdomain kernel: device tap6d2dff40-04 left promiscuous mode
Nov 23 10:07:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:26.237 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33c4eecf43aa413a9f282206f9e9a55b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e2548aa-bcaa-4071-886c-d49df70c86b7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=6d2dff40-048d-4175-8e87-c4c88e21141f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:07:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:26.239 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6d2dff40-048d-4175-8e87-c4c88e21141f in datapath 1ebb6643-dd69-425a-84e7-f74c46a69f9f unbound from our chassis
Nov 23 10:07:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:26.242 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1ebb6643-dd69-425a-84e7-f74c46a69f9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:07:26 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:26.243 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ab7b47-5436-4c23-b7b7-2e7472829035]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:26 np0005532585.localdomain podman[334983]: 
Nov 23 10:07:26 np0005532585.localdomain podman[334983]: 2025-11-23 10:07:26.343371049 +0000 UTC m=+0.078412793 container create d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:07:26 np0005532585.localdomain systemd[1]: Started libpod-conmon-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790.scope.
Nov 23 10:07:26 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:07:26 np0005532585.localdomain podman[334983]: 2025-11-23 10:07:26.299435555 +0000 UTC m=+0.034477289 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:07:26 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f836387aefaba3bec188b51689961ff9b6a2c957f76e955e6437c8b363a115f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:07:26 np0005532585.localdomain podman[334983]: 2025-11-23 10:07:26.412617334 +0000 UTC m=+0.147659078 container init d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 10:07:26 np0005532585.localdomain podman[334983]: 2025-11-23 10:07:26.421547513 +0000 UTC m=+0.156589257 container start d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:07:26 np0005532585.localdomain dnsmasq[335020]: started, version 2.85 cachesize 150
Nov 23 10:07:26 np0005532585.localdomain dnsmasq[335020]: DNS service limited to local subnets
Nov 23 10:07:26 np0005532585.localdomain dnsmasq[335020]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:07:26 np0005532585.localdomain dnsmasq[335020]: warning: no upstream servers configured
Nov 23 10:07:26 np0005532585.localdomain dnsmasq-dhcp[335020]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:07:26 np0005532585.localdomain dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 0 addresses
Nov 23 10:07:26 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host
Nov 23 10:07:26 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts
Nov 23 10:07:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:07:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1410162980' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:26 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:26.569 263258 INFO neutron.agent.dhcp.agent [None req-5e7a3a60-401e-44ac-9642-6923c08b11e5 - - - - - -] DHCP configuration for ports {'94b64606-a070-4d9f-a082-720cda79ef20'} is completed
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.582 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.588 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.604 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.627 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:07:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:26.627 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:07:26 np0005532585.localdomain ceph-mon[300199]: pgmap v521: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 18 MiB/s wr, 193 op/s
Nov 23 10:07:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2535752392' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1410162980' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:07:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:27.119 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:07:26Z, description=, device_id=911d76ef-8c35-4e07-a4bf-332c60f42359, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afcc40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8afc8b0>], id=757091d9-2f57-42df-833a-3480faaa4574, ip_allocation=immediate, mac_address=fa:16:3e:34:8e:7d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:07:22Z, description=, dns_domain=, id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2143275487-network, port_security_enabled=True, project_id=7f48fa865c4047a080902678e51be06e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3388, status=ACTIVE, subnets=['8c56fce8-3a69-4c2c-9d2a-3dee5aae6f6a'], tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:23Z, vlan_transparent=None, network_id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, port_security_enabled=False, project_id=7f48fa865c4047a080902678e51be06e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3411, status=DOWN, tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:26Z on network 47f35c3f-de9a-4c96-9b45-f5782c8e0808
Nov 23 10:07:27 np0005532585.localdomain dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 1 addresses
Nov 23 10:07:27 np0005532585.localdomain podman[335039]: 2025-11-23 10:07:27.324620477 +0000 UTC m=+0.054659368 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:07:27 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host
Nov 23 10:07:27 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts
Nov 23 10:07:27 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:27.514 263258 INFO neutron.agent.dhcp.agent [None req-650de9cb-e1f6-4cee-ae94-d084be5bb08b - - - - - -] DHCP configuration for ports {'757091d9-2f57-42df-833a-3480faaa4574'} is completed
Nov 23 10:07:27 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:07:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:07:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e245 e245: 6 total, 6 up, 6 in
Nov 23 10:07:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:28.225 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:07:26Z, description=, device_id=911d76ef-8c35-4e07-a4bf-332c60f42359, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8cdd280>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4d8b0>], id=757091d9-2f57-42df-833a-3480faaa4574, ip_allocation=immediate, mac_address=fa:16:3e:34:8e:7d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:07:22Z, description=, dns_domain=, id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2143275487-network, port_security_enabled=True, project_id=7f48fa865c4047a080902678e51be06e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3388, status=ACTIVE, subnets=['8c56fce8-3a69-4c2c-9d2a-3dee5aae6f6a'], tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:23Z, vlan_transparent=None, network_id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, port_security_enabled=False, project_id=7f48fa865c4047a080902678e51be06e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3411, status=DOWN, tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:26Z on network 47f35c3f-de9a-4c96-9b45-f5782c8e0808
Nov 23 10:07:28 np0005532585.localdomain systemd[1]: tmp-crun.gIjIuB.mount: Deactivated successfully.
Nov 23 10:07:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:28.464 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:28 np0005532585.localdomain dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 1 addresses
Nov 23 10:07:28 np0005532585.localdomain podman[335077]: 2025-11-23 10:07:28.467128383 +0000 UTC m=+0.089018332 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:07:28 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host
Nov 23 10:07:28 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts
Nov 23 10:07:28 np0005532585.localdomain sshd[335099]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:07:28 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:28.814 263258 INFO neutron.agent.dhcp.agent [None req-7b6982cd-4091-40a7-bb11-ef619302e77f - - - - - -] DHCP configuration for ports {'757091d9-2f57-42df-833a-3480faaa4574'} is completed
Nov 23 10:07:29 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:29Z|00554|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:07:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:29.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:29 np0005532585.localdomain ceph-mon[300199]: pgmap v522: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 98 KiB/s rd, 4.7 MiB/s wr, 161 op/s
Nov 23 10:07:29 np0005532585.localdomain ceph-mon[300199]: osdmap e245: 6 total, 6 up, 6 in
Nov 23 10:07:29 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "33decdd7-a730-4b1d-9649-96ef50bb9878", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:29 np0005532585.localdomain sshd[335099]: Received disconnect from 107.172.15.139 port 45778:11: Bye Bye [preauth]
Nov 23 10:07:29 np0005532585.localdomain sshd[335099]: Disconnected from authenticating user root 107.172.15.139 port 45778 [preauth]
Nov 23 10:07:29 np0005532585.localdomain podman[335116]: 2025-11-23 10:07:29.700267729 +0000 UTC m=+0.064318988 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:07:29 np0005532585.localdomain systemd[1]: tmp-crun.ZxfNHE.mount: Deactivated successfully.
Nov 23 10:07:29 np0005532585.localdomain dnsmasq[334367]: exiting on receipt of SIGTERM
Nov 23 10:07:29 np0005532585.localdomain systemd[1]: libpod-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5.scope: Deactivated successfully.
Nov 23 10:07:29 np0005532585.localdomain podman[335128]: 2025-11-23 10:07:29.769786423 +0000 UTC m=+0.056471833 container died 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:07:29 np0005532585.localdomain podman[335128]: 2025-11-23 10:07:29.810694725 +0000 UTC m=+0.097380095 container cleanup 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:07:29 np0005532585.localdomain systemd[1]: libpod-conmon-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5.scope: Deactivated successfully.
Nov 23 10:07:29 np0005532585.localdomain podman[335130]: 2025-11-23 10:07:29.849554116 +0000 UTC m=+0.127836422 container remove 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:07:29 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:29.876 263258 INFO neutron.agent.dhcp.agent [None req-dd374a4a-4bcb-4679-9310-025822dfd3c4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:07:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:07:30 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:30.064 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:07:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-3bbb7d3b4428d8f687dacc74829ec1d2b43a77a0b9743c4105e027e24960cc16-merged.mount: Deactivated successfully.
Nov 23 10:07:30 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5-userdata-shm.mount: Deactivated successfully.
Nov 23 10:07:30 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d1ebb6643\x2ddd69\x2d425a\x2d84e7\x2df74c46a69f9f.mount: Deactivated successfully.
Nov 23 10:07:31 np0005532585.localdomain ceph-mon[300199]: pgmap v524: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 112 KiB/s rd, 4.8 MiB/s wr, 187 op/s
Nov 23 10:07:31 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:07:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:07:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:07:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:07:31 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:07:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:07:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:07:32 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:32.033 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:07:32 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:32.034 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:07:32 np0005532585.localdomain systemd[1]: tmp-crun.ajT5gZ.mount: Deactivated successfully.
Nov 23 10:07:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:32.069 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:32 np0005532585.localdomain podman[335157]: 2025-11-23 10:07:32.110197434 +0000 UTC m=+0.156397893 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:07:32 np0005532585.localdomain podman[335157]: 2025-11-23 10:07:32.123849515 +0000 UTC m=+0.170049954 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:07:32 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:07:32 np0005532585.localdomain podman[335156]: 2025-11-23 10:07:32.075959332 +0000 UTC m=+0.122005506 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 23 10:07:32 np0005532585.localdomain podman[335156]: 2025-11-23 10:07:32.206445052 +0000 UTC m=+0.252491176 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:07:32 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:07:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:33 np0005532585.localdomain ceph-mon[300199]: pgmap v525: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 4.3 MiB/s wr, 169 op/s
Nov 23 10:07:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9757f6f1-7e41-4cb6-a23b-fdd5de1e9889", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9757f6f1-7e41-4cb6-a23b-fdd5de1e9889", "format": "json"}]: dispatch
Nov 23 10:07:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e246 e246: 6 total, 6 up, 6 in
Nov 23 10:07:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:33.505 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:33.624 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8dd48fc8-4cc9-428a-b704-1fe8f3b6e8e6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: osdmap e246: 6 total, 6 up, 6 in
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8dd48fc8-4cc9-428a-b704-1fe8f3b6e8e6", "format": "json"}]: dispatch
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e247 e247: 6 total, 6 up, 6 in
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: pgmap v527: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 36 KiB/s wr, 25 op/s
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/174658289' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/174658289' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: osdmap e247: 6 total, 6 up, 6 in
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2326622654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:07:35 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2326622654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9757f6f1-7e41-4cb6-a23b-fdd5de1e9889", "format": "json"}]: dispatch
Nov 23 10:07:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9757f6f1-7e41-4cb6-a23b-fdd5de1e9889", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:36 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2326622654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:36 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2326622654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:37.036 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/491875440' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/491875440' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: pgmap v529: 177 pgs: 177 active+clean; 221 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 2.9 MiB/s wr, 97 op/s
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/491875440' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/491875440' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:37 np0005532585.localdomain sshd[335198]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:07:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8dd48fc8-4cc9-428a-b704-1fe8f3b6e8e6", "format": "json"}]: dispatch
Nov 23 10:07:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8dd48fc8-4cc9-428a-b704-1fe8f3b6e8e6", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:07:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:07:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:38.507 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:38.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:38.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:38.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:38.543 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:38.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:39 np0005532585.localdomain sshd[335198]: Received disconnect from 175.126.166.172 port 42560:11: Bye Bye [preauth]
Nov 23 10:07:39 np0005532585.localdomain sshd[335198]: Disconnected from authenticating user root 175.126.166.172 port 42560 [preauth]
Nov 23 10:07:39 np0005532585.localdomain ceph-mon[300199]: pgmap v530: 177 pgs: 177 active+clean; 221 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.6 MiB/s wr, 63 op/s
Nov 23 10:07:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:39 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:39Z|00555|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:07:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:39.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:40 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 23 10:07:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "dd86bcfb-cedc-449f-b13c-69c8336baf06", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "dd86bcfb-cedc-449f-b13c-69c8336baf06", "format": "json"}]: dispatch
Nov 23 10:07:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:07:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:40 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:41 np0005532585.localdomain ceph-mon[300199]: pgmap v531: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 2.7 MiB/s wr, 131 op/s
Nov 23 10:07:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:07:41 np0005532585.localdomain podman[335217]: 2025-11-23 10:07:41.608027606 +0000 UTC m=+0.064951767 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 10:07:41 np0005532585.localdomain dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 0 addresses
Nov 23 10:07:41 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host
Nov 23 10:07:41 np0005532585.localdomain dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts
Nov 23 10:07:41 np0005532585.localdomain systemd[1]: tmp-crun.JNEBJD.mount: Deactivated successfully.
Nov 23 10:07:41 np0005532585.localdomain kernel: device tap93604dfc-74 left promiscuous mode
Nov 23 10:07:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:41Z|00556|binding|INFO|Releasing lport 93604dfc-7461-4299-8236-45aa7b97320e from this chassis (sb_readonly=0)
Nov 23 10:07:41 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:41Z|00557|binding|INFO|Setting lport 93604dfc-7461-4299-8236-45aa7b97320e down in Southbound
Nov 23 10:07:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:41.798 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:41.808 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f48fa865c4047a080902678e51be06e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5f5ef48e-aaab-411a-9f8a-ffb4deb9829f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=93604dfc-7461-4299-8236-45aa7b97320e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:07:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:41.811 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 93604dfc-7461-4299-8236-45aa7b97320e in datapath 47f35c3f-de9a-4c96-9b45-f5782c8e0808 unbound from our chassis
Nov 23 10:07:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:41.814 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47f35c3f-de9a-4c96-9b45-f5782c8e0808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:07:41 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:07:41.815 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d39f61b2-4da1-40ff-9ccd-ee4ad4923457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:07:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:41.820 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:41 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:41.822 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:07:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:07:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:07:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 10:07:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:07:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19261 "" "Go-http-client/1.1"
Nov 23 10:07:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:07:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:07:42 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:07:42Z|00558|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:07:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:43.060 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:43 np0005532585.localdomain podman[335240]: 2025-11-23 10:07:43.066632404 +0000 UTC m=+0.125159202 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:07:43 np0005532585.localdomain podman[335241]: 2025-11-23 10:07:43.105356421 +0000 UTC m=+0.160659531 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:07:43 np0005532585.localdomain podman[335241]: 2025-11-23 10:07:43.144281273 +0000 UTC m=+0.199584373 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 10:07:43 np0005532585.localdomain podman[335240]: 2025-11-23 10:07:43.155397958 +0000 UTC m=+0.213924766 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:07:43 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:07:43 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:07:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e248 e248: 6 total, 6 up, 6 in
Nov 23 10:07:43 np0005532585.localdomain dnsmasq[335020]: exiting on receipt of SIGTERM
Nov 23 10:07:43 np0005532585.localdomain podman[335299]: 2025-11-23 10:07:43.43135151 +0000 UTC m=+0.064415761 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 10:07:43 np0005532585.localdomain systemd[1]: libpod-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790.scope: Deactivated successfully.
Nov 23 10:07:43 np0005532585.localdomain ceph-mon[300199]: pgmap v532: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 2.6 MiB/s wr, 127 op/s
Nov 23 10:07:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dd86bcfb-cedc-449f-b13c-69c8336baf06", "format": "json"}]: dispatch
Nov 23 10:07:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "dd86bcfb-cedc-449f-b13c-69c8336baf06", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:43 np0005532585.localdomain ceph-mon[300199]: osdmap e248: 6 total, 6 up, 6 in
Nov 23 10:07:43 np0005532585.localdomain podman[335311]: 2025-11-23 10:07:43.50338691 +0000 UTC m=+0.060606167 container died d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:07:43 np0005532585.localdomain podman[335311]: 2025-11-23 10:07:43.533822697 +0000 UTC m=+0.091041914 container cleanup d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:07:43 np0005532585.localdomain systemd[1]: libpod-conmon-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790.scope: Deactivated successfully.
Nov 23 10:07:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:43.546 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:43.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:43 np0005532585.localdomain podman[335313]: 2025-11-23 10:07:43.591318099 +0000 UTC m=+0.140616407 container remove d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:07:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:43.615 263258 INFO neutron.agent.dhcp.agent [None req-8522ce37-688f-49e9-b944-3c116d7c1182 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:07:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:07:43.616 263258 INFO neutron.agent.dhcp.agent [None req-8522ce37-688f-49e9-b944-3c116d7c1182 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:07:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-f836387aefaba3bec188b51689961ff9b6a2c957f76e955e6437c8b363a115f6-merged.mount: Deactivated successfully.
Nov 23 10:07:44 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790-userdata-shm.mount: Deactivated successfully.
Nov 23 10:07:44 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d47f35c3f\x2dde9a\x2d4c96\x2d9b45\x2df5782c8e0808.mount: Deactivated successfully.
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e249 e249: 6 total, 6 up, 6 in
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: osdmap e249: 6 total, 6 up, 6 in
Nov 23 10:07:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: pgmap v534: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 55 KiB/s wr, 61 op/s
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "format": "json"}]: dispatch
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "format": "json"}]: dispatch
Nov 23 10:07:46 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e250 e250: 6 total, 6 up, 6 in
Nov 23 10:07:47 np0005532585.localdomain sudo[335339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:07:47 np0005532585.localdomain sudo[335339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:07:47 np0005532585.localdomain sudo[335339]: pam_unix(sudo:session): session closed for user root
Nov 23 10:07:47 np0005532585.localdomain sudo[335357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:07:47 np0005532585.localdomain sudo[335357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:07:47 np0005532585.localdomain sshd[335375]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:07:47 np0005532585.localdomain ceph-mon[300199]: pgmap v536: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 138 KiB/s wr, 87 op/s
Nov 23 10:07:47 np0005532585.localdomain ceph-mon[300199]: osdmap e250: 6 total, 6 up, 6 in
Nov 23 10:07:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:47 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:48 np0005532585.localdomain sudo[335357]: pam_unix(sudo:session): session closed for user root
Nov 23 10:07:48 np0005532585.localdomain sshd[335375]: Invalid user p from 207.154.194.2 port 56184
Nov 23 10:07:48 np0005532585.localdomain sshd[335375]: Received disconnect from 207.154.194.2 port 56184:11: Bye Bye [preauth]
Nov 23 10:07:48 np0005532585.localdomain sshd[335375]: Disconnected from invalid user p 207.154.194.2 port 56184 [preauth]
Nov 23 10:07:48 np0005532585.localdomain sudo[335409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:07:48 np0005532585.localdomain sudo[335409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:07:48 np0005532585.localdomain sudo[335409]: pam_unix(sudo:session): session closed for user root
Nov 23 10:07:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:48.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:48.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:48.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:48.552 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:48.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:48.556 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "snap_name": "faa5e687-7f2b-4bd4-b60d-23c771ca94bd", "format": "json"}]: dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "snap_name": "87f14237-1545-404e-9124-daefa8c44022", "format": "json"}]: dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: pgmap v538: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 104 KiB/s wr, 26 op/s
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/41927298' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:07:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/41927298' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:07:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:07:50 np0005532585.localdomain ceph-mon[300199]: pgmap v539: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 154 KiB/s wr, 74 op/s
Nov 23 10:07:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:07:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:07:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "snap_name": "1d874a0b-7765-41f2-a385-f7185c843be2", "format": "json"}]: dispatch
Nov 23 10:07:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "snap_name": "87f14237-1545-404e-9124-daefa8c44022_0bb34153-e629-41c3-8e6d-cc12429b03ff", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "snap_name": "87f14237-1545-404e-9124-daefa8c44022", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:52 np0005532585.localdomain ceph-mon[300199]: pgmap v540: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 122 KiB/s wr, 58 op/s
Nov 23 10:07:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e251 e251: 6 total, 6 up, 6 in
Nov 23 10:07:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:53.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:54 np0005532585.localdomain ceph-mon[300199]: osdmap e251: 6 total, 6 up, 6 in
Nov 23 10:07:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:07:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:07:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:07:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:07:54 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:07:55 np0005532585.localdomain systemd[1]: tmp-crun.DnZIWP.mount: Deactivated successfully.
Nov 23 10:07:55 np0005532585.localdomain podman[335428]: 2025-11-23 10:07:55.094060467 +0000 UTC m=+0.144586235 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 23 10:07:55 np0005532585.localdomain podman[335428]: 2025-11-23 10:07:55.103319257 +0000 UTC m=+0.153845045 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:07:55 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:07:55 np0005532585.localdomain podman[335427]: 2025-11-23 10:07:55.186876314 +0000 UTC m=+0.237161595 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:07:55 np0005532585.localdomain podman[335429]: 2025-11-23 10:07:55.057144765 +0000 UTC m=+0.103493168 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, managed_by=edpm_ansible)
Nov 23 10:07:55 np0005532585.localdomain podman[335429]: 2025-11-23 10:07:55.243452628 +0000 UTC m=+0.289801081 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Nov 23 10:07:55 np0005532585.localdomain podman[335427]: 2025-11-23 10:07:55.25649776 +0000 UTC m=+0.306783001 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:07:55 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: pgmap v542: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 44 KiB/s wr, 38 op/s
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "snap_name": "1d874a0b-7765-41f2-a385-f7185c843be2_ee5d1a85-59c9-41ad-a62d-2d9f920dd3f4", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "snap_name": "1d874a0b-7765-41f2-a385-f7185c843be2", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0e32d4cd-9733-47da-92f6-d332e549133b", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:55 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:07:56 np0005532585.localdomain systemd[1]: tmp-crun.DkAK4M.mount: Deactivated successfully.
Nov 23 10:07:57 np0005532585.localdomain ceph-mon[300199]: pgmap v543: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 107 KiB/s wr, 42 op/s
Nov 23 10:07:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:07:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:07:57 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:07:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:07:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "snap_name": "faa5e687-7f2b-4bd4-b60d-23c771ca94bd_6277c15f-b0cf-47a5-ab1e-cf7a29aa510b", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "snap_name": "faa5e687-7f2b-4bd4-b60d-23c771ca94bd", "force": true, "format": "json"}]: dispatch
Nov 23 10:07:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:58.557 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:58.559 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:07:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:58.559 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:07:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:58.560 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:58.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:07:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:07:58.593 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:07:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e252 e252: 6 total, 6 up, 6 in
Nov 23 10:07:59 np0005532585.localdomain ceph-mon[300199]: pgmap v544: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 96 KiB/s wr, 37 op/s
Nov 23 10:07:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "snap_name": "b0e1eebd-2fc1-48d8-874f-e40a8f23a682", "format": "json"}]: dispatch
Nov 23 10:07:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:07:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:07:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:07:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:07:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:07:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:07:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3643344025' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3643344025' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: osdmap e252: 6 total, 6 up, 6 in
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3643344025' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/3643344025' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.376458) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480376490, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1381, "num_deletes": 254, "total_data_size": 1935404, "memory_usage": 1959552, "flush_reason": "Manual Compaction"}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480386164, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1268743, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31587, "largest_seqno": 32963, "table_properties": {"data_size": 1262817, "index_size": 3076, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15367, "raw_average_key_size": 21, "raw_value_size": 1249959, "raw_average_value_size": 1775, "num_data_blocks": 133, "num_entries": 704, "num_filter_entries": 704, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892428, "oldest_key_time": 1763892428, "file_creation_time": 1763892480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 9762 microseconds, and 4767 cpu microseconds.
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.386215) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1268743 bytes OK
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.386239) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.388782) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.388806) EVENT_LOG_v1 {"time_micros": 1763892480388799, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.388826) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 1928387, prev total WAL file size 1928997, number of live WAL files 2.
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.391716) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1239KB)], [54(17MB)]
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480391872, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20062938, "oldest_snapshot_seqno": -1}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13678 keys, 18455108 bytes, temperature: kUnknown
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480489842, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 18455108, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18375768, "index_size": 44039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34245, "raw_key_size": 368943, "raw_average_key_size": 26, "raw_value_size": 18141616, "raw_average_value_size": 1326, "num_data_blocks": 1637, "num_entries": 13678, "num_filter_entries": 13678, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.490228) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 18455108 bytes
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.493873) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 204.6 rd, 188.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 17.9 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(30.4) write-amplify(14.5) OK, records in: 14213, records dropped: 535 output_compression: NoCompression
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.493969) EVENT_LOG_v1 {"time_micros": 1763892480493953, "job": 32, "event": "compaction_finished", "compaction_time_micros": 98060, "compaction_time_cpu_micros": 56004, "output_level": 6, "num_output_files": 1, "total_output_size": 18455108, "num_input_records": 14213, "num_output_records": 13678, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480494376, "job": 32, "event": "table_file_deletion", "file_number": 56}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480497231, "job": 32, "event": "table_file_deletion", "file_number": 54}
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.391514) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:08:00 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:08:01 np0005532585.localdomain ceph-mon[300199]: pgmap v546: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 143 KiB/s wr, 14 op/s
Nov 23 10:08:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:08:02 np0005532585.localdomain podman[335490]: 2025-11-23 10:08:02.296967343 +0000 UTC m=+0.088442926 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:08:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:08:02 np0005532585.localdomain podman[335490]: 2025-11-23 10:08:02.3347391 +0000 UTC m=+0.126214633 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:08:02 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:08:02 np0005532585.localdomain systemd[1]: tmp-crun.pzolgs.mount: Deactivated successfully.
Nov 23 10:08:02 np0005532585.localdomain podman[335512]: 2025-11-23 10:08:02.405079149 +0000 UTC m=+0.083416893 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:08:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "format": "json"}]: dispatch
Nov 23 10:08:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bcb84003-ea75-4d99-8d94-e5e6e07ace02", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:02 np0005532585.localdomain podman[335512]: 2025-11-23 10:08:02.417305448 +0000 UTC m=+0.095643182 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:08:02 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:08:03 np0005532585.localdomain ceph-mon[300199]: pgmap v547: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 491 B/s rd, 138 KiB/s wr, 14 op/s
Nov 23 10:08:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "snap_name": "b0e1eebd-2fc1-48d8-874f-e40a8f23a682_a21d3a0e-bd0b-4f17-8e3f-05e089d29de3", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "snap_name": "b0e1eebd-2fc1-48d8-874f-e40a8f23a682", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:03.594 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:03.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:03.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:03.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:03.634 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:03.635 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:08:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:08:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:05 np0005532585.localdomain ceph-mon[300199]: pgmap v548: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 115 KiB/s wr, 11 op/s
Nov 23 10:08:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "format": "json"}]: dispatch
Nov 23 10:08:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4206c3a8-effc-409e-87ea-4b9d53523041", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:07 np0005532585.localdomain ceph-mon[300199]: pgmap v549: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 118 KiB/s wr, 11 op/s
Nov 23 10:08:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e253 e253: 6 total, 6 up, 6 in
Nov 23 10:08:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:08:08 np0005532585.localdomain ceph-mon[300199]: osdmap e253: 6 total, 6 up, 6 in
Nov 23 10:08:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:08.636 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:08.637 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:08.638 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:08.638 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:08.659 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:08.660 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e254 e254: 6 total, 6 up, 6 in
Nov 23 10:08:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:08:09.305 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:08:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:08:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:08:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:08:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:08:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:09 np0005532585.localdomain ceph-mon[300199]: pgmap v550: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 118 KiB/s wr, 11 op/s
Nov 23 10:08:09 np0005532585.localdomain ceph-mon[300199]: osdmap e254: 6 total, 6 up, 6 in
Nov 23 10:08:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: pgmap v553: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 143 KiB/s wr, 14 op/s
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cd85cd26-79d5-413c-8071-7faf10d1a803", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cd85cd26-79d5-413c-8071-7faf10d1a803", "format": "json"}]: dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "format": "json"}]: dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:08:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:08:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:08:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:08:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:08:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:08:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18781 "" "Go-http-client/1.1"
Nov 23 10:08:12 np0005532585.localdomain ceph-mon[300199]: pgmap v554: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 143 KiB/s wr, 14 op/s
Nov 23 10:08:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ad2078c1-9958-4c4f-9787-6f0cb9391b42", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ad2078c1-9958-4c4f-9787-6f0cb9391b42", "format": "json"}]: dispatch
Nov 23 10:08:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:13.660 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:13.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:13.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:13.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:13.689 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:13 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:13.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:13 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:08:13Z|00559|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 23 10:08:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:08:13 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:08:14 np0005532585.localdomain podman[335531]: 2025-11-23 10:08:14.04834855 +0000 UTC m=+0.104308384 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:08:14 np0005532585.localdomain podman[335532]: 2025-11-23 10:08:14.088412416 +0000 UTC m=+0.140592466 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:08:14 np0005532585.localdomain podman[335532]: 2025-11-23 10:08:14.095205211 +0000 UTC m=+0.147385211 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible)
Nov 23 10:08:14 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:08:14 np0005532585.localdomain podman[335531]: 2025-11-23 10:08:14.112328126 +0000 UTC m=+0.168287960 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:08:14 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:08:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:08:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:15 np0005532585.localdomain ceph-mon[300199]: pgmap v555: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 63 KiB/s wr, 5 op/s
Nov 23 10:08:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "snap_name": "28900e81-a351-45cb-9e4a-bbc830e56ec4", "format": "json"}]: dispatch
Nov 23 10:08:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cd85cd26-79d5-413c-8071-7faf10d1a803", "format": "json"}]: dispatch
Nov 23 10:08:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cd85cd26-79d5-413c-8071-7faf10d1a803", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0e4b9e6e-03a5-4c23-a27b-85faef83156b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0e4b9e6e-03a5-4c23-a27b-85faef83156b", "format": "json"}]: dispatch
Nov 23 10:08:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:17 np0005532585.localdomain ceph-mon[300199]: pgmap v556: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 163 KiB/s wr, 13 op/s
Nov 23 10:08:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:08:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:08:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:08:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:18.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:18.692 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:18.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:18.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:18.721 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:18 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:18.722 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e255 e255: 6 total, 6 up, 6 in
Nov 23 10:08:18 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:08:18 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:08:18 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "snap_name": "28900e81-a351-45cb-9e4a-bbc830e56ec4_6f083e24-ed97-436a-a5aa-b0c285833d8a", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:18 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "snap_name": "28900e81-a351-45cb-9e4a-bbc830e56ec4", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:19.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:19 np0005532585.localdomain ceph-mon[300199]: pgmap v557: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 439 B/s rd, 140 KiB/s wr, 11 op/s
Nov 23 10:08:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0e4b9e6e-03a5-4c23-a27b-85faef83156b", "format": "json"}]: dispatch
Nov 23 10:08:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0e4b9e6e-03a5-4c23-a27b-85faef83156b", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:19 np0005532585.localdomain ceph-mon[300199]: osdmap e255: 6 total, 6 up, 6 in
Nov 23 10:08:20 np0005532585.localdomain ceph-mon[300199]: pgmap v559: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 142 KiB/s wr, 11 op/s
Nov 23 10:08:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:08:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:08:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:21 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:21.223 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "format": "json"}]: dispatch
Nov 23 10:08:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b2b184f0-0dcc-43ed-9860-79f9073a2732", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.358 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.359 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.360 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.360 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.853 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.875 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.876 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:08:22 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:22.876 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:23 np0005532585.localdomain ceph-mon[300199]: pgmap v560: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 142 KiB/s wr, 11 op/s
Nov 23 10:08:23 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ad2078c1-9958-4c4f-9787-6f0cb9391b42", "format": "json"}]: dispatch
Nov 23 10:08:23 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ad2078c1-9958-4c4f-9787-6f0cb9391b42", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:23.722 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:23.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:23.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:23.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:23.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:23.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e256 e256: 6 total, 6 up, 6 in
Nov 23 10:08:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:08:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:08:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:08:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.239 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.240 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.241 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: pgmap v561: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 142 KiB/s wr, 11 op/s
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: osdmap e256: 6 total, 6 up, 6 in
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2263582808' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:08:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/371791075' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.689 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.748 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.748 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.933 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.934 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11067MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.935 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:08:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:25.935 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:08:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:08:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:08:25 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:08:26 np0005532585.localdomain podman[335597]: 2025-11-23 10:08:26.043240681 +0000 UTC m=+0.093756725 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:08:26 np0005532585.localdomain podman[335597]: 2025-11-23 10:08:26.075668158 +0000 UTC m=+0.126184202 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 10:08:26 np0005532585.localdomain systemd[1]: tmp-crun.snWwBl.mount: Deactivated successfully.
Nov 23 10:08:26 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:08:26 np0005532585.localdomain podman[335598]: 2025-11-23 10:08:26.094273258 +0000 UTC m=+0.140693369 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:08:26 np0005532585.localdomain podman[335598]: 2025-11-23 10:08:26.126294553 +0000 UTC m=+0.172714634 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:08:26 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.174 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.174 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.175 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:08:26 np0005532585.localdomain podman[335599]: 2025-11-23 10:08:26.200043855 +0000 UTC m=+0.245116425 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Nov 23 10:08:26 np0005532585.localdomain podman[335599]: 2025-11-23 10:08:26.220963685 +0000 UTC m=+0.266036265 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 23 10:08:26 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "format": "json"}]: dispatch
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7bed2221-edb9-43df-b343-216aa2aa2b37", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/371791075' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3762470076' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.404 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:08:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2136971538' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.892 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.898 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.915 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.951 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:08:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:26.951 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:08:27 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7bed2221-edb9-43df-b343-216aa2aa2b37", "format": "json"}]: dispatch
Nov 23 10:08:27 np0005532585.localdomain ceph-mon[300199]: pgmap v563: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 163 KiB/s wr, 14 op/s
Nov 23 10:08:27 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2840074883' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:27 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2136971538' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:08:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:27.952 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:27.953 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:28 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3185027811' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:08:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:28.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:28.765 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:28.765 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:28.765 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:28.801 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:28.802 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:29 np0005532585.localdomain ceph-mon[300199]: pgmap v564: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 151 KiB/s wr, 13 op/s
Nov 23 10:08:29 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:29 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:08:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:08:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "11d20cc2-2db4-4032-9422-8ec84b489c39", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "11d20cc2-2db4-4032-9422-8ec84b489c39", "format": "json"}]: dispatch
Nov 23 10:08:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7bed2221-edb9-43df-b343-216aa2aa2b37", "format": "json"}]: dispatch
Nov 23 10:08:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7bed2221-edb9-43df-b343-216aa2aa2b37", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "format": "json"}]: dispatch
Nov 23 10:08:31 np0005532585.localdomain ceph-mon[300199]: pgmap v565: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 146 KiB/s wr, 13 op/s
Nov 23 10:08:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:08:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:08:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:08:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:08:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:08:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/f99c71c2-29eb-4c61-ab24-07800f6e4b24/458725ea-f0f9-482a-b03b-0e03f2cd7019", "osd", "allow rw pool=manila_data namespace=fsvolumens_f99c71c2-29eb-4c61-ab24-07800f6e4b24", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/f99c71c2-29eb-4c61-ab24-07800f6e4b24/458725ea-f0f9-482a-b03b-0e03f2cd7019", "osd", "allow rw pool=manila_data namespace=fsvolumens_f99c71c2-29eb-4c61-ab24-07800f6e4b24", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:08:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:08:33 np0005532585.localdomain systemd[1]: tmp-crun.3fmRYD.mount: Deactivated successfully.
Nov 23 10:08:33 np0005532585.localdomain podman[335682]: 2025-11-23 10:08:33.03482567 +0000 UTC m=+0.084003631 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 23 10:08:33 np0005532585.localdomain podman[335683]: 2025-11-23 10:08:33.100196389 +0000 UTC m=+0.147922246 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:08:33 np0005532585.localdomain podman[335682]: 2025-11-23 10:08:33.128313156 +0000 UTC m=+0.177491147 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 23 10:08:33 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:08:33 np0005532585.localdomain podman[335683]: 2025-11-23 10:08:33.185061486 +0000 UTC m=+0.232787363 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:08:33 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.232 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 10:08:33 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 e257: 6 total, 6 up, 6 in
Nov 23 10:08:33 np0005532585.localdomain ceph-mon[300199]: pgmap v566: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 146 KiB/s wr, 13 op/s
Nov 23 10:08:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:33 np0005532585.localdomain ceph-mon[300199]: osdmap e257: 6 total, 6 up, 6 in
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.847 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:33.848 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:08:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:35 np0005532585.localdomain ceph-mon[300199]: pgmap v568: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 989 B/s rd, 157 KiB/s wr, 14 op/s
Nov 23 10:08:35 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:08:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:36.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:08:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:36.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 10:08:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:08:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: pgmap v569: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 141 KiB/s wr, 11 op/s
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "format": "json"}]: dispatch
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f99c71c2-29eb-4c61-ab24-07800f6e4b24", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "11d20cc2-2db4-4032-9422-8ec84b489c39", "format": "json"}]: dispatch
Nov 23 10:08:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "11d20cc2-2db4-4032-9422-8ec84b489c39", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:08:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:08:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:08:38 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:08:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:38.849 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:38.851 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:38.851 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:38.851 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:38.880 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:38 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:38.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:39 np0005532585.localdomain ceph-mon[300199]: pgmap v570: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 141 KiB/s wr, 11 op/s
Nov 23 10:08:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:08:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "format": "json"}]: dispatch
Nov 23 10:08:41 np0005532585.localdomain ceph-mon[300199]: pgmap v571: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 144 KiB/s wr, 12 op/s
Nov 23 10:08:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:41 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:08:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:08:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:08:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:08:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:08:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18786 "" "Go-http-client/1.1"
Nov 23 10:08:42 np0005532585.localdomain ceph-mon[300199]: pgmap v572: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 144 KiB/s wr, 12 op/s
Nov 23 10:08:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/0066f586-ad80-42ee-9cb5-57bd65fc15e2/7acd4543-b091-4fe9-abe4-01b783b9e751", "osd", "allow rw pool=manila_data namespace=fsvolumens_0066f586-ad80-42ee-9cb5-57bd65fc15e2", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/0066f586-ad80-42ee-9cb5-57bd65fc15e2/7acd4543-b091-4fe9-abe4-01b783b9e751", "osd", "allow rw pool=manila_data namespace=fsvolumens_0066f586-ad80-42ee-9cb5-57bd65fc15e2", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:43.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:43.910 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:43.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:43.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:43.912 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:43.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:08:44 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:08:45 np0005532585.localdomain systemd[1]: tmp-crun.8snAH1.mount: Deactivated successfully.
Nov 23 10:08:45 np0005532585.localdomain podman[335723]: 2025-11-23 10:08:45.043682448 +0000 UTC m=+0.092415435 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:08:45 np0005532585.localdomain podman[335723]: 2025-11-23 10:08:45.080476256 +0000 UTC m=+0.129209263 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:08:45 np0005532585.localdomain systemd[1]: tmp-crun.wQRhyl.mount: Deactivated successfully.
Nov 23 10:08:45 np0005532585.localdomain podman[335724]: 2025-11-23 10:08:45.097969413 +0000 UTC m=+0.143437002 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:08:45 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:08:45 np0005532585.localdomain podman[335724]: 2025-11-23 10:08:45.133562965 +0000 UTC m=+0.179030554 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 23 10:08:45 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:08:45 np0005532585.localdomain ceph-mon[300199]: pgmap v573: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 695 B/s rd, 140 KiB/s wr, 12 op/s
Nov 23 10:08:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:08:45 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:08:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:08:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:08:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:08:46.895 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:08:46 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:08:46.897 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:08:46 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:46.899 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:47 np0005532585.localdomain ceph-mon[300199]: pgmap v574: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 169 KiB/s wr, 15 op/s
Nov 23 10:08:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:08:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:08:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "format": "json"}]: dispatch
Nov 23 10:08:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0066f586-ad80-42ee-9cb5-57bd65fc15e2", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:48 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:48 np0005532585.localdomain sudo[335765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:08:48 np0005532585.localdomain sudo[335765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:08:48 np0005532585.localdomain sudo[335765]: pam_unix(sudo:session): session closed for user root
Nov 23 10:08:48 np0005532585.localdomain sudo[335783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:08:48 np0005532585.localdomain sudo[335783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:08:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:48.937 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:48 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:48.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:49 np0005532585.localdomain sudo[335783]: pam_unix(sudo:session): session closed for user root
Nov 23 10:08:49 np0005532585.localdomain sshd[335832]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:08:49 np0005532585.localdomain ceph-mon[300199]: pgmap v575: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 117 KiB/s wr, 10 op/s
Nov 23 10:08:49 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:08:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:49 np0005532585.localdomain sudo[335833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:08:49 np0005532585.localdomain sudo[335833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:08:49 np0005532585.localdomain sudo[335833]: pam_unix(sudo:session): session closed for user root
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "format": "json"}]: dispatch
Nov 23 10:08:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:50 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:08:50.899 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:08:51 np0005532585.localdomain ceph-mon[300199]: pgmap v576: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 178 KiB/s wr, 15 op/s
Nov 23 10:08:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:08:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:08:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:08:51 np0005532585.localdomain sshd[335852]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:08:52 np0005532585.localdomain sshd[335852]: Invalid user ftpuser from 107.172.15.139 port 49876
Nov 23 10:08:52 np0005532585.localdomain sshd[335852]: Received disconnect from 107.172.15.139 port 49876:11: Bye Bye [preauth]
Nov 23 10:08:52 np0005532585.localdomain sshd[335852]: Disconnected from invalid user ftpuser 107.172.15.139 port 49876 [preauth]
Nov 23 10:08:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:08:52 np0005532585.localdomain sshd[335832]: Connection closed by authenticating user root 185.156.73.233 port 36782 [preauth]
Nov 23 10:08:53 np0005532585.localdomain ceph-mon[300199]: pgmap v577: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 111 KiB/s wr, 9 op/s
Nov 23 10:08:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/01ed1422-dec9-4d44-991d-9583c95296ac/7b7be716-7e6a-49c4-a977-3707f1cc08b5", "osd", "allow rw pool=manila_data namespace=fsvolumens_01ed1422-dec9-4d44-991d-9583c95296ac", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/01ed1422-dec9-4d44-991d-9583c95296ac/7b7be716-7e6a-49c4-a977-3707f1cc08b5", "osd", "allow rw pool=manila_data namespace=fsvolumens_01ed1422-dec9-4d44-991d-9583c95296ac", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:53.969 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:53.971 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:53.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:53.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:53.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:53.977 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:54 np0005532585.localdomain sshd[335854]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:08:55 np0005532585.localdomain sshd[335854]: Invalid user tiptop from 207.154.194.2 port 52514
Nov 23 10:08:55 np0005532585.localdomain ceph-mon[300199]: pgmap v578: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 111 KiB/s wr, 9 op/s
Nov 23 10:08:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "185a1676-923a-42e3-bf21-4993a35dc3a8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "185a1676-923a-42e3-bf21-4993a35dc3a8", "format": "json"}]: dispatch
Nov 23 10:08:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:55 np0005532585.localdomain sshd[335854]: Received disconnect from 207.154.194.2 port 52514:11: Bye Bye [preauth]
Nov 23 10:08:55 np0005532585.localdomain sshd[335854]: Disconnected from invalid user tiptop 207.154.194.2 port 52514 [preauth]
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: pgmap v579: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 166 KiB/s wr, 14 op/s
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "format": "json"}]: dispatch
Nov 23 10:08:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "01ed1422-dec9-4d44-991d-9583c95296ac", "force": true, "format": "json"}]: dispatch
Nov 23 10:08:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:08:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:08:56 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:08:57 np0005532585.localdomain podman[335857]: 2025-11-23 10:08:57.041261722 +0000 UTC m=+0.090424934 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 23 10:08:57 np0005532585.localdomain podman[335857]: 2025-11-23 10:08:57.050279625 +0000 UTC m=+0.099442877 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:08:57 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:08:57 np0005532585.localdomain podman[335858]: 2025-11-23 10:08:57.139984167 +0000 UTC m=+0.185901172 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41)
Nov 23 10:08:57 np0005532585.localdomain podman[335858]: 2025-11-23 10:08:57.152671389 +0000 UTC m=+0.198588404 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 10:08:57 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:08:57 np0005532585.localdomain podman[335856]: 2025-11-23 10:08:57.243573537 +0000 UTC m=+0.292547984 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 23 10:08:57 np0005532585.localdomain podman[335856]: 2025-11-23 10:08:57.284388286 +0000 UTC m=+0.333362773 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 10:08:57 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:08:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "30bb7bc1-b3ad-4313-b9fb-0702c023b9cf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:08:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "30bb7bc1-b3ad-4313-b9fb-0702c023b9cf", "format": "json"}]: dispatch
Nov 23 10:08:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:08:58 np0005532585.localdomain ceph-mon[300199]: pgmap v580: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 117 KiB/s wr, 9 op/s
Nov 23 10:08:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:08:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:08:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:08:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:08:58 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:08:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:58.978 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:58.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:08:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:58.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:08:58 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:58.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:59.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:08:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:08:59.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:08:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:08:59 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:08:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:08:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:08:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:08:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:08:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:09:00 np0005532585.localdomain ceph-mon[300199]: pgmap v581: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 190 KiB/s wr, 16 op/s
Nov 23 10:09:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1447695815' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:09:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1447695815' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:09:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:09:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:09:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:01 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "30bb7bc1-b3ad-4313-b9fb-0702c023b9cf", "format": "json"}]: dispatch
Nov 23 10:09:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "30bb7bc1-b3ad-4313-b9fb-0702c023b9cf", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:02 np0005532585.localdomain ceph-mon[300199]: pgmap v582: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 128 KiB/s wr, 10 op/s
Nov 23 10:09:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:09:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:09:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:09:03 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:09:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:03 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:04 np0005532585.localdomain podman[335919]: 2025-11-23 10:09:04.007833627 +0000 UTC m=+0.067820783 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:09:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:04.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:04.016 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:04.016 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:04.016 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:04.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:04.051 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:04 np0005532585.localdomain podman[335919]: 2025-11-23 10:09:04.078504777 +0000 UTC m=+0.138491883 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 10:09:04 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:09:04 np0005532585.localdomain systemd[1]: tmp-crun.swIPAv.mount: Deactivated successfully.
Nov 23 10:09:04 np0005532585.localdomain podman[335920]: 2025-11-23 10:09:04.116599944 +0000 UTC m=+0.173250720 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:09:04 np0005532585.localdomain podman[335920]: 2025-11-23 10:09:04.128335257 +0000 UTC m=+0.184985993 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:09:04 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:09:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d74ea3e8-4393-4bff-9c03-d13b44062e75", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: pgmap v583: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 128 KiB/s wr, 10 op/s
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d74ea3e8-4393-4bff-9c03-d13b44062e75", "format": "json"}]: dispatch
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice", "format": "json"}]: dispatch
Nov 23 10:09:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:05 np0005532585.localdomain sshd[335962]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:09:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "dce3a629-c5fc-45c8-88f2-5a0994d73c89", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "dce3a629-c5fc-45c8-88f2-5a0994d73c89", "format": "json"}]: dispatch
Nov 23 10:09:06 np0005532585.localdomain sshd[335962]: Invalid user richard from 175.126.166.172 port 37556
Nov 23 10:09:06 np0005532585.localdomain sshd[335962]: Received disconnect from 175.126.166.172 port 37556:11: Bye Bye [preauth]
Nov 23 10:09:06 np0005532585.localdomain sshd[335962]: Disconnected from invalid user richard 175.126.166.172 port 37556 [preauth]
Nov 23 10:09:07 np0005532585.localdomain ceph-mon[300199]: pgmap v584: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 198 KiB/s wr, 17 op/s
Nov 23 10:09:07 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:07 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:09:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:08 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:09.051 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:09.053 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:09.054 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:09.054 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:09.094 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:09.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:09:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:09:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:09:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:09:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:09:09.307 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:09:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d74ea3e8-4393-4bff-9c03-d13b44062e75", "format": "json"}]: dispatch
Nov 23 10:09:09 np0005532585.localdomain ceph-mon[300199]: pgmap v585: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 142 KiB/s wr, 12 op/s
Nov 23 10:09:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d74ea3e8-4393-4bff-9c03-d13b44062e75", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dce3a629-c5fc-45c8-88f2-5a0994d73c89", "format": "json"}]: dispatch
Nov 23 10:09:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "dce3a629-c5fc-45c8-88f2-5a0994d73c89", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:09:10 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.810 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.813 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7777f726-5b83-4975-a13b-d0701770fe49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.813354', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a3f7b8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '6bdf39f3e705e56e3131f71ac15541bc29580bbd5536506e95120fc04071f1e4'}]}, 'timestamp': '2025-11-23 10:09:10.819993', '_unique_id': 'cdb8060dc1124f03809c862e76acebd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.824 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4b5d7f6-cda6-42c2-aa54-34fbe2436d95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.824426', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a4bfc2-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': 'de46a22337f6e2ee85d7b4f55d9efbea32efd98ea289bec831b100d8a8491970'}]}, 'timestamp': '2025-11-23 10:09:10.825101', '_unique_id': 'a7e547c913ff4676a5eed98dc42291dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.827 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.828 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7e996a5-897d-42b0-bb12-56e23cc23b6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.828011', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a5491a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '61e2661dfe0a164d5562f1fdff7194074ac26c7968524846ae5df14b172958a5'}]}, 'timestamp': '2025-11-23 10:09:10.828522', '_unique_id': 'bd1e859dd82d4d03aec2d8277dcbcdc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.830 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.844 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '507c2d21-ca30-4654-a611-98267928dfd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.830919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74a7dbbc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '45b7173f89095abe1c469b89d8801933bfc301096a56c8bb80c4e43b8a1fdaec'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.830919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74a7f0b6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': 'c6144a47c7d69f825d31a1c8505cc245c013d821fad7f490d9dec7ad908f0c3b'}]}, 'timestamp': '2025-11-23 10:09:10.845876', '_unique_id': 'f87e30ebdfe24629b90f4b501cc2bb57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.848 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '320f096e-dd22-4467-9ced-2a073c55432f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.848367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74a86406-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '899616b4e36d2686f8047ca76f516d5cee0928f140b2bfe5a52e77a27a89b832'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.848367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74a8761c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '47fc14b2fd8f167602892495a7459577e0311cb72c63e36260d4874784127a53'}]}, 'timestamp': '2025-11-23 10:09:10.849282', '_unique_id': '726bb5a44bce462397b88ffa707b92f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.851 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6174530-1ac4-4d3e-a7ad-d60d0335ee47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.851928', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a8ef02-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '221bacf79790c5cb81697acafbd618900a9fc8f775beea10449de1cec69ff410'}]}, 'timestamp': '2025-11-23 10:09:10.852410', '_unique_id': '386c6a9a530a42acb3b7482cca055262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.854 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.854 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75015848-d116-4dd4-8605-206fcc60be53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.854684', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a95be0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '980b6bd66e80b76f05761846a05b379597e6f05b37e7aec5d7359a0c259ab405'}]}, 'timestamp': '2025-11-23 10:09:10.855199', '_unique_id': '962b6cb628fd4b9dba1ccb487a5b317c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c445cc9f-cfc7-4d1e-a3c2-07e14eb245ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:09:10.857450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '74ac44a4-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.051080084, 'message_signature': 'b65eac0ff552f12d7b9aaf84aea93dbe3b3a2fa6b895ae15b4d6992aa340ac47'}]}, 'timestamp': '2025-11-23 10:09:10.874338', '_unique_id': 'acb7f7b625464afc98bd3c843971345b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging 
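The innermost frames above end in `self.sock.connect(sa)` raising `ConnectionRefusedError: [Errno 111]`: nothing is accepting TCP connections on the RabbitMQ broker port. This can be reproduced and probed with nothing but the standard library; the `probe` helper below is an illustrative diagnostic, not part of amqp or kombu:

```python
import errno
import socket

def probe(host: str, port: int, timeout: float = 1.0):
    """Return None if a TCP connect succeeds, else the socket errno.

    Mirrors what amqp's transport does internally: a plain sock.connect().
    On Linux, a port with no listener answers with RST and Python raises
    ConnectionRefusedError, whose errno is ECONNREFUSED (111).
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        try:
            sock.connect((host, port))
        except OSError as exc:
            return exc.errno
    return None

if __name__ == "__main__":
    # Find a port that is almost certainly closed: bind an ephemeral port,
    # record its number, then close the listener before probing it.
    with socket.socket() as tmp:
        tmp.bind(("127.0.0.1", 0))
        closed_port = tmp.getsockname()[1]
    assert probe("127.0.0.1", closed_port) == errno.ECONNREFUSED  # 111
```

Running the same probe against the broker host and port from the service's `transport_url` distinguishes "broker process down or not listening" (errno 111) from "unreachable host" (timeout or `EHOSTUNREACH`).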
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6c10830-feab-4a4b-80ff-a152d25b84b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.877052', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74acc438-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '1b568509196e656a030f32949643259f7a17ee3de3848c3aab5fc3df12de2319'}]}, 'timestamp': '2025-11-23 10:09:10.877581', '_unique_id': '2148469aedc6490db3cc51d04214e7b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging 
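Each failure prints two tracebacks separated by "The above exception was the direct cause of the following exception:". That banner comes from kombu's `_reraise_as_library_errors`, which wraps the low-level socket error with `raise ConnectionError(str(exc)) from exc`; the `from exc` attaches the original exception as `__cause__`. A minimal sketch of that chaining, with a stand-in class rather than kombu's real `OperationalError`:

```python
class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError (illustrative only)."""

def connect_or_wrap() -> None:
    """Simulate kombu wrapping a transport-level error into a library error."""
    try:
        # What amqp's transport raises when the broker port is closed.
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as exc:
        # Explicit chaining: the original exception survives as __cause__,
        # which is what makes Python print both tracebacks in the log.
        raise OperationalError(str(exc)) from exc

if __name__ == "__main__":
    try:
        connect_or_wrap()
    except OperationalError as err:
        assert isinstance(err.__cause__, ConnectionRefusedError)
        assert err.__cause__.errno == 111
```

This is why the log shows `kombu.exceptions.OperationalError: [Errno 111] Connection refused` as the outer error even though the root cause is the socket-level `ConnectionRefusedError`.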
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.880 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dda24e44-2c50-40b6-b245-519249754180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.880303', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74ad437c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '429d6a5df70f19f001322f1200fa1beb55787fbe1dba36c3c143f0851889325d'}]}, 'timestamp': '2025-11-23 10:09:10.880793', '_unique_id': '56baef8356c14bea89c6b6744b66d75f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93c3a3f5-5148-4598-a1b2-b87d71aa7eb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.883392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74adbbb8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': 'e11df3f07916355c95af7629e6b9a8bfbf5938f5f169d86716c67998926e4382'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.883392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74adce32-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '03ba519e9f7521726f21315546d2646a8090e738707a65a2259905f7fdf6dc9c'}]}, 'timestamp': '2025-11-23 10:09:10.884306', '_unique_id': '82c151f722054962841189e0472b358f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.913 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '856a24cb-7a89-4e08-a5d9-1bcda583e8b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.886755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b22fcc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '16793775dfaae36bb33f5bccfe40b5b7d4c0660010c21bb1f2aacd1d45cea2f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.886755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b2482c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '8850c42f846f7ea84cdf45fe720fb2feb77b7e8710e9307238d4e2c82f5be8d3'}]}, 'timestamp': '2025-11-23 10:09:10.913684', '_unique_id': 'f475d5fe816546f39d71eed3286afa5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57ade1fa-7593-40ba-8e59-0e9840c829ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.917097', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b2e110-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '975c37f3df41b55a6989d5cef2c85e8336a8a7d546a9571d65ef05c6b4c50490'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.917097', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b2f79a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'd7f7fab5f55970dda02982caa22116d1304d73ebac1e8805c9c4f1b2bb448a08'}]}, 'timestamp': '2025-11-23 10:09:10.918201', '_unique_id': 'a687bbf5bca54438af003ea76d849525'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4bd14e3-101c-4363-b4e5-4df8f82bc4c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.920930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b37756-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'c034db4930dbd26fff653c9198feb4116ca7ff56530c0d88307ecd4bade5105b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.920930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b38d22-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'c9221814fad82b6a122fef89666e80fe48007adc578cfd1804ad730b352a90da'}]}, 'timestamp': '2025-11-23 10:09:10.922018', '_unique_id': 'c3f71d8dee7b48c3b7405835ce2c39fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1281d4d-0c9e-4163-b914-cb216c275362', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.924788', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b41396-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '3aeeb77ddb35ad73d4bc583aaffe063303899ce3641a0254d6d564c59bc82469'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.924788', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b42b7e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '7c93112dc8a47b0070c065d4f7ba32101d8cc706282716e4f610050cc32060a6'}]}, 'timestamp': '2025-11-23 10:09:10.926110', '_unique_id': '00ed2e286dc34cae8aee061e01ca553e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 19160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '901f8448-4523-4919-964a-9a10c8a3449f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19160000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:09:10.929129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '74b4b828-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.051080084, 'message_signature': 'd31991ddb85b33220d067d5b04520f0d861e2f0b38536c9e48f0a9e1257772a5'}]}, 'timestamp': '2025-11-23 10:09:10.929653', '_unique_id': 'a90b1e80c9dd4328a9aadfcf103ae2da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ad922ec-2a2f-40e1-b41f-9078ea7b8f8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.932222', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74b531b8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': 'cb175cd43986f18a0bd6f2fadadb2f6c9848ce28937d6709472dd0960e732187'}]}, 'timestamp': '2025-11-23 10:09:10.932789', '_unique_id': 'cfe541e6826c4ef1bfbf22d712c07e1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b3f7c93-da73-4033-9592-0cc9d382c56b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.935292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b5a74c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'e44d4925c3793e27aa56fc1d60a632a5ed27f54803d4e0f5fb36d75df117920f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.935292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b5bbe2-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '89d1168bf4bd01c8fce6fbc922b4248a2f105e5ad0f28ea8e0780ae94d2d99a0'}]}, 'timestamp': '2025-11-23 10:09:10.936287', '_unique_id': '6e700391597242718a5ac47d2d9acffe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.939 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a74ef4f-a9f2-4d48-9c62-070bc54ab63f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.939122', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74b63d06-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '930ccd4bc1201414ec7cb9aaf860918fa7771719a60d1ff494959738a266e79b'}]}, 'timestamp': '2025-11-23 10:09:10.939618', '_unique_id': 'e96f68bb3e2142d2b91a96b5f88f92ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '637b73e3-e30a-4d35-8556-f65697835a55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.942380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b6bc2c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'dce375e97a805a841e3b1d086f3381e10a0480d39f27030ea95ba46ed0531977'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.942380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b6d0e0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'eed851b50249efdb75074f112dce89ea9a6f03c268f6aa4cab493db68ad32df4'}]}, 'timestamp': '2025-11-23 10:09:10.943379', '_unique_id': '2d5d04b18d4343edbf341525f56ad6d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.946 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80128c08-c122-4e5d-a878-d53208dd6fd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.946072', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74b74caa-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '2d99af144745d7427ce0c80fdc622256b32564875b1e9736c8626df796afcbc5'}]}, 'timestamp': '2025-11-23 10:09:10.946652', '_unique_id': 'a1a4bc49764a4ad282cb35b2a21e899e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:09:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: pgmap v586: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 210 KiB/s wr, 18 op/s
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:09:11 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:09:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:09:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:09:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:09:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:09:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:09:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18789 "" "Go-http-client/1.1"
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "13173eeb-f983-4f00-b7e6-2bd2cef8c0e1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "13173eeb-f983-4f00-b7e6-2bd2cef8c0e1", "format": "json"}]: dispatch
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: pgmap v587: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 137 KiB/s wr, 11 op/s
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6f0e6dc5-2184-4beb-90f6-26a08efe35ab", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:13 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6f0e6dc5-2184-4beb-90f6-26a08efe35ab", "format": "json"}]: dispatch
Nov 23 10:09:13 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:13 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:14.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:14.097 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:14.098 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:14.098 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:14.130 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:14.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: pgmap v588: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 137 KiB/s wr, 11 op/s
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "13173eeb-f983-4f00-b7e6-2bd2cef8c0e1", "format": "json"}]: dispatch
Nov 23 10:09:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "13173eeb-f983-4f00-b7e6-2bd2cef8c0e1", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:09:15 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:09:16 np0005532585.localdomain systemd[1]: tmp-crun.piUVU2.mount: Deactivated successfully.
Nov 23 10:09:16 np0005532585.localdomain podman[335964]: 2025-11-23 10:09:16.037168858 +0000 UTC m=+0.091243159 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:09:16 np0005532585.localdomain podman[335964]: 2025-11-23 10:09:16.074495843 +0000 UTC m=+0.128570234 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:09:16 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:09:16 np0005532585.localdomain podman[335965]: 2025-11-23 10:09:16.086409542 +0000 UTC m=+0.136444802 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 10:09:16 np0005532585.localdomain podman[335965]: 2025-11-23 10:09:16.099310391 +0000 UTC m=+0.149345651 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 23 10:09:16 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:09:16 np0005532585.localdomain ceph-mon[300199]: pgmap v589: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 228 KiB/s wr, 20 op/s
Nov 23 10:09:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6f0e6dc5-2184-4beb-90f6-26a08efe35ab", "format": "json"}]: dispatch
Nov 23 10:09:16 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6f0e6dc5-2184-4beb-90f6-26a08efe35ab", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:09:16 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:09:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:09:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 23 10:09:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 23 10:09:17 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 23 10:09:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:19.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:19.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:19.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:19.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:19.176 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:19.177 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 23 10:09:19 np0005532585.localdomain ceph-mon[300199]: pgmap v590: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 158 KiB/s wr, 13 op/s
Nov 23 10:09:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "317c3401-a8ad-479f-9a0e-c3503597e474", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "317c3401-a8ad-479f-9a0e-c3503597e474", "format": "json"}]: dispatch
Nov 23 10:09:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:20 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:21 np0005532585.localdomain ceph-mon[300199]: pgmap v591: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 219 KiB/s wr, 19 op/s
Nov 23 10:09:21 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "tenant_id": "4d4586d45fe545c4bf78ea7720359b10", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:21 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:09:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:21 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:22 np0005532585.localdomain ceph-mon[300199]: pgmap v592: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 152 KiB/s wr, 14 op/s
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.231 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.232 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.651 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.652 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.652 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:09:23 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:23.653 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.160 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.178 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.182 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.183 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.186 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:24.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:09:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.242 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.243 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.244 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.244 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.245 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "auth_id": "tempest-cephx-id-1431575460", "format": "json"}]: dispatch
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: pgmap v593: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 152 KiB/s wr, 14 op/s
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "317c3401-a8ad-479f-9a0e-c3503597e474", "format": "json"}]: dispatch
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "317c3401-a8ad-479f-9a0e-c3503597e474", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2165963606' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:09:25 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2374868706' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.683 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.761 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:09:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:25.762 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.012 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.015 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11047MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.015 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.016 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.093 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.114 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.142 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.143 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.168 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.192 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.247 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:09:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2374868706' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2697729236' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:26 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:09:26 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/610067357' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.713 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.719 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.738 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.740 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:09:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:26.740 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: pgmap v594: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 229 KiB/s wr, 21 op/s
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "format": "json"}]: dispatch
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/610067357' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:27 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:27.735 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:27.736 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:27.736 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:27.737 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:09:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:09:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:09:27 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:09:28 np0005532585.localdomain systemd[1]: tmp-crun.SkKjzu.mount: Deactivated successfully.
Nov 23 10:09:28 np0005532585.localdomain podman[336050]: 2025-11-23 10:09:28.038393823 +0000 UTC m=+0.087703463 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 23 10:09:28 np0005532585.localdomain podman[336052]: 2025-11-23 10:09:28.062281674 +0000 UTC m=+0.103662754 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 23 10:09:28 np0005532585.localdomain podman[336052]: 2025-11-23 10:09:28.148301785 +0000 UTC m=+0.189682795 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-type=git, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1755695350)
Nov 23 10:09:28 np0005532585.localdomain podman[336050]: 2025-11-23 10:09:28.160212064 +0000 UTC m=+0.209521663 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 23 10:09:28 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:09:28 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:09:28 np0005532585.localdomain podman[336051]: 2025-11-23 10:09:28.1521176 +0000 UTC m=+0.197880973 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:09:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:28.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:28 np0005532585.localdomain podman[336051]: 2025-11-23 10:09:28.231933733 +0000 UTC m=+0.277697146 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:09:28 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "561ab685-42f2-4920-b91d-2420296f93f0", "format": "json"}]: dispatch
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "561ab685-42f2-4920-b91d-2420296f93f0", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "r", "format": "json"}]: dispatch
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: pgmap v595: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 139 KiB/s wr, 12 op/s
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1279803891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "68ff3efc-3173-4e54-ae87-43e935ed3d0f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "68ff3efc-3173-4e54-ae87-43e935ed3d0f", "format": "json"}]: dispatch
Nov 23 10:09:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:29.185 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:29.187 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:29.187 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:29.187 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:29.221 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:29.221 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2081433797' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:09:29 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "05a84a88-73a6-4e8e-8d02-fae54fb3ac71", "format": "json"}]: dispatch
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:09:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:09:30 np0005532585.localdomain ceph-mon[300199]: pgmap v596: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 196 KiB/s wr, 17 op/s
Nov 23 10:09:31 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:09:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 23 10:09:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 23 10:09:31 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 23 10:09:31 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 23 10:09:32 np0005532585.localdomain ceph-mon[300199]: pgmap v597: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 135 KiB/s wr, 11 op/s
Nov 23 10:09:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "36ea254b-85e6-42e7-9ff4-edd1917fb520", "format": "json"}]: dispatch
Nov 23 10:09:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "68ff3efc-3173-4e54-ae87-43e935ed3d0f", "format": "json"}]: dispatch
Nov 23 10:09:32 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "68ff3efc-3173-4e54-ae87-43e935ed3d0f", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:34.222 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:34.224 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:34.225 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:34.225 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:34.262 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:34.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 23 10:09:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:09:34 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:09:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:09:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:09:35 np0005532585.localdomain podman[336112]: 2025-11-23 10:09:35.025121034 +0000 UTC m=+0.077028831 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:09:35 np0005532585.localdomain podman[336112]: 2025-11-23 10:09:35.036330512 +0000 UTC m=+0.088238279 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:09:35 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:09:35 np0005532585.localdomain podman[336111]: 2025-11-23 10:09:35.086161133 +0000 UTC m=+0.135797031 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:09:35 np0005532585.localdomain podman[336111]: 2025-11-23 10:09:35.097456523 +0000 UTC m=+0.147092671 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:09:35 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:09:35 np0005532585.localdomain ceph-mon[300199]: pgmap v598: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 135 KiB/s wr, 11 op/s
Nov 23 10:09:35 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:37.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:09:37 np0005532585.localdomain ceph-mon[300199]: pgmap v599: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 188 KiB/s wr, 15 op/s
Nov 23 10:09:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "36ea254b-85e6-42e7-9ff4-edd1917fb520_39664d2d-8ed1-4c9f-9b99-02052c6cf2a5", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "36ea254b-85e6-42e7-9ff4-edd1917fb520", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "185a1676-923a-42e3-bf21-4993a35dc3a8", "format": "json"}]: dispatch
Nov 23 10:09:38 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "185a1676-923a-42e3-bf21-4993a35dc3a8", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:38 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e258 e258: 6 total, 6 up, 6 in
Nov 23 10:09:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:39.264 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:39.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:39.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:39.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:39.298 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:39.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:39 np0005532585.localdomain ceph-mon[300199]: pgmap v600: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 111 KiB/s wr, 9 op/s
Nov 23 10:09:39 np0005532585.localdomain ceph-mon[300199]: osdmap e258: 6 total, 6 up, 6 in
Nov 23 10:09:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6734401e-5573-4709-85e4-c69140f6c86e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6734401e-5573-4709-85e4-c69140f6c86e", "format": "json"}]: dispatch
Nov 23 10:09:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:41 np0005532585.localdomain ceph-mon[300199]: pgmap v602: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 613 B/s rd, 112 KiB/s wr, 9 op/s
Nov 23 10:09:41 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "d5e337f3-9c6d-416c-9ea4-ca3cff9c20ec", "format": "json"}]: dispatch
Nov 23 10:09:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:09:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:09:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:09:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:09:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:09:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1"
Nov 23 10:09:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 23 10:09:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6,allow rw path=/volumes/_nogroup/6734401e-5573-4709-85e4-c69140f6c86e/7aacd64e-4c3a-45c6-ac2d-a36c3c0cca66", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4,allow rw pool=manila_data namespace=fsvolumens_6734401e-5573-4709-85e4-c69140f6c86e"]} : dispatch
Nov 23 10:09:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e259 e259: 6 total, 6 up, 6 in
Nov 23 10:09:43 np0005532585.localdomain ceph-mon[300199]: pgmap v603: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 613 B/s rd, 112 KiB/s wr, 9 op/s
Nov 23 10:09:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6734401e-5573-4709-85e4-c69140f6c86e", "auth_id": "bob", "tenant_id": "836ae7fa48674096a03624b407ebbbc9", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:09:43 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6,allow rw path=/volumes/_nogroup/6734401e-5573-4709-85e4-c69140f6c86e/7aacd64e-4c3a-45c6-ac2d-a36c3c0cca66", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4,allow rw pool=manila_data namespace=fsvolumens_6734401e-5573-4709-85e4-c69140f6c86e"]}]': finished
Nov 23 10:09:43 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 23 10:09:43 np0005532585.localdomain ceph-mon[300199]: osdmap e259: 6 total, 6 up, 6 in
Nov 23 10:09:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:44.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:44.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:44.302 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:44.302 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:44.339 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:44.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:45 np0005532585.localdomain ceph-mon[300199]: pgmap v605: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 61 KiB/s wr, 4 op/s
Nov 23 10:09:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "d5e337f3-9c6d-416c-9ea4-ca3cff9c20ec_3554bede-d6bc-42a0-880f-3aeb45030174", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "d5e337f3-9c6d-416c-9ea4-ca3cff9c20ec", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 23 10:09:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4"]} : dispatch
Nov 23 10:09:46 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4"]}]': finished
Nov 23 10:09:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:09:46 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:09:47 np0005532585.localdomain systemd[1]: tmp-crun.svnan1.mount: Deactivated successfully.
Nov 23 10:09:47 np0005532585.localdomain podman[336153]: 2025-11-23 10:09:47.087653618 +0000 UTC m=+0.093947261 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:09:47 np0005532585.localdomain podman[336154]: 2025-11-23 10:09:47.128447204 +0000 UTC m=+0.131400814 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:09:47 np0005532585.localdomain podman[336153]: 2025-11-23 10:09:47.146637484 +0000 UTC m=+0.152931117 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:09:47 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:09:47 np0005532585.localdomain podman[336154]: 2025-11-23 10:09:47.161880153 +0000 UTC m=+0.164833773 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 10:09:47 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:09:47 np0005532585.localdomain ceph-mon[300199]: pgmap v606: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 113 KiB/s wr, 9 op/s
Nov 23 10:09:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6734401e-5573-4709-85e4-c69140f6c86e", "auth_id": "bob", "format": "json"}]: dispatch
Nov 23 10:09:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6734401e-5573-4709-85e4-c69140f6c86e", "auth_id": "bob", "format": "json"}]: dispatch
Nov 23 10:09:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "44e59f1f-b497-47bb-85b6-28f51fe17e15", "format": "json"}]: dispatch
Nov 23 10:09:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e260 e260: 6 total, 6 up, 6 in
Nov 23 10:09:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:49.341 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:49.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:09:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:49.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:09:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:49.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:49 np0005532585.localdomain ceph-mon[300199]: pgmap v607: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 555 B/s rd, 99 KiB/s wr, 7 op/s
Nov 23 10:09:49 np0005532585.localdomain ceph-mon[300199]: osdmap e260: 6 total, 6 up, 6 in
Nov 23 10:09:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:49.386 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:49.387 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:09:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:49 np0005532585.localdomain sudo[336196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:09:49 np0005532585.localdomain sudo[336196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:09:49 np0005532585.localdomain sudo[336196]: pam_unix(sudo:session): session closed for user root
Nov 23 10:09:49 np0005532585.localdomain sudo[336214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:09:49 np0005532585.localdomain sudo[336214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:09:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "bob", "format": "json"}]: dispatch
Nov 23 10:09:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 23 10:09:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 23 10:09:50 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Nov 23 10:09:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "auth_id": "bob", "format": "json"}]: dispatch
Nov 23 10:09:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:50 np0005532585.localdomain sudo[336214]: pam_unix(sudo:session): session closed for user root
Nov 23 10:09:50 np0005532585.localdomain sudo[336263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:09:50 np0005532585.localdomain sudo[336263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:09:50 np0005532585.localdomain sudo[336263]: pam_unix(sudo:session): session closed for user root
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: pgmap v609: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 121 KiB/s wr, 9 op/s
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "60134a7e-4ba8-46c5-95a6-e7b9f66ce444", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "60134a7e-4ba8-46c5-95a6-e7b9f66ce444", "format": "json"}]: dispatch
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:09:51 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:09:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "44e59f1f-b497-47bb-85b6-28f51fe17e15_6955baeb-e48a-4415-962c-c531e9f0e365", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:52 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "44e59f1f-b497-47bb-85b6-28f51fe17e15", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e261 e261: 6 total, 6 up, 6 in
Nov 23 10:09:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:09:53.517 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:09:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:09:53.519 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:09:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:09:53.519 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:09:53 np0005532585.localdomain ceph-mon[300199]: pgmap v610: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 370 B/s rd, 117 KiB/s wr, 9 op/s
Nov 23 10:09:53 np0005532585.localdomain ceph-mon[300199]: osdmap e261: 6 total, 6 up, 6 in
Nov 23 10:09:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:53.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:54.388 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:54.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6734401e-5573-4709-85e4-c69140f6c86e", "format": "json"}]: dispatch
Nov 23 10:09:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6734401e-5573-4709-85e4-c69140f6c86e", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:09:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:55 np0005532585.localdomain ceph-mon[300199]: pgmap v612: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 68 KiB/s wr, 5 op/s
Nov 23 10:09:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "50ef3b94-fa58-4aa7-8f5f-5b4e06f993be", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "50ef3b94-fa58-4aa7-8f5f-5b4e06f993be", "format": "json"}]: dispatch
Nov 23 10:09:55 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "3d69f3d9-93bd-4c9d-9fd9-2da996580ba0", "format": "json"}]: dispatch
Nov 23 10:09:56 np0005532585.localdomain ceph-mon[300199]: pgmap v613: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 149 KiB/s wr, 11 op/s
Nov 23 10:09:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "format": "json"}]: dispatch
Nov 23 10:09:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "force": true, "format": "json"}]: dispatch
Nov 23 10:09:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "128bea3f-de1e-4023-85ec-3b82c2e536a6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:09:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "128bea3f-de1e-4023-85ec-3b82c2e536a6", "format": "json"}]: dispatch
Nov 23 10:09:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:09:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e262 e262: 6 total, 6 up, 6 in
Nov 23 10:09:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:09:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:09:58 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:09:59 np0005532585.localdomain podman[336282]: 2025-11-23 10:09:59.043974525 +0000 UTC m=+0.096435498 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:09:59 np0005532585.localdomain podman[336283]: 2025-11-23 10:09:59.087214866 +0000 UTC m=+0.135907254 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Nov 23 10:09:59 np0005532585.localdomain podman[336281]: 2025-11-23 10:09:59.129861008 +0000 UTC m=+0.183554630 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 10:09:59 np0005532585.localdomain podman[336282]: 2025-11-23 10:09:59.156193038 +0000 UTC m=+0.208654011 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:09:59 np0005532585.localdomain podman[336283]: 2025-11-23 10:09:59.180751914 +0000 UTC m=+0.229444292 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 23 10:09:59 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:09:59 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:09:59 np0005532585.localdomain podman[336281]: 2025-11-23 10:09:59.248145378 +0000 UTC m=+0.301838970 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:09:59 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:09:59 np0005532585.localdomain ceph-mon[300199]: pgmap v614: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 567 B/s rd, 133 KiB/s wr, 10 op/s
Nov 23 10:09:59 np0005532585.localdomain ceph-mon[300199]: osdmap e262: 6 total, 6 up, 6 in
Nov 23 10:09:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:59.390 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:09:59.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:09:59 np0005532585.localdomain sshd[336342]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:09:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:09:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:09:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:09:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:09:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:09:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:09:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:10:00 np0005532585.localdomain sshd[336342]: Invalid user admin from 207.154.194.2 port 53260
Nov 23 10:10:00 np0005532585.localdomain sshd[336342]: Received disconnect from 207.154.194.2 port 53260:11: Bye Bye [preauth]
Nov 23 10:10:00 np0005532585.localdomain sshd[336342]: Disconnected from invalid user admin 207.154.194.2 port 53260 [preauth]
Nov 23 10:10:00 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "3d69f3d9-93bd-4c9d-9fd9-2da996580ba0_3667dc95-9604-47c1-b81e-3a48d1dffa79", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:00 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "3d69f3d9-93bd-4c9d-9fd9-2da996580ba0", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:00 np0005532585.localdomain ceph-mon[300199]: overall HEALTH_OK
Nov 23 10:10:01 np0005532585.localdomain ceph-mon[300199]: pgmap v616: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 155 KiB/s wr, 11 op/s
Nov 23 10:10:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0f05cf61-7702-4ec2-9212-d2c2293260bd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0f05cf61-7702-4ec2-9212-d2c2293260bd", "format": "json"}]: dispatch
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.440354) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602440378, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2770, "num_deletes": 256, "total_data_size": 3083694, "memory_usage": 3133792, "flush_reason": "Manual Compaction"}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602449421, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1997751, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32968, "largest_seqno": 35733, "table_properties": {"data_size": 1987130, "index_size": 6490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27052, "raw_average_key_size": 22, "raw_value_size": 1963988, "raw_average_value_size": 1608, "num_data_blocks": 281, "num_entries": 1221, "num_filter_entries": 1221, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892480, "oldest_key_time": 1763892480, "file_creation_time": 1763892602, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 9120 microseconds, and 3763 cpu microseconds.
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.449469) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1997751 bytes OK
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.449488) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.451538) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.451552) EVENT_LOG_v1 {"time_micros": 1763892602451548, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.451568) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3070480, prev total WAL file size 3070480, number of live WAL files 2.
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.452093) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1950KB)], [57(17MB)]
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602452124, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 20452859, "oldest_snapshot_seqno": -1}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 14366 keys, 18624489 bytes, temperature: kUnknown
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602518234, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 18624489, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18539868, "index_size": 47592, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 385495, "raw_average_key_size": 26, "raw_value_size": 18293034, "raw_average_value_size": 1273, "num_data_blocks": 1778, "num_entries": 14366, "num_filter_entries": 14366, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892602, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.518619) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 18624489 bytes
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.521276) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 308.7 rd, 281.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 17.6 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(19.6) write-amplify(9.3) OK, records in: 14899, records dropped: 533 output_compression: NoCompression
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.521303) EVENT_LOG_v1 {"time_micros": 1763892602521291, "job": 34, "event": "compaction_finished", "compaction_time_micros": 66258, "compaction_time_cpu_micros": 33290, "output_level": 6, "num_output_files": 1, "total_output_size": 18624489, "num_input_records": 14899, "num_output_records": 14366, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602521707, "job": 34, "event": "table_file_deletion", "file_number": 59}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602524255, "job": 34, "event": "table_file_deletion", "file_number": 57}
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.452052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524378) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524381) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:10:02 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524384) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:10:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e263 e263: 6 total, 6 up, 6 in
Nov 23 10:10:03 np0005532585.localdomain ceph-mon[300199]: pgmap v617: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 865 B/s rd, 150 KiB/s wr, 10 op/s
Nov 23 10:10:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:04.421 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:04 np0005532585.localdomain ceph-mon[300199]: osdmap e263: 6 total, 6 up, 6 in
Nov 23 10:10:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "ff791c0e-f88c-4daa-a7d3-3098e61f94ca", "format": "json"}]: dispatch
Nov 23 10:10:05 np0005532585.localdomain ceph-mon[300199]: pgmap v619: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 74 KiB/s wr, 5 op/s
Nov 23 10:10:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:10:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:10:06 np0005532585.localdomain systemd[1]: tmp-crun.6mUJxt.mount: Deactivated successfully.
Nov 23 10:10:06 np0005532585.localdomain podman[336344]: 2025-11-23 10:10:06.04225159 +0000 UTC m=+0.098963406 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 10:10:06 np0005532585.localdomain podman[336345]: 2025-11-23 10:10:06.08352259 +0000 UTC m=+0.137035108 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:10:06 np0005532585.localdomain podman[336344]: 2025-11-23 10:10:06.103708361 +0000 UTC m=+0.160420187 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 10:10:06 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:10:06 np0005532585.localdomain podman[336345]: 2025-11-23 10:10:06.119161737 +0000 UTC m=+0.172674245 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:10:06 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:10:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0f05cf61-7702-4ec2-9212-d2c2293260bd", "format": "json"}]: dispatch
Nov 23 10:10:06 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0f05cf61-7702-4ec2-9212-d2c2293260bd", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:07 np0005532585.localdomain ceph-mon[300199]: pgmap v620: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 112 KiB/s wr, 7 op/s
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e264 e264: 6 total, 6 up, 6 in
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: pgmap v621: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 662 B/s rd, 97 KiB/s wr, 6 op/s
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "format": "json"}]: dispatch
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: osdmap e264: 6 total, 6 up, 6 in
Nov 23 10:10:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "128bea3f-de1e-4023-85ec-3b82c2e536a6", "format": "json"}]: dispatch
Nov 23 10:10:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:10:09.307 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:10:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:10:09.307 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:10:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:10:09.308 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:10:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:09.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:09.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:09.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:09.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:09.457 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:09.458 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "128bea3f-de1e-4023-85ec-3b82c2e536a6", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "ff791c0e-f88c-4daa-a7d3-3098e61f94ca_f78b03f5-7a9b-4515-9034-ad766e103e0a", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:09 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "ff791c0e-f88c-4daa-a7d3-3098e61f94ca", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:10 np0005532585.localdomain ceph-mon[300199]: pgmap v623: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 107 KiB/s wr, 6 op/s
Nov 23 10:10:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:10:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:10:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:10:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:10:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:10:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1"
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: pgmap v624: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 500 B/s rd, 105 KiB/s wr, 6 op/s
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "format": "json"}]: dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "50ef3b94-fa58-4aa7-8f5f-5b4e06f993be", "format": "json"}]: dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "50ef3b94-fa58-4aa7-8f5f-5b4e06f993be", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "format": "json"}]: dispatch
Nov 23 10:10:12 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:13 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e265 e265: 6 total, 6 up, 6 in
Nov 23 10:10:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:14.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:14.491 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:14.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:14.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:14.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:14.493 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:14 np0005532585.localdomain ceph-mon[300199]: pgmap v625: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 86 KiB/s wr, 5 op/s
Nov 23 10:10:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "05a84a88-73a6-4e8e-8d02-fae54fb3ac71_e44acd9b-b075-41e3-8a7b-1ec7c3a6ac20", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:14 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "snap_name": "05a84a88-73a6-4e8e-8d02-fae54fb3ac71", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:14 np0005532585.localdomain ceph-mon[300199]: osdmap e265: 6 total, 6 up, 6 in
Nov 23 10:10:15 np0005532585.localdomain sshd[336390]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:10:15 np0005532585.localdomain sshd[336390]: Received disconnect from 107.172.15.139 port 56404:11: Bye Bye [preauth]
Nov 23 10:10:15 np0005532585.localdomain sshd[336390]: Disconnected from authenticating user root 107.172.15.139 port 56404 [preauth]
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "60134a7e-4ba8-46c5-95a6-e7b9f66ce444", "format": "json"}]: dispatch
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "60134a7e-4ba8-46c5-95a6-e7b9f66ce444", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "auth_id": "Joe", "tenant_id": "00b5d8dd3ef84698af3b5d7d8243bc4a", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/346654a1-8043-457a-94b6-3b076c21a1d5/cb129bf7-168e-4c36-9f32-82b35d885736", "osd", "allow rw pool=manila_data namespace=fsvolumens_346654a1-8043-457a-94b6-3b076c21a1d5", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/346654a1-8043-457a-94b6-3b076c21a1d5/cb129bf7-168e-4c36-9f32-82b35d885736", "osd", "allow rw pool=manila_data namespace=fsvolumens_346654a1-8043-457a-94b6-3b076c21a1d5", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:10:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "snap_name": "4827a0db-6fdf-413e-96d9-4e5308156408", "format": "json"}]: dispatch
Nov 23 10:10:17 np0005532585.localdomain ceph-mon[300199]: pgmap v627: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 150 KiB/s wr, 9 op/s
Nov 23 10:10:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:10:17 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:10:18 np0005532585.localdomain systemd[1]: tmp-crun.kYhvw4.mount: Deactivated successfully.
Nov 23 10:10:18 np0005532585.localdomain podman[336392]: 2025-11-23 10:10:18.049810712 +0000 UTC m=+0.100490223 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:10:18 np0005532585.localdomain systemd[1]: tmp-crun.mT7iwP.mount: Deactivated successfully.
Nov 23 10:10:18 np0005532585.localdomain podman[336393]: 2025-11-23 10:10:18.092005701 +0000 UTC m=+0.136314696 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 23 10:10:18 np0005532585.localdomain podman[336393]: 2025-11-23 10:10:18.100077579 +0000 UTC m=+0.144386554 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:10:18 np0005532585.localdomain podman[336392]: 2025-11-23 10:10:18.113993638 +0000 UTC m=+0.164673219 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:10:18 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:10:18 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:10:18 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e266 e266: 6 total, 6 up, 6 in
Nov 23 10:10:18 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "format": "json"}]: dispatch
Nov 23 10:10:18 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "57912552-9fe8-4f36-8630-fac59a7e7ddf", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:19.494 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:19.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:19.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:19.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:19.520 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:19.520 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:19 np0005532585.localdomain ceph-mon[300199]: pgmap v628: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 662 B/s rd, 130 KiB/s wr, 8 op/s
Nov 23 10:10:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:19 np0005532585.localdomain ceph-mon[300199]: osdmap e266: 6 total, 6 up, 6 in
Nov 23 10:10:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "format": "json"}]: dispatch
Nov 23 10:10:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "snap_name": "4827a0db-6fdf-413e-96d9-4e5308156408", "target_sub_name": "7d4a8f6e-3803-4134-a981-5fbbe44d23bd", "format": "json"}]: dispatch
Nov 23 10:10:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7d4a8f6e-3803-4134-a981-5fbbe44d23bd", "format": "json"}]: dispatch
Nov 23 10:10:21 np0005532585.localdomain ceph-mon[300199]: pgmap v630: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 195 KiB/s wr, 13 op/s
Nov 23 10:10:21 np0005532585.localdomain ceph-mon[300199]: mgrmap e51: np0005532584.naxwxy(active, since 14m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:10:22 np0005532585.localdomain ceph-mon[300199]: pgmap v631: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 195 KiB/s wr, 13 op/s
Nov 23 10:10:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "auth_id": "Joe", "tenant_id": "7380a21b16c54c28b8907c4cc3ce9099", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:10:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 e267: 6 total, 6 up, 6 in
Nov 23 10:10:23 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 23 10:10:23 np0005532585.localdomain ceph-mon[300199]: osdmap e267: 6 total, 6 up, 6 in
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:10:24 np0005532585.localdomain ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.106:6810/2037590349
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.403 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.403 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.404 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.404 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.521 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.523 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.523 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.523 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.571 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.572 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.844 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.866 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.866 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.867 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:24.868 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:25 np0005532585.localdomain ceph-mon[300199]: pgmap v633: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 114 KiB/s wr, 8 op/s
Nov 23 10:10:25 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-272049507", "format": "json"} : dispatch
Nov 23 10:10:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:26.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:26.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:26.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:26.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:10:26 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "auth_id": "tempest-cephx-id-272049507", "tenant_id": "7380a21b16c54c28b8907c4cc3ce9099", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:10:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-272049507", "caps": ["mds", "allow rw path=/volumes/_nogroup/b9ef5055-ce0d-4f29-b449-58f39c1f00af/ad7774b9-dfc0-45be-aa87-07b19521a8eb", "osd", "allow rw pool=manila_data namespace=fsvolumens_b9ef5055-ce0d-4f29-b449-58f39c1f00af", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:10:26 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-272049507", "caps": ["mds", "allow rw path=/volumes/_nogroup/b9ef5055-ce0d-4f29-b449-58f39c1f00af/ad7774b9-dfc0-45be-aa87-07b19521a8eb", "osd", "allow rw pool=manila_data namespace=fsvolumens_b9ef5055-ce0d-4f29-b449-58f39c1f00af", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:10:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/953996953' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:26 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1750128800' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.253 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.254 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.254 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.255 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.255 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:10:27 np0005532585.localdomain ceph-mon[300199]: pgmap v634: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 165 KiB/s wr, 11 op/s
Nov 23 10:10:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:10:27 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1336387364' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.711 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.796 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:10:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:27.797 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.027 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.028 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11035MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.029 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.029 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.118 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.118 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.118 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.184 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:10:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/466529091' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1336387364' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3979830438' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:10:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2277080277' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.661 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.669 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.703 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.706 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:10:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:28.706 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:10:29 np0005532585.localdomain ceph-mon[300199]: pgmap v635: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 145 KiB/s wr, 10 op/s
Nov 23 10:10:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2277080277' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:10:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:29.570 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:29.706 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:29.707 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:10:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:10:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:10:29 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:10:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:10:30 np0005532585.localdomain podman[336477]: 2025-11-23 10:10:30.043452276 +0000 UTC m=+0.095316214 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:10:30 np0005532585.localdomain podman[336479]: 2025-11-23 10:10:30.108971371 +0000 UTC m=+0.154161265 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 10:10:30 np0005532585.localdomain podman[336477]: 2025-11-23 10:10:30.12841508 +0000 UTC m=+0.180278988 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:10:30 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:10:30 np0005532585.localdomain podman[336478]: 2025-11-23 10:10:30.148130276 +0000 UTC m=+0.199576022 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 10:10:30 np0005532585.localdomain podman[336478]: 2025-11-23 10:10:30.182527056 +0000 UTC m=+0.233972812 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:10:30 np0005532585.localdomain podman[336479]: 2025-11-23 10:10:30.19436331 +0000 UTC m=+0.239553194 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 10:10:30 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:10:30 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:10:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 23 10:10:30 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 23 10:10:31 np0005532585.localdomain systemd[1]: tmp-crun.XLMcqY.mount: Deactivated successfully.
Nov 23 10:10:31 np0005532585.localdomain ceph-mon[300199]: pgmap v636: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 72 KiB/s wr, 5 op/s
Nov 23 10:10:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-272049507", "format": "json"} : dispatch
Nov 23 10:10:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-272049507"} : dispatch
Nov 23 10:10:32 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-272049507"}]': finished
Nov 23 10:10:32 np0005532585.localdomain sshd[336540]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:10:33 np0005532585.localdomain ceph-mon[300199]: pgmap v637: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 72 KiB/s wr, 5 op/s
Nov 23 10:10:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "auth_id": "tempest-cephx-id-272049507", "format": "json"}]: dispatch
Nov 23 10:10:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "auth_id": "tempest-cephx-id-272049507", "format": "json"}]: dispatch
Nov 23 10:10:34 np0005532585.localdomain sshd[336540]: Received disconnect from 175.126.166.172 port 56624:11: Bye Bye [preauth]
Nov 23 10:10:34 np0005532585.localdomain sshd[336540]: Disconnected from authenticating user root 175.126.166.172 port 56624 [preauth]
Nov 23 10:10:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:34.573 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:34.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:34.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:34.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:34.619 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:34.620 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:35 np0005532585.localdomain ceph-mon[300199]: pgmap v638: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 598 B/s rd, 70 KiB/s wr, 5 op/s
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: pgmap v639: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 98 KiB/s wr, 6 op/s
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e5534047-ca04-4edc-8696-f70777010bec", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e5534047-ca04-4edc-8696-f70777010bec", "format": "json"}]: dispatch
Nov 23 10:10:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:10:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:10:37 np0005532585.localdomain podman[336542]: 2025-11-23 10:10:37.047144928 +0000 UTC m=+0.090034481 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:10:37 np0005532585.localdomain podman[336542]: 2025-11-23 10:10:37.058177048 +0000 UTC m=+0.101066621 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 23 10:10:37 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:10:37 np0005532585.localdomain podman[336543]: 2025-11-23 10:10:37.14761652 +0000 UTC m=+0.188836911 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:10:37 np0005532585.localdomain podman[336543]: 2025-11-23 10:10:37.156366589 +0000 UTC m=+0.197586940 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:10:37 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:10:39 np0005532585.localdomain ceph-mon[300199]: pgmap v640: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 64 KiB/s wr, 4 op/s
Nov 23 10:10:39 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Nov 23 10:10:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:39.621 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:39.623 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:39.623 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:39.624 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:39.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:39.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "auth_id": "admin", "tenant_id": "00b5d8dd3ef84698af3b5d7d8243bc4a", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:10:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e5534047-ca04-4edc-8696-f70777010bec", "snap_name": "cde814c2-996b-4a4f-8643-46a807416ffa", "format": "json"}]: dispatch
Nov 23 10:10:41 np0005532585.localdomain ceph-mon[300199]: pgmap v641: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 113 KiB/s wr, 7 op/s
Nov 23 10:10:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:10:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:10:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:10:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:10:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:10:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18784 "" "Go-http-client/1.1"
Nov 23 10:10:42 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 23 10:10:43 np0005532585.localdomain ceph-mon[300199]: pgmap v642: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 87 KiB/s wr, 5 op/s
Nov 23 10:10:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "auth_id": "david", "tenant_id": "00b5d8dd3ef84698af3b5d7d8243bc4a", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:10:43 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/73463559-de38-4e65-91fe-e256e1993ef1/5697482a-5566-4614-b3e6-cbfdb66450d2", "osd", "allow rw pool=manila_data namespace=fsvolumens_73463559-de38-4e65-91fe-e256e1993ef1", "mon", "allow r"], "format": "json"} : dispatch
Nov 23 10:10:43 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/73463559-de38-4e65-91fe-e256e1993ef1/5697482a-5566-4614-b3e6-cbfdb66450d2", "osd", "allow rw pool=manila_data namespace=fsvolumens_73463559-de38-4e65-91fe-e256e1993ef1", "mon", "allow r"], "format": "json"}]': finished
Nov 23 10:10:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:44 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f868dbf5-74f9-4f9b-8554-4f9dd9c9915e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:44 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f868dbf5-74f9-4f9b-8554-4f9dd9c9915e", "format": "json"}]: dispatch
Nov 23 10:10:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:44.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:45 np0005532585.localdomain ceph-mon[300199]: pgmap v643: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 87 KiB/s wr, 5 op/s
Nov 23 10:10:46 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:47 np0005532585.localdomain ceph-mon[300199]: pgmap v644: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 126 KiB/s wr, 7 op/s
Nov 23 10:10:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "format": "json"}]: dispatch
Nov 23 10:10:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f868dbf5-74f9-4f9b-8554-4f9dd9c9915e", "format": "json"}]: dispatch
Nov 23 10:10:48 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f868dbf5-74f9-4f9b-8554-4f9dd9c9915e", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:10:48 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:10:49 np0005532585.localdomain podman[336586]: 2025-11-23 10:10:49.019430784 +0000 UTC m=+0.073701920 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 10:10:49 np0005532585.localdomain podman[336586]: 2025-11-23 10:10:49.032740853 +0000 UTC m=+0.087011929 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:10:49 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:10:49 np0005532585.localdomain podman[336585]: 2025-11-23 10:10:49.07551993 +0000 UTC m=+0.130234560 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:10:49 np0005532585.localdomain podman[336585]: 2025-11-23 10:10:49.087305032 +0000 UTC m=+0.142019662 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:10:49 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:10:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:49 np0005532585.localdomain ceph-mon[300199]: pgmap v645: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 88 KiB/s wr, 5 op/s
Nov 23 10:10:49 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 23 10:10:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:49.654 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:49.656 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:49.656 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:49.657 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:49.689 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:49 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:49.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "auth_id": "david", "tenant_id": "7380a21b16c54c28b8907c4cc3ce9099", "access_level": "rw", "format": "json"}]: dispatch
Nov 23 10:10:50 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:50 np0005532585.localdomain sudo[336626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:10:50 np0005532585.localdomain sudo[336626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:10:50 np0005532585.localdomain sudo[336626]: pam_unix(sudo:session): session closed for user root
Nov 23 10:10:51 np0005532585.localdomain sudo[336644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:10:51 np0005532585.localdomain sudo[336644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:10:51 np0005532585.localdomain ceph-mon[300199]: pgmap v646: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 123 KiB/s wr, 7 op/s
Nov 23 10:10:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "026d03c4-9bd7-4b23-aab1-5ab6c0e12032", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "026d03c4-9bd7-4b23-aab1-5ab6c0e12032", "format": "json"}]: dispatch
Nov 23 10:10:51 np0005532585.localdomain sudo[336644]: pam_unix(sudo:session): session closed for user root
Nov 23 10:10:52 np0005532585.localdomain sudo[336694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:10:52 np0005532585.localdomain sudo[336694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:10:52 np0005532585.localdomain sudo[336694]: pam_unix(sudo:session): session closed for user root
Nov 23 10:10:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:10:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:10:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:10:52 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:10:53 np0005532585.localdomain ceph-mon[300199]: pgmap v647: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 75 KiB/s wr, 3 op/s
Nov 23 10:10:53 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "auth_id": "david", "format": "json"}]: dispatch
Nov 23 10:10:53 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "auth_id": "david", "format": "json"}]: dispatch
Nov 23 10:10:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:54.691 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:54.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:10:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:54.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:10:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:54.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "026d03c4-9bd7-4b23-aab1-5ab6c0e12032", "format": "json"}]: dispatch
Nov 23 10:10:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "026d03c4-9bd7-4b23-aab1-5ab6c0e12032", "force": true, "format": "json"}]: dispatch
Nov 23 10:10:54 np0005532585.localdomain ceph-mon[300199]: pgmap v648: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 75 KiB/s wr, 3 op/s
Nov 23 10:10:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:10:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:54.727 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:54.728 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:10:56 np0005532585.localdomain ceph-mon[300199]: pgmap v649: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 113 KiB/s wr, 5 op/s
Nov 23 10:10:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "auth_id": "david", "format": "json"}]: dispatch
Nov 23 10:10:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 23 10:10:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 23 10:10:56 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Nov 23 10:10:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "auth_id": "david", "format": "json"}]: dispatch
Nov 23 10:10:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7d1e8c28-ca2d-4cc0-aed2-5196c93505b5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:10:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7d1e8c28-ca2d-4cc0-aed2-5196c93505b5", "format": "json"}]: dispatch
Nov 23 10:10:57 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:10:58 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Nov 23 10:10:59 np0005532585.localdomain ceph-mon[300199]: pgmap v650: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 73 KiB/s wr, 3 op/s
Nov 23 10:10:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:10:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:10:59.728 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:10:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:10:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:11:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4178085071' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:11:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4178085071' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:11:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:11:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:11:00 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:11:01 np0005532585.localdomain podman[336712]: 2025-11-23 10:11:01.04473959 +0000 UTC m=+0.092720344 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 23 10:11:01 np0005532585.localdomain podman[336713]: 2025-11-23 10:11:01.132805731 +0000 UTC m=+0.176167012 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:11:01 np0005532585.localdomain podman[336714]: 2025-11-23 10:11:01.144015476 +0000 UTC m=+0.185007314 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:11:01 np0005532585.localdomain podman[336714]: 2025-11-23 10:11:01.158611535 +0000 UTC m=+0.199603433 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm)
Nov 23 10:11:01 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:11:01 np0005532585.localdomain podman[336712]: 2025-11-23 10:11:01.215104683 +0000 UTC m=+0.263085497 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 10:11:01 np0005532585.localdomain podman[336713]: 2025-11-23 10:11:01.21629469 +0000 UTC m=+0.259655951 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:11:01 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:11:01 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:11:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "format": "json"}]: dispatch
Nov 23 10:11:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d73f5489-d8e5-493d-9400-04efa839bc7c", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:01 np0005532585.localdomain ceph-mon[300199]: pgmap v651: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 120 KiB/s wr, 6 op/s
Nov 23 10:11:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7d1e8c28-ca2d-4cc0-aed2-5196c93505b5", "format": "json"}]: dispatch
Nov 23 10:11:01 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7d1e8c28-ca2d-4cc0-aed2-5196c93505b5", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:03 np0005532585.localdomain ceph-mon[300199]: pgmap v652: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 85 KiB/s wr, 4 op/s
Nov 23 10:11:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "format": "json"}]: dispatch
Nov 23 10:11:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b9ef5055-ce0d-4f29-b449-58f39c1f00af", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f4b0408c-423f-4414-8b75-dc6199d36194", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:11:04 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:04 np0005532585.localdomain sshd[336771]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:11:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:04.732 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:04.734 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:04.735 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:11:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:04.735 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:04.772 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:04 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:04.773 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:05 np0005532585.localdomain sshd[336771]: Received disconnect from 80.94.93.119 port 28090:11:  [preauth]
Nov 23 10:11:05 np0005532585.localdomain sshd[336771]: Disconnected from authenticating user root 80.94.93.119 port 28090 [preauth]
Nov 23 10:11:05 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f4b0408c-423f-4414-8b75-dc6199d36194", "format": "json"}]: dispatch
Nov 23 10:11:05 np0005532585.localdomain ceph-mon[300199]: pgmap v653: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 85 KiB/s wr, 4 op/s
Nov 23 10:11:05 np0005532585.localdomain sshd[336773]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:11:06 np0005532585.localdomain sshd[336773]: Received disconnect from 207.154.194.2 port 41210:11: Bye Bye [preauth]
Nov 23 10:11:06 np0005532585.localdomain sshd[336773]: Disconnected from authenticating user root 207.154.194.2 port 41210 [preauth]
Nov 23 10:11:07 np0005532585.localdomain ceph-mon[300199]: pgmap v654: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 144 KiB/s wr, 7 op/s
Nov 23 10:11:07 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "format": "json"}]: dispatch
Nov 23 10:11:07 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "346654a1-8043-457a-94b6-3b076c21a1d5", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:11:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:11:08 np0005532585.localdomain podman[336775]: 2025-11-23 10:11:08.054680533 +0000 UTC m=+0.096293275 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 10:11:08 np0005532585.localdomain podman[336775]: 2025-11-23 10:11:08.091577188 +0000 UTC m=+0.133189940 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:11:08 np0005532585.localdomain podman[336776]: 2025-11-23 10:11:08.104738953 +0000 UTC m=+0.141468584 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:11:08 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:11:08 np0005532585.localdomain podman[336776]: 2025-11-23 10:11:08.113441101 +0000 UTC m=+0.150170772 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:11:08 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:11:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f4b0408c-423f-4414-8b75-dc6199d36194", "format": "json"}]: dispatch
Nov 23 10:11:08 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f4b0408c-423f-4414-8b75-dc6199d36194", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:09.309 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:11:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:11:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:09.311 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:11:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:09 np0005532585.localdomain ceph-mon[300199]: pgmap v655: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 106 KiB/s wr, 6 op/s
Nov 23 10:11:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:09.774 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:09.776 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:09.776 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:11:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:09.777 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:09.822 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:09.824 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "auth_id": "admin", "format": "json"}]: dispatch
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: pgmap v656: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 131 KiB/s wr, 8 op/s
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "73463559-de38-4e65-91fe-e256e1993ef1", "format": "json"}]: dispatch
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "73463559-de38-4e65-91fe-e256e1993ef1", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b1a1fa7e-58c0-4be2-8fb7-cb23ab1000cb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b1a1fa7e-58c0-4be2-8fb7-cb23ab1000cb", "format": "json"}]: dispatch
Nov 23 10:11:10 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.811 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.812 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c044dc1e-d845-4631-87f1-40c48b89fe62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.813126', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc2a73dc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': 'a2cae6bdfd2042323766e92e7143c13940e5c696f555028865243072804b9178'}]}, 'timestamp': '2025-11-23 10:11:10.819531', '_unique_id': 'ce3ce53f917d4e55bf88b16107e18d54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.822 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.822 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d8c91f8-1b4e-4acf-96b8-c56b460c26ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.822646', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc2b0928-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '192a7ac019ce8961884fda3054162537e2b84ae805e8f46d1452d0e8d1cb82ee'}]}, 'timestamp': '2025-11-23 10:11:10.823401', '_unique_id': 'd3e5e169446f4be9ae1934b34358663e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.826 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '128fe7a0-f909-4478-b807-da52d4bd867f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.826143', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc2b8c2c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '1854dbd02928e47a609ec330009c0af8a86e5968a7c7eb78c27903241b6c1a12'}]}, 'timestamp': '2025-11-23 10:11:10.826644', '_unique_id': '76dd963a44bf44adb0a79d672002c5e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.866 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8ba88a-0dc7-4eca-ad5f-2f2405227d35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.828953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc31c51a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '16a0111bc600817b72c5da6b35cb85b571ce422fe68c0c57381daffb2ec8c0c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.828953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc31d7da-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'f71a2746f57edd5a1ee91f19704d888aed8ac9f20cafba7b6bc4d428cfce2a9c'}]}, 'timestamp': '2025-11-23 10:11:10.867935', '_unique_id': 'a6718563f3a24397b47dafab14ce8757'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.870 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ec4350b-2531-4909-a1cb-0bdc6a41b23d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.870698', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc32596c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '0b39fa9c9bba480867327869de48275c63dbd1f07b26064b77a8011a23949db0'}]}, 'timestamp': '2025-11-23 10:11:10.871345', '_unique_id': '52b35bc914c646abb170efff0b13e0f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d8c5721-48d7-46cf-96e5-1f9efeee0e01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.873703', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc32cdfc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': 'd36c17a32c031d2ab617009bf305d4f9c38a317375843a24f5bddb73cb4b9233'}]}, 'timestamp': '2025-11-23 10:11:10.874197', '_unique_id': '02d58bc412974dbc8cb25f48409cf7e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f4d012c-3f72-423e-9e19-9c96df8a3305', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.876394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc333620-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'c84478cddd61d74e47f0a7844471f199915000dc58cdf0058531410c239bfe0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.876394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc334836-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '1e24acd1784bfe66d91eef2df82d0b6e71d1b2c24aacc62fba4b4579a40e36c3'}]}, 'timestamp': '2025-11-23 10:11:10.877295', '_unique_id': 'a614cd4b9dbe49559361a7932869526f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '587cb7a5-45fd-43b5-9ec5-228105158d77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.879517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc33b0e6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'e0c1666708b5824bedbea23259218f887aa53edb200e2e30866925598edff8e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.879517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc33c31a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'e89fc9353d299b03062bf8372eacf6655a7c733f60b1d711315d982bfd255bab'}]}, 'timestamp': '2025-11-23 10:11:10.880436', '_unique_id': '9d825c5a624948919a2a0b83d1b47783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7072318-00ce-47bc-87b9-22bd52c3b39b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.882635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3429d6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '857a9704127f035d837f9b8a6cc52ffd413f842b39fda9a0b25a86460a7abaae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.882635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc343bce-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'cdaa17316256b60f96b542ec58d5d8f285430779072da27e263ad627170e1de7'}]}, 'timestamp': '2025-11-23 10:11:10.883528', '_unique_id': '3c37f7593c684f3cbcc372f326206a81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c464c1f6-eb46-404d-8b02-bd412a265a88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.885766', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc34a596-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '37a0738185c9be8ada3bf82d83f14308d9770ff60cf1c2231d9f6434139d7ada'}]}, 'timestamp': '2025-11-23 10:11:10.886268', '_unique_id': 'b6ec2a1511bf454c8b3e4ba18706d224'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58e19235-d0df-41ae-95f0-dc23ee9affbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:11:10.888445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bc37f2be-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.084885156, 'message_signature': '45b186f6a671743d39d0a5c42d161a72cf5555ad4f99781ca762bb13fe713d6b'}]}, 'timestamp': '2025-11-23 10:11:10.907970', '_unique_id': '8876dc64b5384674ae523f27fc146896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36b760e8-4e37-4887-99a5-7782d78d46cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.910351', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc38649c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '10e9a687abddb9c870feb54242fab2b6408777b6225fd3060c90a3203b297578'}]}, 'timestamp': '2025-11-23 10:11:10.911407', '_unique_id': '834744a332ea4977b0c779d05e1971d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '917fe17b-83d5-490d-80af-31598967a2d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.913592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3aa040-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': '5c31d5599f8411af3a14b160757b829691f2de204d2abfed28edd71c6046e921'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.913592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3ab116-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': 'b6dc8d7a337fe99cd4a7f6745cac3085d457c3d6374d9de9f30e28a7c5d6c11a'}]}, 'timestamp': '2025-11-23 10:11:10.925868', '_unique_id': '7d4ffb2c310044ceae54b78bf241dd93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0971faa6-2a75-4467-b95b-ddb1216b7357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.928251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3b2038-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': '5e8d6067c23da7852dbeece06c2da138c57424e50191d35e3bf9350d7074fc45'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.928251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3b31e0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': 'ef831a7d1f384d216e1d65d287b3af494c80ab324ddbd58dd09e09d25be7a9dc'}]}, 'timestamp': '2025-11-23 10:11:10.929148', '_unique_id': '3a2725e2104a4abd8e7ab6b397e9c590'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a7aeb4e-ca21-43e2-bae9-7ef6950ed457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.931346', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc3b9874-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '45267dc2dc54b962024fd2642246b398dfb22388af7b8c92cb131ef37bd82af0'}]}, 'timestamp': '2025-11-23 10:11:10.931857', '_unique_id': '1895e26ce9bd4372a7227bb86a50e2b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6495e213-2cd4-4a91-8d85-32feec9dfcfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.934151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3c064c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'f1d6005e5afd51ecfe7a68170857f9f8208077339395dcf6261fd9fc6a0f0bc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.934151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3c1650-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '7ed75ce2e20bd4fe9aab38114ed780f3d8ba99762f209294433cdd56111938af'}]}, 'timestamp': '2025-11-23 10:11:10.935024', '_unique_id': 'e5bfa5ecc78c42359426ae6c5217bb44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bed0f083-b632-4bd2-a93a-af7dbd526518', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.937265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3c7fa0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': 'f28c6465d85e756f774119a99ea631324e980ec43d1dc6f317bbf64c2adeeff8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.937265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3c90c6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': '364706e242644de2fcdb044d38b32b5aea2f62da142cda3ea4525307343c8373'}]}, 'timestamp': '2025-11-23 10:11:10.938130', '_unique_id': '70aef3d6a6734082be90d3294d0334eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.940 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 19800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac3cb9b0-b08c-4739-a62a-bf0fb51dc619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19800000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:11:10.940299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bc3cf5de-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.084885156, 'message_signature': '5ea559a8cda5c1f6579c628f8d5b229e60fb0e13be435889d6e215092b8361cb'}]}, 'timestamp': '2025-11-23 10:11:10.940730', '_unique_id': '15ceb85f82c94335a80080b9d153614e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.943 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77b52242-8537-457b-b517-6ace28a10b2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.943111', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc3d63fc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '4ed8f5adb2133e2f2bf30bed60f9cc06f5d739d82a6410a98e2ab0110fcc3877'}]}, 'timestamp': '2025-11-23 10:11:10.943566', '_unique_id': '2563a0ecbb234574a2bfb2d05f185195'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.945 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.945 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3dad57f-423f-4d8b-852a-84356e72cb95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.945812', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc3dcf0e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '7eef0674d979818729c3be9b08fa5b9c60ad916e7ce81703ec4a95c888b1f8d1'}]}, 'timestamp': '2025-11-23 10:11:10.946304', '_unique_id': 'c63e9a162e454d57999ddba11a6e6d5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.948 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.948 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.948 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.949 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4157d870-e58a-48ff-b698-e6549f2d9a1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.948593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3e3a2a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'f2cc11f272a073b9adec58940189a6e67a3c0fc62c420a26b93b24e139a10228'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.948593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3e4b6e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '74846397c0385369cb5eec45262130f6e522ca7dbe935ae2265a9350e26cd7a4'}]}, 'timestamp': '2025-11-23 10:11:10.949462', '_unique_id': '6406d28ee7814055912a14f0a8279747'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:11:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:11:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:11:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:11:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:11:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:11:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:11:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18792 "" "Go-http-client/1.1"
Nov 23 10:11:12 np0005532585.localdomain sshd[336814]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:11:12 np0005532585.localdomain ceph-mon[300199]: pgmap v657: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 84 KiB/s wr, 5 op/s
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.520221) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673520281, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 1388, "num_deletes": 255, "total_data_size": 1910281, "memory_usage": 1935664, "flush_reason": "Manual Compaction"}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673535494, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 1008116, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35738, "largest_seqno": 37121, "table_properties": {"data_size": 1003172, "index_size": 2223, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14202, "raw_average_key_size": 21, "raw_value_size": 991949, "raw_average_value_size": 1528, "num_data_blocks": 98, "num_entries": 649, "num_filter_entries": 649, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892603, "oldest_key_time": 1763892603, "file_creation_time": 1763892673, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 15343 microseconds, and 5637 cpu microseconds.
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.535563) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 1008116 bytes OK
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.535591) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537183) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537205) EVENT_LOG_v1 {"time_micros": 1763892673537197, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537231) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 1903394, prev total WAL file size 1903718, number of live WAL files 2.
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.538025) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323538' seq:72057594037927935, type:22 .. '6D6772737461740034353131' seq:0, type:0; will stop at (end)
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(984KB)], [60(17MB)]
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673538091, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 19632605, "oldest_snapshot_seqno": -1}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 14508 keys, 17739657 bytes, temperature: kUnknown
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673605550, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 17739657, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17657700, "index_size": 44608, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36293, "raw_key_size": 389229, "raw_average_key_size": 26, "raw_value_size": 17412065, "raw_average_value_size": 1200, "num_data_blocks": 1652, "num_entries": 14508, "num_filter_entries": 14508, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892673, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.605862) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 17739657 bytes
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.607486) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 290.7 rd, 262.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 17.8 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(37.1) write-amplify(17.6) OK, records in: 15015, records dropped: 507 output_compression: NoCompression
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.607516) EVENT_LOG_v1 {"time_micros": 1763892673607503, "job": 36, "event": "compaction_finished", "compaction_time_micros": 67534, "compaction_time_cpu_micros": 27202, "output_level": 6, "num_output_files": 1, "total_output_size": 17739657, "num_input_records": 15015, "num_output_records": 14508, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673607789, "job": 36, "event": "table_file_deletion", "file_number": 62}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673610694, "job": 36, "event": "table_file_deletion", "file_number": 60}
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610736) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:11:13 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:11:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:14.086 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:11:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:14.087 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:11:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:14.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:14.823 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:15 np0005532585.localdomain ceph-mon[300199]: pgmap v658: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 84 KiB/s wr, 5 op/s
Nov 23 10:11:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b1a1fa7e-58c0-4be2-8fb7-cb23ab1000cb", "format": "json"}]: dispatch
Nov 23 10:11:15 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b1a1fa7e-58c0-4be2-8fb7-cb23ab1000cb", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:17 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:17.089 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:11:17 np0005532585.localdomain ceph-mon[300199]: pgmap v659: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 131 KiB/s wr, 7 op/s
Nov 23 10:11:17 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7d4a8f6e-3803-4134-a981-5fbbe44d23bd", "format": "json"}]: dispatch
Nov 23 10:11:19 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:19.826 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:19.828 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:19.828 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:11:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:19.828 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:19.857 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:19 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:19.858 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:11:19 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:11:20 np0005532585.localdomain podman[336817]: 2025-11-23 10:11:20.019926081 +0000 UTC m=+0.074860665 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:11:20 np0005532585.localdomain podman[336817]: 2025-11-23 10:11:20.032811837 +0000 UTC m=+0.087746431 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:11:20 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:11:20 np0005532585.localdomain podman[336816]: 2025-11-23 10:11:20.131053611 +0000 UTC m=+0.186070678 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:11:20 np0005532585.localdomain podman[336816]: 2025-11-23 10:11:20.168317287 +0000 UTC m=+0.223334284 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:11:20 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:11:20 np0005532585.localdomain ceph-mon[300199]: pgmap v660: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 72 KiB/s wr, 4 op/s
Nov 23 10:11:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7d4a8f6e-3803-4134-a981-5fbbe44d23bd", "format": "json"}]: dispatch
Nov 23 10:11:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e5534047-ca04-4edc-8696-f70777010bec", "snap_name": "cde814c2-996b-4a4f-8643-46a807416ffa_53a101d9-305d-472f-aad0-e1fd6eb94453", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e5534047-ca04-4edc-8696-f70777010bec", "snap_name": "cde814c2-996b-4a4f-8643-46a807416ffa", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:20 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:21 np0005532585.localdomain ceph-mon[300199]: pgmap v661: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 94 KiB/s wr, 5 op/s
Nov 23 10:11:21 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a8c9a605-348f-42c0-9962-a9e186065223", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:11:21 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a8c9a605-348f-42c0-9962-a9e186065223", "format": "json"}]: dispatch
Nov 23 10:11:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e5534047-ca04-4edc-8696-f70777010bec", "format": "json"}]: dispatch
Nov 23 10:11:22 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e5534047-ca04-4edc-8696-f70777010bec", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:23 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e268 e268: 6 total, 6 up, 6 in
Nov 23 10:11:23 np0005532585.localdomain ceph-mon[300199]: pgmap v662: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 68 KiB/s wr, 3 op/s
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.355 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.356 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.356 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.357 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:11:24 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "a8c9a605-348f-42c0-9962-a9e186065223", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 23 10:11:24 np0005532585.localdomain ceph-mon[300199]: osdmap e268: 6 total, 6 up, 6 in
Nov 23 10:11:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.820 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.848 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.849 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:11:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:24.905 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:25 np0005532585.localdomain ceph-mon[300199]: pgmap v664: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 82 KiB/s wr, 4 op/s
Nov 23 10:11:25 np0005532585.localdomain ceph-mon[300199]: mgrmap e52: np0005532584.naxwxy(active, since 16m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:11:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:26.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:26.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.238 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.239 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:11:27 np0005532585.localdomain ceph-mon[300199]: pgmap v665: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 76 KiB/s wr, 5 op/s
Nov 23 10:11:27 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1143743016' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:27 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:11:27 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1273085929' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.701 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.792 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.793 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.965 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.966 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11028MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.966 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:11:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:27.966 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.053 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.054 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.054 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.107 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2430657483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.507 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.513 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.532 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 e269: 6 total, 6 up, 6 in
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.534 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:11:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:28.535 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a8c9a605-348f-42c0-9962-a9e186065223", "format": "json"}]: dispatch
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a8c9a605-348f-42c0-9962-a9e186065223", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1273085929' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1604146320' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2430657483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:28 np0005532585.localdomain ceph-mon[300199]: osdmap e269: 6 total, 6 up, 6 in
Nov 23 10:11:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:29.532 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:29.533 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:29 np0005532585.localdomain ceph-mon[300199]: pgmap v666: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 76 KiB/s wr, 5 op/s
Nov 23 10:11:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1574676510' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:29.910 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:11:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:11:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:30.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:30 np0005532585.localdomain ceph-mon[300199]: pgmap v668: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 95 KiB/s wr, 5 op/s
Nov 23 10:11:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2856888675' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:11:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:11:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:11:31 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:11:32 np0005532585.localdomain podman[336904]: 2025-11-23 10:11:32.04110269 +0000 UTC m=+0.093103856 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 10:11:32 np0005532585.localdomain podman[336904]: 2025-11-23 10:11:32.054360868 +0000 UTC m=+0.106362034 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 23 10:11:32 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:11:32 np0005532585.localdomain podman[336903]: 2025-11-23 10:11:32.146506144 +0000 UTC m=+0.200241513 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 10:11:32 np0005532585.localdomain podman[336903]: 2025-11-23 10:11:32.176471255 +0000 UTC m=+0.230206634 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 23 10:11:32 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:11:32 np0005532585.localdomain podman[336902]: 2025-11-23 10:11:32.180920413 +0000 UTC m=+0.237264313 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:11:32 np0005532585.localdomain podman[336902]: 2025-11-23 10:11:32.291089753 +0000 UTC m=+0.347433723 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 23 10:11:32 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:11:32 np0005532585.localdomain ceph-mon[300199]: pgmap v669: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 740 B/s rd, 91 KiB/s wr, 5 op/s
Nov 23 10:11:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0acfdcba-644a-4a19-9206-6fe311fbddf1", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:11:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0acfdcba-644a-4a19-9206-6fe311fbddf1", "format": "json"}]: dispatch
Nov 23 10:11:33 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:34.913 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:34.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:34.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:11:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:34.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:34.943 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:34.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:35 np0005532585.localdomain ceph-mon[300199]: pgmap v670: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 76 KiB/s wr, 4 op/s
Nov 23 10:11:35 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:11:36 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "format": "json"}]: dispatch
Nov 23 10:11:37 np0005532585.localdomain ceph-mon[300199]: pgmap v671: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 2 op/s
Nov 23 10:11:37 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "0acfdcba-644a-4a19-9206-6fe311fbddf1", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Nov 23 10:11:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:11:38 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:11:39 np0005532585.localdomain podman[336964]: 2025-11-23 10:11:39.027107177 +0000 UTC m=+0.081236811 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:11:39 np0005532585.localdomain podman[336964]: 2025-11-23 10:11:39.038850028 +0000 UTC m=+0.092979652 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 10:11:39 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:11:39 np0005532585.localdomain podman[336963]: 2025-11-23 10:11:39.132588732 +0000 UTC m=+0.187166460 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:11:39 np0005532585.localdomain podman[336963]: 2025-11-23 10:11:39.146617634 +0000 UTC m=+0.201195362 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 10:11:39 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:11:39 np0005532585.localdomain ceph-mon[300199]: pgmap v672: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 2 op/s
Nov 23 10:11:39 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "snap_name": "59e67742-1f77-44c8-b4a2-e315dfd41df9", "format": "json"}]: dispatch
Nov 23 10:11:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.945 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.946 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.947 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.947 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:39 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:39.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:11:40 np0005532585.localdomain sshd[337006]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:11:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0acfdcba-644a-4a19-9206-6fe311fbddf1", "format": "json"}]: dispatch
Nov 23 10:11:40 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0acfdcba-644a-4a19-9206-6fe311fbddf1", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:40 np0005532585.localdomain sshd[337006]: Invalid user frappe from 107.172.15.139 port 44580
Nov 23 10:11:40 np0005532585.localdomain sshd[337006]: Received disconnect from 107.172.15.139 port 44580:11: Bye Bye [preauth]
Nov 23 10:11:40 np0005532585.localdomain sshd[337006]: Disconnected from invalid user frappe 107.172.15.139 port 44580 [preauth]
Nov 23 10:11:41 np0005532585.localdomain ceph-mon[300199]: pgmap v673: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 183 B/s rd, 92 KiB/s wr, 4 op/s
Nov 23 10:11:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:11:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:11:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:11:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:11:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:11:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1"
Nov 23 10:11:42 np0005532585.localdomain ceph-mon[300199]: from='client.25577 172.18.0.34:0/4122010736' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 23 10:11:43 np0005532585.localdomain ceph-mon[300199]: pgmap v674: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 2 op/s
Nov 23 10:11:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "snap_name": "59e67742-1f77-44c8-b4a2-e315dfd41df9_a9330eec-98f1-4d62-96d0-b42e1209eee2", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "snap_name": "59e67742-1f77-44c8-b4a2-e315dfd41df9", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "05500d0b-b770-4622-aeb8-7264743beada", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 23 10:11:43 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "05500d0b-b770-4622-aeb8-7264743beada", "format": "json"}]: dispatch
Nov 23 10:11:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e270 e270: 6 total, 6 up, 6 in
Nov 23 10:11:43 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:43.948 263258 INFO neutron.agent.linux.ip_lib [None req-525f7317-de4a-43be-8b86-b61be5a2a9bf - - - - - -] Device tap4117331a-ec cannot be used as it has no MAC address
Nov 23 10:11:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.018 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:44 np0005532585.localdomain kernel: device tap4117331a-ec entered promiscuous mode
Nov 23 10:11:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:44Z|00560|binding|INFO|Claiming lport 4117331a-ec62-4320-8258-283ce7293851 for this chassis.
Nov 23 10:11:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:44Z|00561|binding|INFO|4117331a-ec62-4320-8258-283ce7293851: Claiming unknown
Nov 23 10:11:44 np0005532585.localdomain systemd-udevd[337018]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:11:44 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892704.0378] manager: (tap4117331a-ec): new Generic device (/org/freedesktop/NetworkManager/Devices/88)
Nov 23 10:11:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:44.038 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35eacc49dfa4dcdab7bc1d511e6db78', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7edc718d-b211-427a-a4c0-519da72bd765, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=4117331a-ec62-4320-8258-283ce7293851) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:11:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:44.040 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4117331a-ec62-4320-8258-283ce7293851 in datapath 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7 bound to our chassis
Nov 23 10:11:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:44.041 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5f799f24-c8ba-4c61-a1b4-e05c3d9d02a9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:11:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:44.041 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:11:44 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:44.042 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cd468ced-afc6-4451-af91-ca18dc72c19e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:44Z|00562|binding|INFO|Setting lport 4117331a-ec62-4320-8258-283ce7293851 ovn-installed in OVS
Nov 23 10:11:44 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:44Z|00563|binding|INFO|Setting lport 4117331a-ec62-4320-8258-283ce7293851 up in Southbound
Nov 23 10:11:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.064 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap4117331a-ec: No such device
Nov 23 10:11:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.100 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.316 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:44 np0005532585.localdomain ceph-mon[300199]: osdmap e270: 6 total, 6 up, 6 in
Nov 23 10:11:44 np0005532585.localdomain ceph-mon[300199]: pgmap v676: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 76 KiB/s wr, 3 op/s
Nov 23 10:11:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:44.999 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:45 np0005532585.localdomain podman[337089]: 
Nov 23 10:11:45 np0005532585.localdomain podman[337089]: 2025-11-23 10:11:45.048573753 +0000 UTC m=+0.088137143 container create 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:11:45 np0005532585.localdomain systemd[1]: Started libpod-conmon-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757.scope.
Nov 23 10:11:45 np0005532585.localdomain podman[337089]: 2025-11-23 10:11:45.008607893 +0000 UTC m=+0.048171313 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:11:45 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:11:45 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddcb664480a4fa9e4b1887eb87a9eccbf38a1928d563f5f3df15bf5dcf54f58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:11:45 np0005532585.localdomain podman[337089]: 2025-11-23 10:11:45.135390274 +0000 UTC m=+0.174953664 container init 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:11:45 np0005532585.localdomain podman[337089]: 2025-11-23 10:11:45.146193307 +0000 UTC m=+0.185756687 container start 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:11:45 np0005532585.localdomain dnsmasq[337106]: started, version 2.85 cachesize 150
Nov 23 10:11:45 np0005532585.localdomain dnsmasq[337106]: DNS service limited to local subnets
Nov 23 10:11:45 np0005532585.localdomain dnsmasq[337106]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:11:45 np0005532585.localdomain dnsmasq[337106]: warning: no upstream servers configured
Nov 23 10:11:45 np0005532585.localdomain dnsmasq-dhcp[337106]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:11:45 np0005532585.localdomain dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 0 addresses
Nov 23 10:11:45 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host
Nov 23 10:11:45 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts
Nov 23 10:11:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:45.210 263258 INFO neutron.agent.dhcp.agent [None req-0855374f-423e-4dac-87cd-d20513f98355 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:11:44Z, description=, device_id=2dc260f8-5d48-4126-8d42-5f3396592b40, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b5e0d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b45a30>], id=c1256eec-8812-4d83-8b39-ce4a1c9e8997, ip_allocation=immediate, mac_address=fa:16:3e:12:96:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:11:41Z, description=, dns_domain=, id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-657383188-network, port_security_enabled=True, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32379, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3878, status=ACTIVE, subnets=['d87b1163-7cea-4d33-ab31-0f83a2c848f7'], tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:42Z, vlan_transparent=None, network_id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, port_security_enabled=False, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3886, status=DOWN, tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:44Z on network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7
Nov 23 10:11:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:45.299 263258 INFO neutron.agent.dhcp.agent [None req-aa144582-ed6a-4d8e-9609-05c9b75fbf90 - - - - - -] DHCP configuration for ports {'11e31315-69a4-4bf2-baf7-a726200c0c1e'} is completed
Nov 23 10:11:45 np0005532585.localdomain dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 1 addresses
Nov 23 10:11:45 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host
Nov 23 10:11:45 np0005532585.localdomain podman[337122]: 2025-11-23 10:11:45.421347444 +0000 UTC m=+0.049240176 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 10:11:45 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts
Nov 23 10:11:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "format": "json"}]: dispatch
Nov 23 10:11:45 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5409ddfd-477f-40a3-98e6-b831b5ca3a1a", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:45 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:45.845 263258 INFO neutron.agent.dhcp.agent [None req-c17d2edb-f5b2-4c20-99bd-61629349aded - - - - - -] DHCP configuration for ports {'c1256eec-8812-4d83-8b39-ce4a1c9e8997'} is completed
Nov 23 10:11:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:46.385 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:11:44Z, description=, device_id=2dc260f8-5d48-4126-8d42-5f3396592b40, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4de50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b4de20>], id=c1256eec-8812-4d83-8b39-ce4a1c9e8997, ip_allocation=immediate, mac_address=fa:16:3e:12:96:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:11:41Z, description=, dns_domain=, id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-657383188-network, port_security_enabled=True, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32379, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3878, status=ACTIVE, subnets=['d87b1163-7cea-4d33-ab31-0f83a2c848f7'], tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:42Z, vlan_transparent=None, network_id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, port_security_enabled=False, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3886, status=DOWN, tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:44Z on network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7
Nov 23 10:11:46 np0005532585.localdomain dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 1 addresses
Nov 23 10:11:46 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host
Nov 23 10:11:46 np0005532585.localdomain podman[337158]: 2025-11-23 10:11:46.582183176 +0000 UTC m=+0.036278407 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:11:46 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts
Nov 23 10:11:46 np0005532585.localdomain ceph-mon[300199]: pgmap v677: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 83 KiB/s wr, 5 op/s
Nov 23 10:11:46 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:46.799 263258 INFO neutron.agent.dhcp.agent [None req-9d3015ee-edc4-4e21-909d-12fc048c4fc0 - - - - - -] DHCP configuration for ports {'c1256eec-8812-4d83-8b39-ce4a1c9e8997'} is completed
Nov 23 10:11:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "05500d0b-b770-4622-aeb8-7264743beada", "format": "json"}]: dispatch
Nov 23 10:11:47 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "05500d0b-b770-4622-aeb8-7264743beada", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e271 e271: 6 total, 6 up, 6 in
Nov 23 10:11:49 np0005532585.localdomain ceph-mon[300199]: pgmap v678: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 83 KiB/s wr, 5 op/s
Nov 23 10:11:49 np0005532585.localdomain ceph-mon[300199]: osdmap e271: 6 total, 6 up, 6 in
Nov 23 10:11:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:50.001 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:11:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:11:50 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:11:51 np0005532585.localdomain podman[337179]: 2025-11-23 10:11:51.006943497 +0000 UTC m=+0.063210386 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:11:51 np0005532585.localdomain podman[337179]: 2025-11-23 10:11:51.020579417 +0000 UTC m=+0.076846266 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 23 10:11:51 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:11:51 np0005532585.localdomain systemd[1]: tmp-crun.TmHOYG.mount: Deactivated successfully.
Nov 23 10:11:51 np0005532585.localdomain podman[337178]: 2025-11-23 10:11:51.068679867 +0000 UTC m=+0.126040690 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:11:51 np0005532585.localdomain podman[337178]: 2025-11-23 10:11:51.074830626 +0000 UTC m=+0.132191489 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:11:51 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:11:51 np0005532585.localdomain ceph-mon[300199]: pgmap v680: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 106 KiB/s wr, 7 op/s
Nov 23 10:11:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7d4a8f6e-3803-4134-a981-5fbbe44d23bd", "format": "json"}]: dispatch
Nov 23 10:11:51 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7d4a8f6e-3803-4134-a981-5fbbe44d23bd", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:52 np0005532585.localdomain sudo[337221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:11:52 np0005532585.localdomain sudo[337221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:11:52 np0005532585.localdomain sudo[337221]: pam_unix(sudo:session): session closed for user root
Nov 23 10:11:52 np0005532585.localdomain sudo[337239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:11:52 np0005532585.localdomain sudo[337239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:11:52 np0005532585.localdomain systemd[1]: tmp-crun.v1EGwy.mount: Deactivated successfully.
Nov 23 10:11:52 np0005532585.localdomain dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 0 addresses
Nov 23 10:11:52 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host
Nov 23 10:11:52 np0005532585.localdomain dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts
Nov 23 10:11:52 np0005532585.localdomain podman[337287]: 2025-11-23 10:11:52.812007723 +0000 UTC m=+0.076923498 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 10:11:52 np0005532585.localdomain sudo[337239]: pam_unix(sudo:session): session closed for user root
Nov 23 10:11:53 np0005532585.localdomain kernel: device tap4117331a-ec left promiscuous mode
Nov 23 10:11:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:53.002 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:53Z|00564|binding|INFO|Releasing lport 4117331a-ec62-4320-8258-283ce7293851 from this chassis (sb_readonly=0)
Nov 23 10:11:53 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:53Z|00565|binding|INFO|Setting lport 4117331a-ec62-4320-8258-283ce7293851 down in Southbound
Nov 23 10:11:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:53.011 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35eacc49dfa4dcdab7bc1d511e6db78', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7edc718d-b211-427a-a4c0-519da72bd765, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=4117331a-ec62-4320-8258-283ce7293851) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:11:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:53.012 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4117331a-ec62-4320-8258-283ce7293851 in datapath 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7 unbound from our chassis
Nov 23 10:11:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:53.015 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:11:53 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:11:53.016 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5eb2b6-c73b-4c57-a444-9b8c975f8834]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:11:53 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:53.020 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:53 np0005532585.localdomain sudo[337327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:11:53 np0005532585.localdomain sudo[337327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:11:53 np0005532585.localdomain sudo[337327]: pam_unix(sudo:session): session closed for user root
Nov 23 10:11:53 np0005532585.localdomain ceph-mon[300199]: pgmap v681: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1018 B/s rd, 106 KiB/s wr, 7 op/s
Nov 23 10:11:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:11:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:11:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:11:53 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:11:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "snap_name": "4827a0db-6fdf-413e-96d9-4e5308156408_7604d04a-aeec-4e7c-8cb1-106f4bdc4a0a", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:54 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "snap_name": "4827a0db-6fdf-413e-96d9-4e5308156408", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:54 np0005532585.localdomain ceph-mon[300199]: pgmap v682: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 85 KiB/s wr, 6 op/s
Nov 23 10:11:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:11:54 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:11:54Z|00566|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:11:54 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:54.923 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:55.004 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:55 np0005532585.localdomain systemd[1]: tmp-crun.nebXYs.mount: Deactivated successfully.
Nov 23 10:11:55 np0005532585.localdomain dnsmasq[337106]: exiting on receipt of SIGTERM
Nov 23 10:11:55 np0005532585.localdomain podman[337363]: 2025-11-23 10:11:55.307027562 +0000 UTC m=+0.066312452 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:11:55 np0005532585.localdomain systemd[1]: libpod-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757.scope: Deactivated successfully.
Nov 23 10:11:55 np0005532585.localdomain podman[337378]: 2025-11-23 10:11:55.375709106 +0000 UTC m=+0.053567489 container died 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:11:55 np0005532585.localdomain podman[337378]: 2025-11-23 10:11:55.42037578 +0000 UTC m=+0.098234103 container cleanup 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:11:55 np0005532585.localdomain systemd[1]: libpod-conmon-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757.scope: Deactivated successfully.
Nov 23 10:11:55 np0005532585.localdomain podman[337379]: 2025-11-23 10:11:55.458629557 +0000 UTC m=+0.132013763 container remove 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 10:11:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:55.483 263258 INFO neutron.agent.dhcp.agent [None req-73031e67-7e55-4efc-80ab-1acfd1d70688 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:11:55 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:11:55.680 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:11:56 np0005532585.localdomain systemd[1]: tmp-crun.b4iYMM.mount: Deactivated successfully.
Nov 23 10:11:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-cddcb664480a4fa9e4b1887eb87a9eccbf38a1928d563f5f3df15bf5dcf54f58-merged.mount: Deactivated successfully.
Nov 23 10:11:56 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757-userdata-shm.mount: Deactivated successfully.
Nov 23 10:11:56 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d5b79f11d\x2dbf85\x2d4ba4\x2d8bb8\x2d52ec94385ea7.mount: Deactivated successfully.
Nov 23 10:11:56 np0005532585.localdomain ceph-mon[300199]: pgmap v683: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 86 KiB/s wr, 5 op/s
Nov 23 10:11:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "format": "json"}]: dispatch
Nov 23 10:11:56 np0005532585.localdomain ceph-mon[300199]: from='client.25577 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "97e78d12-95be-4f55-b739-9cd5b5c20fd4", "force": true, "format": "json"}]: dispatch
Nov 23 10:11:57 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:57.213 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:57 np0005532585.localdomain ceph-mon[300199]: mgrmap e53: np0005532584.naxwxy(active, since 16m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:11:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:11:57 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 4849 writes, 37K keys, 4849 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 4849 writes, 4849 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2570 writes, 13K keys, 2570 commit groups, 1.0 writes per commit group, ingest: 19.64 MB, 0.03 MB/s
                                                           Interval WAL: 2570 writes, 2570 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    170.0      0.25              0.11        18    0.014       0      0       0.0       0.0
                                                             L6      1/0   16.92 MB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   6.5    219.3    200.6      1.41              0.76        17    0.083    222K   8796       0.0       0.0
                                                            Sum      1/0   16.92 MB   0.0      0.3     0.0      0.3       0.3      0.1       0.0   7.5    185.8    196.0      1.66              0.88        35    0.048    222K   8796       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0  11.2    181.4    183.8      0.69              0.35        14    0.050     98K   3749       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   0.0    219.3    200.6      1.41              0.76        17    0.083    222K   8796       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    171.9      0.25              0.11        17    0.015       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.042, interval 0.011
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.32 GB write, 0.27 MB/s write, 0.30 GB read, 0.26 MB/s read, 1.7 seconds
                                                           Interval compaction: 0.12 GB write, 0.21 MB/s write, 0.12 GB read, 0.21 MB/s read, 0.7 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5651615b9350#2 capacity: 304.00 MB usage: 26.98 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000195 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1501,25.54 MB,8.40219%) FilterBlock(35,649.30 KB,0.208579%) IndexBlock(35,822.02 KB,0.264062%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 23 10:11:58 np0005532585.localdomain sshd[337406]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:11:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e272 e272: 6 total, 6 up, 6 in
Nov 23 10:11:59 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:11:59.287 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:11:59 np0005532585.localdomain ceph-mon[300199]: pgmap v684: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 86 KiB/s wr, 5 op/s
Nov 23 10:11:59 np0005532585.localdomain ceph-mon[300199]: osdmap e272: 6 total, 6 up, 6 in
Nov 23 10:11:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:11:59 np0005532585.localdomain sshd[337406]: Invalid user wg from 175.126.166.172 port 43712
Nov 23 10:11:59 np0005532585.localdomain sshd[337406]: Received disconnect from 175.126.166.172 port 43712:11: Bye Bye [preauth]
Nov 23 10:11:59 np0005532585.localdomain sshd[337406]: Disconnected from invalid user wg 175.126.166.172 port 43712 [preauth]
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:00 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:11:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:00 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:12:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:00.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:12:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:12:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:12:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1566503771' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:12:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:12:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1566503771' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:12:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1566503771' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:12:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/1566503771' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:12:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:12:00 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 19K writes, 73K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 19K writes, 6508 syncs, 2.94 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 43K keys, 11K commit groups, 1.0 writes per commit group, ingest: 32.13 MB, 0.05 MB/s
                                                          Interval WAL: 11K writes, 4855 syncs, 2.45 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:12:01 np0005532585.localdomain ceph-mon[300199]: pgmap v686: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 67 KiB/s wr, 4 op/s
Nov 23 10:12:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:12:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:12:02 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:12:02 np0005532585.localdomain podman[337409]: 2025-11-23 10:12:02.333216034 +0000 UTC m=+0.110744208 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:12:02 np0005532585.localdomain podman[337409]: 2025-11-23 10:12:02.339814128 +0000 UTC m=+0.117342302 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:12:02 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:12:02 np0005532585.localdomain podman[337434]: 2025-11-23 10:12:02.416359353 +0000 UTC m=+0.075275637 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:12:02 np0005532585.localdomain podman[337408]: 2025-11-23 10:12:02.522567441 +0000 UTC m=+0.303141979 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350)
Nov 23 10:12:02 np0005532585.localdomain podman[337434]: 2025-11-23 10:12:02.54658741 +0000 UTC m=+0.205503654 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 10:12:02 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:12:02 np0005532585.localdomain podman[337408]: 2025-11-23 10:12:02.566371059 +0000 UTC m=+0.346945627 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 23 10:12:02 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:12:02 np0005532585.localdomain ceph-mon[300199]: pgmap v687: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 67 KiB/s wr, 4 op/s
Nov 23 10:12:03 np0005532585.localdomain systemd[1]: tmp-crun.alrday.mount: Deactivated successfully.
Nov 23 10:12:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 e273: 6 total, 6 up, 6 in
Nov 23 10:12:03 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:03Z|00567|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:12:03 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:03.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:04 np0005532585.localdomain ceph-mon[300199]: osdmap e273: 6 total, 6 up, 6 in
Nov 23 10:12:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:12:04 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 27K writes, 102K keys, 27K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.01 MB/s
                                                          Cumulative WAL: 27K writes, 9964 syncs, 2.79 writes per sync, written: 0.09 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 17K writes, 61K keys, 17K commit groups, 1.0 writes per commit group, ingest: 49.73 MB, 0.08 MB/s
                                                          Interval WAL: 17K writes, 7175 syncs, 2.42 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:12:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:05.062 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:05 np0005532585.localdomain ceph-mon[300199]: pgmap v689: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 39 KiB/s wr, 2 op/s
Nov 23 10:12:06 np0005532585.localdomain ceph-mon[300199]: pgmap v690: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 49 KiB/s wr, 3 op/s
Nov 23 10:12:07 np0005532585.localdomain sshd[337469]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:12:07 np0005532585.localdomain sshd[337469]: Received disconnect from 207.154.194.2 port 49400:11: Bye Bye [preauth]
Nov 23 10:12:07 np0005532585.localdomain sshd[337469]: Disconnected from authenticating user root 207.154.194.2 port 49400 [preauth]
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.570079) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728570119, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1082, "num_deletes": 259, "total_data_size": 2010227, "memory_usage": 2030128, "flush_reason": "Manual Compaction"}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728576976, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1323910, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37126, "largest_seqno": 38203, "table_properties": {"data_size": 1319327, "index_size": 2118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11039, "raw_average_key_size": 20, "raw_value_size": 1309602, "raw_average_value_size": 2385, "num_data_blocks": 93, "num_entries": 549, "num_filter_entries": 549, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892673, "oldest_key_time": 1763892673, "file_creation_time": 1763892728, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 6962 microseconds, and 2935 cpu microseconds.
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.577035) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1323910 bytes OK
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.577058) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.579003) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.579016) EVENT_LOG_v1 {"time_micros": 1763892728579011, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.579035) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 2004734, prev total WAL file size 2005058, number of live WAL files 2.
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.581116) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353239' seq:72057594037927935, type:22 .. '6C6F676D0034373831' seq:0, type:0; will stop at (end)
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1292KB)], [63(16MB)]
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728581149, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19063567, "oldest_snapshot_seqno": -1}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 14517 keys, 18926856 bytes, temperature: kUnknown
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728641104, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 18926856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18843182, "index_size": 46261, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36357, "raw_key_size": 390669, "raw_average_key_size": 26, "raw_value_size": 18595694, "raw_average_value_size": 1280, "num_data_blocks": 1718, "num_entries": 14517, "num_filter_entries": 14517, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892728, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.641531) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 18926856 bytes
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.643206) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 317.1 rd, 314.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 16.9 +0.0 blob) out(18.1 +0.0 blob), read-write-amplify(28.7) write-amplify(14.3) OK, records in: 15057, records dropped: 540 output_compression: NoCompression
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.643234) EVENT_LOG_v1 {"time_micros": 1763892728643220, "job": 38, "event": "compaction_finished", "compaction_time_micros": 60117, "compaction_time_cpu_micros": 28094, "output_level": 6, "num_output_files": 1, "total_output_size": 18926856, "num_input_records": 15057, "num_output_records": 14517, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728643664, "job": 38, "event": "table_file_deletion", "file_number": 65}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728646056, "job": 38, "event": "table_file_deletion", "file_number": 63}
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.581065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646178) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646180) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:08 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:09 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:09.023 263258 INFO neutron.agent.linux.ip_lib [None req-1bad5901-8116-484e-9c67-16d4f9024020 - - - - - -] Device tap8d34d210-e3 cannot be used as it has no MAC address
Nov 23 10:12:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:09.098 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:09 np0005532585.localdomain kernel: device tap8d34d210-e3 entered promiscuous mode
Nov 23 10:12:09 np0005532585.localdomain NetworkManager[5975]: <info>  [1763892729.1067] manager: (tap8d34d210-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/89)
Nov 23 10:12:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:09Z|00568|binding|INFO|Claiming lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f for this chassis.
Nov 23 10:12:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:09Z|00569|binding|INFO|8d34d210-e3a3-4e23-87f0-ec4920bc623f: Claiming unknown
Nov 23 10:12:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:09.107 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:09 np0005532585.localdomain systemd-udevd[337481]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.117 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89472f96637447cc97b8792018a15a8e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=883be6eb-f071-49b9-9f06-2b23859983f7, chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8d34d210-e3a3-4e23-87f0-ec4920bc623f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.119 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8d34d210-e3a3-4e23-87f0-ec4920bc623f in datapath 2b870224-659d-44c3-8ebb-5c0e146f53c2 bound to our chassis
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.121 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3a9e6b95-b883-4481-9609-ef21812a17a9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.121 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b870224-659d-44c3-8ebb-5c0e146f53c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.122 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cf09fbf0-4876-4c24-8854-6e8fb5511abd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:12:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:09.140 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:09.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:12:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:09Z|00570|binding|INFO|Setting lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f ovn-installed in OVS
Nov 23 10:12:09 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:09Z|00571|binding|INFO|Setting lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f up in Southbound
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain virtnodedevd[230249]: ethtool ioctl error on tap8d34d210-e3: No such device
Nov 23 10:12:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:09.193 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:09 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:09.233 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:09 np0005532585.localdomain podman[337489]: 2025-11-23 10:12:09.255808278 +0000 UTC m=+0.093359513 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:12:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:12:09 np0005532585.localdomain podman[337489]: 2025-11-23 10:12:09.294413207 +0000 UTC m=+0.131964412 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.309 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.313 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:12:09 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:12:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:09.313 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:12:09 np0005532585.localdomain podman[337530]: 2025-11-23 10:12:09.367082603 +0000 UTC m=+0.089273128 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Nov 23 10:12:09 np0005532585.localdomain podman[337530]: 2025-11-23 10:12:09.409612372 +0000 UTC m=+0.131802897 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:12:09 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:12:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:09 np0005532585.localdomain ceph-mon[300199]: pgmap v691: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 462 B/s rd, 44 KiB/s wr, 3 op/s
Nov 23 10:12:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:10.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:10 np0005532585.localdomain ceph-mon[300199]: pgmap v692: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 7.7 KiB/s wr, 0 op/s
Nov 23 10:12:10 np0005532585.localdomain podman[337593]: 
Nov 23 10:12:10 np0005532585.localdomain podman[337593]: 2025-11-23 10:12:10.90922051 +0000 UTC m=+0.093863075 container create 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 23 10:12:10 np0005532585.localdomain systemd[1]: Started libpod-conmon-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830.scope.
Nov 23 10:12:10 np0005532585.localdomain podman[337593]: 2025-11-23 10:12:10.863561627 +0000 UTC m=+0.048204202 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 10:12:10 np0005532585.localdomain systemd[1]: tmp-crun.reHY9I.mount: Deactivated successfully.
Nov 23 10:12:10 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:12:10 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7a0f535b2a4bac0524376de77e2e7218bc8f8de445e4ca8e5fde600221f23be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 10:12:10 np0005532585.localdomain podman[337593]: 2025-11-23 10:12:10.989646071 +0000 UTC m=+0.174288636 container init 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:12:11 np0005532585.localdomain podman[337593]: 2025-11-23 10:12:11.000603767 +0000 UTC m=+0.185246352 container start 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:12:11 np0005532585.localdomain dnsmasq[337611]: started, version 2.85 cachesize 150
Nov 23 10:12:11 np0005532585.localdomain dnsmasq[337611]: DNS service limited to local subnets
Nov 23 10:12:11 np0005532585.localdomain dnsmasq[337611]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 10:12:11 np0005532585.localdomain dnsmasq[337611]: warning: no upstream servers configured
Nov 23 10:12:11 np0005532585.localdomain dnsmasq-dhcp[337611]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 10:12:11 np0005532585.localdomain dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 0 addresses
Nov 23 10:12:11 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host
Nov 23 10:12:11 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts
Nov 23 10:12:11 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:11.514 263258 INFO neutron.agent.dhcp.agent [None req-71487d0d-4203-4cfe-a9e9-63f9007780a9 - - - - - -] DHCP configuration for ports {'9e43af07-4a3b-4e51-b7a4-61386f16209f'} is completed
Nov 23 10:12:11 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:11.707 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:12:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:12:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:12:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 10:12:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:12:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19261 "" "Go-http-client/1.1"
Nov 23 10:12:12 np0005532585.localdomain ceph-mon[300199]: pgmap v693: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 7.7 KiB/s wr, 0 op/s
Nov 23 10:12:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:13.235 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:12:12Z, description=, device_id=53863d0f-d6fc-41eb-8fc3-62cf12ff3867, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7fee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8b7fa00>], id=c947339a-0011-4964-8e3a-0ace7b6f4440, ip_allocation=immediate, mac_address=fa:16:3e:56:43:3e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:12:04Z, description=, dns_domain=, id=2b870224-659d-44c3-8ebb-5c0e146f53c2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-505464635-network, port_security_enabled=True, project_id=89472f96637447cc97b8792018a15a8e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56028, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3936, status=ACTIVE, subnets=['29288101-6a0e-412a-aadc-f80ef429088f'], tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:05Z, vlan_transparent=None, network_id=2b870224-659d-44c3-8ebb-5c0e146f53c2, port_security_enabled=False, project_id=89472f96637447cc97b8792018a15a8e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3944, status=DOWN, tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:13Z on network 2b870224-659d-44c3-8ebb-5c0e146f53c2
Nov 23 10:12:13 np0005532585.localdomain podman[337629]: 2025-11-23 10:12:13.430937721 +0000 UTC m=+0.044694934 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:12:13 np0005532585.localdomain dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 1 addresses
Nov 23 10:12:13 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host
Nov 23 10:12:13 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts
Nov 23 10:12:13 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:13.737 263258 INFO neutron.agent.dhcp.agent [None req-323831dd-6992-41fc-b259-1e1a149a15e9 - - - - - -] DHCP configuration for ports {'c947339a-0011-4964-8e3a-0ace7b6f4440'} is completed
Nov 23 10:12:14 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:14.429 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:12:12Z, description=, device_id=53863d0f-d6fc-41eb-8fc3-62cf12ff3867, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8bd6dc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f32c8ad9d30>], id=c947339a-0011-4964-8e3a-0ace7b6f4440, ip_allocation=immediate, mac_address=fa:16:3e:56:43:3e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:12:04Z, description=, dns_domain=, id=2b870224-659d-44c3-8ebb-5c0e146f53c2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-505464635-network, port_security_enabled=True, project_id=89472f96637447cc97b8792018a15a8e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56028, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3936, status=ACTIVE, subnets=['29288101-6a0e-412a-aadc-f80ef429088f'], tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:05Z, vlan_transparent=None, network_id=2b870224-659d-44c3-8ebb-5c0e146f53c2, port_security_enabled=False, project_id=89472f96637447cc97b8792018a15a8e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3944, status=DOWN, tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:13Z on network 2b870224-659d-44c3-8ebb-5c0e146f53c2
Nov 23 10:12:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:14.555 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:12:14 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:14.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:14 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:14.557 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 10:12:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:14 np0005532585.localdomain dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 1 addresses
Nov 23 10:12:14 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host
Nov 23 10:12:14 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts
Nov 23 10:12:14 np0005532585.localdomain podman[337667]: 2025-11-23 10:12:14.65657496 +0000 UTC m=+0.070507837 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 10:12:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:15.069 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:15 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:15.429 263258 INFO neutron.agent.dhcp.agent [None req-6829f279-bb87-49d8-9ad1-6b6cb80048d9 - - - - - -] DHCP configuration for ports {'c947339a-0011-4964-8e3a-0ace7b6f4440'} is completed
Nov 23 10:12:15 np0005532585.localdomain ceph-mon[300199]: pgmap v694: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 7.6 KiB/s wr, 0 op/s
Nov 23 10:12:16 np0005532585.localdomain ceph-mon[300199]: pgmap v695: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 6.4 KiB/s wr, 0 op/s
Nov 23 10:12:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:19 np0005532585.localdomain ceph-mon[300199]: pgmap v696: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:20.073 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:12:20 np0005532585.localdomain ceph-mon[300199]: pgmap v697: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:12:21 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:12:22 np0005532585.localdomain podman[337687]: 2025-11-23 10:12:22.017254695 +0000 UTC m=+0.072679905 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:12:22 np0005532585.localdomain podman[337687]: 2025-11-23 10:12:22.028385017 +0000 UTC m=+0.083810237 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:12:22 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:12:22 np0005532585.localdomain podman[337688]: 2025-11-23 10:12:22.080611201 +0000 UTC m=+0.132155732 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 10:12:22 np0005532585.localdomain podman[337688]: 2025-11-23 10:12:22.11932903 +0000 UTC m=+0.170873551 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:12:22 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:12:22 np0005532585.localdomain ceph-mon[300199]: pgmap v698: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:23 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:23.558 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.427 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.428 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.428 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:12:24 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:24.428 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:12:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.075 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:12:25 np0005532585.localdomain ceph-mon[300199]: pgmap v699: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.763 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.770 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.780 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:12:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:25.780 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:12:26 np0005532585.localdomain ceph-mon[300199]: pgmap v700: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:27.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:12:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:28.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:28.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.281 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.281 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:12:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:29 np0005532585.localdomain ceph-mon[300199]: pgmap v701: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2515636495' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:12:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/193271679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.736 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.811 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:12:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:29.811 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:12:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:12:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:12:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:12:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:12:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.024 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.026 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11034MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.026 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.026 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.080 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.109 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.109 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.110 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.172 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:12:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:12:30 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2916345404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/193271679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2885826626' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.615 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.620 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.640 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.642 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:12:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:30.642 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:12:31 np0005532585.localdomain ceph-mon[300199]: pgmap v702: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2916345404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3327313948' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3424500751' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:12:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:31.643 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:31.644 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:12:32 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:32.499 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:32 np0005532585.localdomain ceph-mon[300199]: pgmap v703: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:12:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:12:32 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:12:33 np0005532585.localdomain podman[337775]: 2025-11-23 10:12:33.039807513 +0000 UTC m=+0.086154037 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 10:12:33 np0005532585.localdomain podman[337775]: 2025-11-23 10:12:33.051428141 +0000 UTC m=+0.097774635 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 23 10:12:33 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:12:33 np0005532585.localdomain podman[337776]: 2025-11-23 10:12:33.102589803 +0000 UTC m=+0.141727006 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 23 10:12:33 np0005532585.localdomain podman[337776]: 2025-11-23 10:12:33.146355047 +0000 UTC m=+0.185492180 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 10:12:33 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:12:33 np0005532585.localdomain podman[337774]: 2025-11-23 10:12:33.147718059 +0000 UTC m=+0.195574820 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 10:12:33 np0005532585.localdomain podman[337774]: 2025-11-23 10:12:33.227745539 +0000 UTC m=+0.275602320 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:12:33 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:12:34 np0005532585.localdomain dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 0 addresses
Nov 23 10:12:34 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host
Nov 23 10:12:34 np0005532585.localdomain podman[337848]: 2025-11-23 10:12:34.354153769 +0000 UTC m=+0.062941516 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:12:34 np0005532585.localdomain dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts
Nov 23 10:12:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:34Z|00572|binding|INFO|Releasing lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f from this chassis (sb_readonly=0)
Nov 23 10:12:34 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:34Z|00573|binding|INFO|Setting lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f down in Southbound
Nov 23 10:12:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:34.591 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:34 np0005532585.localdomain kernel: device tap8d34d210-e3 left promiscuous mode
Nov 23 10:12:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:34.602 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89472f96637447cc97b8792018a15a8e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=883be6eb-f071-49b9-9f06-2b23859983f7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>], logical_port=8d34d210-e3a3-4e23-87f0-ec4920bc623f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f08bf8cd4c0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 10:12:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:34.605 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8d34d210-e3a3-4e23-87f0-ec4920bc623f in datapath 2b870224-659d-44c3-8ebb-5c0e146f53c2 unbound from our chassis
Nov 23 10:12:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:34.607 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b870224-659d-44c3-8ebb-5c0e146f53c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 10:12:34 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:12:34.608 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[af690cd4-66dd-4b29-9eb3-768ec25b0eb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 10:12:34 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:34.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:35.081 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:35 np0005532585.localdomain ceph-mon[300199]: pgmap v704: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:35 np0005532585.localdomain sshd[337871]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:12:36 np0005532585.localdomain ceph-mon[300199]: pgmap v705: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:37 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:37Z|00574|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:12:37 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:37.515 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:38 np0005532585.localdomain systemd[1]: tmp-crun.iOVqs8.mount: Deactivated successfully.
Nov 23 10:12:38 np0005532585.localdomain dnsmasq[337611]: exiting on receipt of SIGTERM
Nov 23 10:12:38 np0005532585.localdomain podman[337888]: 2025-11-23 10:12:38.707003944 +0000 UTC m=+0.062566563 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 10:12:38 np0005532585.localdomain systemd[1]: libpod-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830.scope: Deactivated successfully.
Nov 23 10:12:38 np0005532585.localdomain podman[337901]: 2025-11-23 10:12:38.757354651 +0000 UTC m=+0.041607858 container died 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 23 10:12:38 np0005532585.localdomain podman[337901]: 2025-11-23 10:12:38.791631695 +0000 UTC m=+0.075884862 container cleanup 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:12:38 np0005532585.localdomain systemd[1]: libpod-conmon-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830.scope: Deactivated successfully.
Nov 23 10:12:38 np0005532585.localdomain podman[337909]: 2025-11-23 10:12:38.871937132 +0000 UTC m=+0.137187216 container remove 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 10:12:39 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:39.499 263258 INFO neutron.agent.dhcp.agent [None req-2a9705fa-0663-4828-9ab1-e1446c7d0128 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:12:39 np0005532585.localdomain neutron_dhcp_agent[263254]: 2025-11-23 10:12:39.507 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 10:12:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:39 np0005532585.localdomain ceph-mon[300199]: pgmap v706: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: tmp-crun.NilISF.mount: Deactivated successfully.
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-d7a0f535b2a4bac0524376de77e2e7218bc8f8de445e4ca8e5fde600221f23be-merged.mount: Deactivated successfully.
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830-userdata-shm.mount: Deactivated successfully.
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: run-netns-qdhcp\x2d2b870224\x2d659d\x2d44c3\x2d8ebb\x2d5c0e146f53c2.mount: Deactivated successfully.
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: tmp-crun.4CHOV2.mount: Deactivated successfully.
Nov 23 10:12:39 np0005532585.localdomain podman[337932]: 2025-11-23 10:12:39.783440899 +0000 UTC m=+0.089348176 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 10:12:39 np0005532585.localdomain podman[337932]: 2025-11-23 10:12:39.819747585 +0000 UTC m=+0.125654882 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:12:39 np0005532585.localdomain podman[337933]: 2025-11-23 10:12:39.824283753 +0000 UTC m=+0.127518038 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:12:39 np0005532585.localdomain podman[337933]: 2025-11-23 10:12:39.904296932 +0000 UTC m=+0.207531297 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:12:39 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:12:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:40.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:40.090 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:40 np0005532585.localdomain ceph-mon[300199]: pgmap v707: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.773623) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761773659, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 619, "num_deletes": 251, "total_data_size": 504977, "memory_usage": 516056, "flush_reason": "Manual Compaction"}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761778295, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 328382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38208, "largest_seqno": 38822, "table_properties": {"data_size": 325483, "index_size": 882, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7217, "raw_average_key_size": 19, "raw_value_size": 319655, "raw_average_value_size": 875, "num_data_blocks": 39, "num_entries": 365, "num_filter_entries": 365, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892728, "oldest_key_time": 1763892728, "file_creation_time": 1763892761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4723 microseconds, and 1705 cpu microseconds.
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.778342) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 328382 bytes OK
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.778367) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780089) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780109) EVENT_LOG_v1 {"time_micros": 1763892761780103, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780128) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 501509, prev total WAL file size 501509, number of live WAL files 2.
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780669) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. '7061786F73003133353535' seq:0, type:0; will stop at (end)
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(320KB)], [66(18MB)]
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761780740, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 19255238, "oldest_snapshot_seqno": -1}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14367 keys, 17820859 bytes, temperature: kUnknown
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761866223, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 17820859, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17739923, "index_size": 43907, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 388018, "raw_average_key_size": 27, "raw_value_size": 17496724, "raw_average_value_size": 1217, "num_data_blocks": 1615, "num_entries": 14367, "num_filter_entries": 14367, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.866501) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 17820859 bytes
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.868105) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.0 rd, 208.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 18.1 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(112.9) write-amplify(54.3) OK, records in: 14882, records dropped: 515 output_compression: NoCompression
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.868121) EVENT_LOG_v1 {"time_micros": 1763892761868114, "job": 40, "event": "compaction_finished", "compaction_time_micros": 85579, "compaction_time_cpu_micros": 55352, "output_level": 6, "num_output_files": 1, "total_output_size": 17820859, "num_input_records": 14882, "num_output_records": 14367, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761868252, "job": 40, "event": "table_file_deletion", "file_number": 68}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761869680, "job": 40, "event": "table_file_deletion", "file_number": 66}
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:41 np0005532585.localdomain ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 10:12:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:12:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:12:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:12:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:12:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:12:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18789 "" "Go-http-client/1.1"
Nov 23 10:12:42 np0005532585.localdomain ceph-mon[300199]: pgmap v708: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:43 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:12:43Z|00575|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 10:12:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:43.240 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:44 np0005532585.localdomain ceph-mon[300199]: pgmap v709: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:45.088 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:45.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:46 np0005532585.localdomain sshd[337871]: error: kex_exchange_identification: read: Connection timed out
Nov 23 10:12:46 np0005532585.localdomain sshd[337871]: banner exchange: Connection from 182.61.18.212 port 33222: Connection timed out
Nov 23 10:12:46 np0005532585.localdomain ceph-mon[300199]: pgmap v710: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:49 np0005532585.localdomain ceph-mon[300199]: pgmap v711: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:50.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:50 np0005532585.localdomain ceph-mon[300199]: pgmap v712: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:52 np0005532585.localdomain ceph-mon[300199]: pgmap v713: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:12:52 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:12:53 np0005532585.localdomain podman[337975]: 2025-11-23 10:12:53.014475026 +0000 UTC m=+0.071019113 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:12:53 np0005532585.localdomain podman[337975]: 2025-11-23 10:12:53.050248605 +0000 UTC m=+0.106792712 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:12:53 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:12:53 np0005532585.localdomain systemd[1]: tmp-crun.0kBbEF.mount: Deactivated successfully.
Nov 23 10:12:53 np0005532585.localdomain podman[337976]: 2025-11-23 10:12:53.114571951 +0000 UTC m=+0.157561162 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 10:12:53 np0005532585.localdomain podman[337976]: 2025-11-23 10:12:53.153293111 +0000 UTC m=+0.196282312 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:12:53 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:12:53 np0005532585.localdomain sudo[338017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:12:53 np0005532585.localdomain sudo[338017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:12:53 np0005532585.localdomain sudo[338017]: pam_unix(sudo:session): session closed for user root
Nov 23 10:12:53 np0005532585.localdomain sudo[338035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:12:53 np0005532585.localdomain sudo[338035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:12:54 np0005532585.localdomain sudo[338035]: pam_unix(sudo:session): session closed for user root
Nov 23 10:12:54 np0005532585.localdomain sudo[338086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:12:54 np0005532585.localdomain sudo[338086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:12:54 np0005532585.localdomain sudo[338086]: pam_unix(sudo:session): session closed for user root
Nov 23 10:12:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:54 np0005532585.localdomain ceph-mon[300199]: pgmap v714: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:12:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:12:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:12:54 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:12:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:55.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:12:55.097 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:12:56 np0005532585.localdomain ceph-mon[300199]: pgmap v715: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:58 np0005532585.localdomain ceph-mon[300199]: pgmap v716: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:12:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:12:59 np0005532585.localdomain sshd[338104]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:12:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:12:59 np0005532585.localdomain sshd[338104]: Invalid user testssh from 107.172.15.139 port 58304
Nov 23 10:12:59 np0005532585.localdomain sshd[338104]: Received disconnect from 107.172.15.139 port 58304:11: Bye Bye [preauth]
Nov 23 10:12:59 np0005532585.localdomain sshd[338104]: Disconnected from invalid user testssh 107.172.15.139 port 58304 [preauth]
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:12:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:12:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:13:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:00.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4154043123' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4154043123' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: pgmap v717: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4154043123' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:13:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/4154043123' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:13:02 np0005532585.localdomain ceph-mon[300199]: pgmap v718: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:13:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:13:03 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:13:04 np0005532585.localdomain podman[338107]: 2025-11-23 10:13:04.03557689 +0000 UTC m=+0.088052617 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:13:04 np0005532585.localdomain podman[338107]: 2025-11-23 10:13:04.044255096 +0000 UTC m=+0.096730763 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 10:13:04 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:13:04 np0005532585.localdomain podman[338108]: 2025-11-23 10:13:04.094302574 +0000 UTC m=+0.143939844 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Nov 23 10:13:04 np0005532585.localdomain podman[338108]: 2025-11-23 10:13:04.134310083 +0000 UTC m=+0.183947363 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9)
Nov 23 10:13:04 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:13:04 np0005532585.localdomain podman[338106]: 2025-11-23 10:13:04.189077237 +0000 UTC m=+0.241346108 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:13:04 np0005532585.localdomain podman[338106]: 2025-11-23 10:13:04.266420062 +0000 UTC m=+0.318688973 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:13:04 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:13:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:04 np0005532585.localdomain ceph-mon[300199]: pgmap v719: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:05.099 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:13:06 np0005532585.localdomain ceph-mon[300199]: pgmap v720: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:08 np0005532585.localdomain ceph-mon[300199]: pgmap v721: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:13:09.309 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:13:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:13:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:13:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:13:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:13:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:13:09 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:13:10 np0005532585.localdomain podman[338165]: 2025-11-23 10:13:10.018539591 +0000 UTC m=+0.075712477 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 10:13:10 np0005532585.localdomain podman[338165]: 2025-11-23 10:13:10.031282472 +0000 UTC m=+0.088455328 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 10:13:10 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:13:10 np0005532585.localdomain podman[338166]: 2025-11-23 10:13:10.080138284 +0000 UTC m=+0.128962704 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 10:13:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:10.101 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:13:10 np0005532585.localdomain podman[338166]: 2025-11-23 10:13:10.115875952 +0000 UTC m=+0.164700402 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:13:10 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:13:10 np0005532585.localdomain sshd[338208]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.811 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.813 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.816 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f2852cb-16a2-4951-a51a-388c8dbdbed8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.813249', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b0bd7e-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'e2e72c071e9b4478ec86af9c9ca662c983fd44e363bd783580ec8e6816b3b2ae'}]}, 'timestamp': '2025-11-23 10:13:10.817829', '_unique_id': '9b8727b8e87e49678ec2aa1ae19cb03f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceph-mon[300199]: pgmap v722: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc32106a-d93c-4c09-9ad0-ee96cd32d224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.820930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b5116c-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '1493a77b848df57fd843a6bac2e6d212452f1987ec11df29158b61d69045ece6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.820930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b525bc-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': 'f0cebeede2ee626fdcc1a574f85bdfbb64ee7ca719e3899462e5f050b29067c1'}]}, 'timestamp': '2025-11-23 10:13:10.846621', '_unique_id': '05ca018c295544a2bc057e373ac91aff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.849 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e41728c8-9cd7-417b-aa7a-7b8b19cc09c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.849194', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b59c2c-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '8d325b9b0ab971ec3d80cb4b3363bc71330a840a560303b881bb057631525ce8'}]}, 'timestamp': '2025-11-23 10:13:10.849682', '_unique_id': '67f23e9e503e4148bdb6f75d41f4fe29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ccee0b8-f5a0-41d4-9967-b6dd6171f135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.852039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b60d38-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '3dd6d96170460b0f4d6a02d90237980afee87d8d76b47744f5e31e63b941232d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.852039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b61dd2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '40f0e26e7bfb568d265029862c0dfc10818c8b467e04f88a2a20617da42cf94d'}]}, 'timestamp': '2025-11-23 10:13:10.852998', '_unique_id': '477a0f518fee4c738d563b2fb6567c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee7c670f-5a93-44f6-9b8d-0334ed526499', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.855341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b80318-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': '3cab1ff58f28f8be21fcd50544753e765b102856d3d49f56891eca5d2f17127e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.855341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b813b2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': 'fba3045610c52c5749ea042203d29a02ed8a6a906defd19521cf5c258ffde47a'}]}, 'timestamp': '2025-11-23 10:13:10.865808', '_unique_id': '1b513730c83f45ee9ddddc520cbc19d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9364228-93d6-4fc9-a564-9fd016165aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.868060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b87c3a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': 'd37f554c39c43ff75c32e13eea755df7841af36bb9e59e5df7f9f7ce84455b65'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.868060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b88c2a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': '2a8caafef0d2b81e1364a0468e06a9e796981c36a58df5550e6a0c3f26407e32'}]}, 'timestamp': '2025-11-23 10:13:10.868919', '_unique_id': '0ec87e0bd42048458c12d31215a3f427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e8666b5-fe6e-4c3d-83ef-9d8cbee507ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.871521', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b903b2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'd785bbd63d44c369d7b3aef3185964ce799dfe3188802ef222e7b9ec523f338c'}]}, 'timestamp': '2025-11-23 10:13:10.872027', '_unique_id': '4f99a06ecb4949828e309d32e371bdcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49e81172-dc73-4e5e-ada8-40952647de35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.874116', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b968c0-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '2299edc91546bcd214850eddd47e6e1f6e7292116a0d2bd178eedce6883e6698'}]}, 'timestamp': '2025-11-23 10:13:10.874591', '_unique_id': '7570ecea5936404f945148994f20a86d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d70222e-84e6-4e45-9153-b331d7337bfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.876739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b9d026-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '56ece8da771856f5ef37f9e5792f38df3230953be0e986b81a1866d143112f69'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.876739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b9e066-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '0b02d004e5933e1f475c222f2f058aa3a9c7a6c4f7fca1d32cbad96d581ad457'}]}, 'timestamp': '2025-11-23 10:13:10.877602', '_unique_id': 'efbfa587e5d5414c9c886818432886b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc1c1913-41d1-472e-af05-f9bed1eade17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.879923', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03ba4c40-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'ab84943e206a5b2ab59da5c952146ec13a276acffaf2db05b0bf48b3c202b487'}]}, 'timestamp': '2025-11-23 10:13:10.880392', '_unique_id': 'a2c76ab8cd634675bc515b50784b1c03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b4bcde5-6da5-4099-b08c-be3fd8797eac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:13:10.882562', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '03bcaaf8-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.072966644, 'message_signature': 'd80fe97e419d522e2c7feed5f6dfce2e1ad4475e2d297768c15dd6b7fdb0a959'}]}, 'timestamp': '2025-11-23 10:13:10.895951', '_unique_id': '304ec9640bf44eaf87bfdd6176f0a792'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.898 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 20430000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e217b216-b5ec-4cee-92b8-f208cec992bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20430000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:13:10.898065', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '03bd101a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.072966644, 'message_signature': '068863c6096cbcac171ad4c469d3b3a8759fbaf820a58061ffc9eabd6ff2796c'}]}, 'timestamp': '2025-11-23 10:13:10.898531', '_unique_id': '91592f9c1ec542b49d30b566a284aaab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.901 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e65cae1-316f-4b26-93f2-e2b596a82430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.900618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03bd73b6-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': 'c99bdf5105ad1426c94fac4543fa55e83cb9e002dde53eb9946601e2b83b998e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.900618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03bd855e-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '1daf439033514dce3e82f6e3844c2906ab106776f84a78c7dcbcb3c68f468f86'}]}, 'timestamp': '2025-11-23 10:13:10.901484', '_unique_id': '27e7bfa881e1443b96fa8f634e73e913'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df9664a-109c-42bc-8105-6a7872d1b1f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.903867', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03bdf8d6-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '5a02f3ec0940ccb41f409b926df14745fc47b7b58412143abb3ce7ce236bc03d'}]}, 'timestamp': '2025-11-23 10:13:10.904479', '_unique_id': '7fff7227dd254f22aaeb847bc98867aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ce11305-f31d-4b03-89a8-e5768ec6908c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.907795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03be926e-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': 'bbe322cc50fe5a94422de4dffabadba5c3281fd092ed972702762609c9031fe2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.907795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03bea538-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': '105340986568af5c4db720a7795021c811da40f8538f67064b57e3cacf725ab4'}]}, 'timestamp': '2025-11-23 10:13:10.908872', '_unique_id': '67b1c0bb431746229e70f45a78b17d9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain sshd[338208]: Invalid user ftpuser from 207.154.194.2 port 43452
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc04e770-01da-4be0-a222-cc22ae85783d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.912023', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03bf3480-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '737df3749b523310e4e294865e515f29949c868205719f3fd2dfbef1e91388b5'}]}, 'timestamp': '2025-11-23 10:13:10.912608', '_unique_id': '751f9f3e8f574c3e85838b7392ce17e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a51f1205-411d-4d98-b8d2-4406aabbbe1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.915248', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03bfbdc4-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '94fda37ee7fe8633681ba681fd6113543dd3a632b5c8a187b2335e3e5d3ec380'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.915248', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03bfd32c-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '3fb8846b3b1ef1392379be870acb8c37b726a1c70e06f57542044990c5d273e9'}]}, 'timestamp': '2025-11-23 10:13:10.916605', '_unique_id': '33233522b5b0429383631fe0c6650432'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef6bf948-c281-4fa1-94f0-1bb314565d2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.919363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03c052a2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '2bad0a4f9b4b6325a4a87175f133be297cc6e82f17fe2c6e2adff70a6c6ffba9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.919363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03c0662a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '020fd069616bfd3897b0bc96d429690f5950996dbd1c6142cc8e58c6220a518b'}]}, 'timestamp': '2025-11-23 10:13:10.920356', '_unique_id': 'a8e7980013d440518f2844f5ee4e6a50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26ab5a97-857e-45df-9fe8-c1bce67c7c95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.923269', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03c0eafa-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'c87a907e733f76a5d520f6b52bf35b5b08ff9924108d25d31a005ce40e8effa5'}]}, 'timestamp': '2025-11-23 10:13:10.923877', '_unique_id': 'bc2b428e7668490c84b7749914cbe70b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2faf7e54-a318-4ee2-ba48-ea31c366fa68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.926356', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03c16430-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'a4ac80c3e9dadc4f045b50dc7ed24fd66f181456d7cd29d13f624cb63cd39db8'}]}, 'timestamp': '2025-11-23 10:13:10.926967', '_unique_id': '53de440ef8264a00a1572dc14d65dbe6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f3216ba-9b7d-4784-aefe-d287bfb576f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.929368', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03c1d8de-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'bb800bd3dc50a7e0a255040f46ef3e07735794e37b30c65a431f0dcb64f30189'}]}, 'timestamp': '2025-11-23 10:13:10.929952', '_unique_id': '82149093d79d4c09aad14a22bcd0d8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 10:13:10 np0005532585.localdomain ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging 
Nov 23 10:13:11 np0005532585.localdomain sshd[338208]: Received disconnect from 207.154.194.2 port 43452:11: Bye Bye [preauth]
Nov 23 10:13:11 np0005532585.localdomain sshd[338208]: Disconnected from invalid user ftpuser 207.154.194.2 port 43452 [preauth]
Nov 23 10:13:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:13:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:13:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:13:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:13:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:13:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1"
Nov 23 10:13:12 np0005532585.localdomain sshd[336814]: fatal: Timeout before authentication for 182.61.18.212 port 54940
Nov 23 10:13:12 np0005532585.localdomain ceph-mon[300199]: pgmap v723: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:14 np0005532585.localdomain sshd[338210]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:14 np0005532585.localdomain sshd[338210]: Accepted publickey for zuul from 38.102.83.114 port 47324 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:14 np0005532585.localdomain systemd-logind[761]: New session 74 of user zuul.
Nov 23 10:13:14 np0005532585.localdomain systemd[1]: Started Session 74 of User zuul.
Nov 23 10:13:14 np0005532585.localdomain sshd[338210]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:14 np0005532585.localdomain ceph-mon[300199]: pgmap v724: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:14 np0005532585.localdomain sudo[338230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubbbodaceazlvdwkjiqkqgxalmiqigtt ; /usr/bin/python3
Nov 23 10:13:14 np0005532585.localdomain sudo[338230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:14 np0005532585.localdomain ovn_controller[154788]: 2025-11-23T10:13:14Z|00576|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 23 10:13:14 np0005532585.localdomain python3[338232]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=fa163ef9-e89a-fc5a-8bfb-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 10:13:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:15.104 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:15 np0005532585.localdomain sudo[338230]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:16 np0005532585.localdomain ceph-mon[300199]: pgmap v725: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:19 np0005532585.localdomain ceph-mon[300199]: pgmap v726: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:20.106 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:13:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:20.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:20.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:13:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:20.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:13:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:20.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:13:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:20.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:20 np0005532585.localdomain ceph-mon[300199]: pgmap v727: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:20 np0005532585.localdomain sshd[338210]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:20 np0005532585.localdomain systemd[1]: session-74.scope: Deactivated successfully.
Nov 23 10:13:20 np0005532585.localdomain systemd-logind[761]: Session 74 logged out. Waiting for processes to exit.
Nov 23 10:13:20 np0005532585.localdomain systemd-logind[761]: Removed session 74.
Nov 23 10:13:22 np0005532585.localdomain sshd[338235]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:22 np0005532585.localdomain ceph-mon[300199]: pgmap v728: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:23 np0005532585.localdomain sshd[338235]: Received disconnect from 175.126.166.172 port 44952:11: Bye Bye [preauth]
Nov 23 10:13:23 np0005532585.localdomain sshd[338235]: Disconnected from authenticating user root 175.126.166.172 port 44952 [preauth]
Nov 23 10:13:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:13:23 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:13:23 np0005532585.localdomain podman[338237]: 2025-11-23 10:13:23.573539191 +0000 UTC m=+0.068708302 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:13:23 np0005532585.localdomain podman[338237]: 2025-11-23 10:13:23.582745343 +0000 UTC m=+0.077914514 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:13:23 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:13:23 np0005532585.localdomain systemd[1]: tmp-crun.NEN8OT.mount: Deactivated successfully.
Nov 23 10:13:23 np0005532585.localdomain podman[338238]: 2025-11-23 10:13:23.650985671 +0000 UTC m=+0.143712737 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 23 10:13:23 np0005532585.localdomain podman[338238]: 2025-11-23 10:13:23.686494502 +0000 UTC m=+0.179221498 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 10:13:23 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:13:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:24 np0005532585.localdomain ceph-mon[300199]: pgmap v729: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.111 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.473 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.474 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.474 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.475 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:13:25 np0005532585.localdomain ceph-mon[300199]: mgrmap e54: np0005532584.naxwxy(active, since 18m), standbys: np0005532585.gzafiw, np0005532586.thmvqb
Nov 23 10:13:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:25.998 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:13:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:26.012 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:13:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:26.013 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:13:26 np0005532585.localdomain ceph-mon[300199]: pgmap v730: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 23 10:13:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:28.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:28.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:28.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:28 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:28.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:13:28 np0005532585.localdomain ceph-mon[300199]: pgmap v731: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.238 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.239 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:13:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:13:29 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1995420351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.657 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.755 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.755 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:13:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/159070162' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1995420351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.956 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.958 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11027MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.958 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:13:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:29.959 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:13:29 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.144 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.145 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.145 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.301 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:13:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:13:30 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/984744193' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.759 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.764 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.801 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.803 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.804 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.804 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:30.831 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:30 np0005532585.localdomain ceph-mon[300199]: pgmap v732: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 23 10:13:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/984744193' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3920762680' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:31.840 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2356149243' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:31.841 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:32 np0005532585.localdomain ceph-mon[300199]: pgmap v733: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 23 10:13:32 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1143693070' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:13:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:33.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:33 np0005532585.localdomain sshd[338320]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:33 np0005532585.localdomain sshd[338320]: Accepted publickey for zuul from 38.102.83.114 port 35868 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:33 np0005532585.localdomain systemd-logind[761]: New session 75 of user zuul.
Nov 23 10:13:33 np0005532585.localdomain systemd[1]: Started Session 75 of User zuul.
Nov 23 10:13:33 np0005532585.localdomain sshd[338320]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:33 np0005532585.localdomain sudo[338324]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Nov 23 10:13:33 np0005532585.localdomain sudo[338324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:34 np0005532585.localdomain sudo[338324]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:34 np0005532585.localdomain sshd[338323]: Received disconnect from 38.102.83.114 port 35868:11: disconnected by user
Nov 23 10:13:34 np0005532585.localdomain sshd[338323]: Disconnected from user zuul 38.102.83.114 port 35868
Nov 23 10:13:34 np0005532585.localdomain sshd[338320]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:34 np0005532585.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Nov 23 10:13:34 np0005532585.localdomain systemd-logind[761]: Session 75 logged out. Waiting for processes to exit.
Nov 23 10:13:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:13:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:13:34 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:13:34 np0005532585.localdomain systemd-logind[761]: Removed session 75.
Nov 23 10:13:34 np0005532585.localdomain ceph-mon[300199]: pgmap v734: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 23 10:13:34 np0005532585.localdomain podman[338342]: 2025-11-23 10:13:34.753010071 +0000 UTC m=+0.115839930 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 10:13:34 np0005532585.localdomain podman[338343]: 2025-11-23 10:13:34.768235008 +0000 UTC m=+0.128034124 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 10:13:34 np0005532585.localdomain podman[338342]: 2025-11-23 10:13:34.848857136 +0000 UTC m=+0.211687035 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:13:34 np0005532585.localdomain systemd[1]: tmp-crun.3NdZmd.mount: Deactivated successfully.
Nov 23 10:13:34 np0005532585.localdomain sshd[338402]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:34 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:13:34 np0005532585.localdomain podman[338349]: 2025-11-23 10:13:34.858528833 +0000 UTC m=+0.206718982 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Nov 23 10:13:35 np0005532585.localdomain sshd[338402]: Accepted publickey for zuul from 38.102.83.114 port 35884 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:35 np0005532585.localdomain podman[338349]: 2025-11-23 10:13:35.013517875 +0000 UTC m=+0.361707954 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:13:35 np0005532585.localdomain systemd-logind[761]: New session 76 of user zuul.
Nov 23 10:13:35 np0005532585.localdomain systemd[1]: Started Session 76 of User zuul.
Nov 23 10:13:35 np0005532585.localdomain sshd[338402]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:35 np0005532585.localdomain podman[338343]: 2025-11-23 10:13:35.068197675 +0000 UTC m=+0.427996801 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 10:13:35 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:13:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:35.116 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:35.119 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:35 np0005532585.localdomain sudo[338406]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Nov 23 10:13:35 np0005532585.localdomain sudo[338406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:35 np0005532585.localdomain sudo[338406]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:35 np0005532585.localdomain sshd[338405]: Received disconnect from 38.102.83.114 port 35884:11: disconnected by user
Nov 23 10:13:35 np0005532585.localdomain sshd[338405]: Disconnected from user zuul 38.102.83.114 port 35884
Nov 23 10:13:35 np0005532585.localdomain sshd[338402]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:35 np0005532585.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Nov 23 10:13:35 np0005532585.localdomain systemd-logind[761]: Session 76 logged out. Waiting for processes to exit.
Nov 23 10:13:35 np0005532585.localdomain systemd-logind[761]: Removed session 76.
Nov 23 10:13:35 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:13:35 np0005532585.localdomain sshd[338424]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:35 np0005532585.localdomain sshd[338424]: Accepted publickey for zuul from 38.102.83.114 port 35888 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:35 np0005532585.localdomain systemd-logind[761]: New session 77 of user zuul.
Nov 23 10:13:35 np0005532585.localdomain systemd[1]: Started Session 77 of User zuul.
Nov 23 10:13:35 np0005532585.localdomain sshd[338424]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:35 np0005532585.localdomain sudo[338428]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Nov 23 10:13:35 np0005532585.localdomain sudo[338428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:35 np0005532585.localdomain sudo[338428]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:35 np0005532585.localdomain sshd[338427]: Received disconnect from 38.102.83.114 port 35888:11: disconnected by user
Nov 23 10:13:35 np0005532585.localdomain sshd[338427]: Disconnected from user zuul 38.102.83.114 port 35888
Nov 23 10:13:35 np0005532585.localdomain sshd[338424]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:35 np0005532585.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Nov 23 10:13:35 np0005532585.localdomain systemd-logind[761]: Session 77 logged out. Waiting for processes to exit.
Nov 23 10:13:35 np0005532585.localdomain systemd-logind[761]: Removed session 77.
Nov 23 10:13:36 np0005532585.localdomain sshd[338446]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:36 np0005532585.localdomain sshd[338446]: Accepted publickey for zuul from 38.102.83.114 port 36744 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:36 np0005532585.localdomain systemd-logind[761]: New session 78 of user zuul.
Nov 23 10:13:36 np0005532585.localdomain systemd[1]: Started Session 78 of User zuul.
Nov 23 10:13:36 np0005532585.localdomain sshd[338446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:36.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:36.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 23 10:13:36 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:36.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 23 10:13:36 np0005532585.localdomain sudo[338450]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Nov 23 10:13:36 np0005532585.localdomain sudo[338450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:36 np0005532585.localdomain sudo[338450]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:36 np0005532585.localdomain sshd[338449]: Received disconnect from 38.102.83.114 port 36744:11: disconnected by user
Nov 23 10:13:36 np0005532585.localdomain sshd[338449]: Disconnected from user zuul 38.102.83.114 port 36744
Nov 23 10:13:36 np0005532585.localdomain sshd[338446]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:36 np0005532585.localdomain systemd-logind[761]: Session 78 logged out. Waiting for processes to exit.
Nov 23 10:13:36 np0005532585.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Nov 23 10:13:36 np0005532585.localdomain systemd-logind[761]: Removed session 78.
Nov 23 10:13:36 np0005532585.localdomain sshd[338468]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:36 np0005532585.localdomain sshd[338468]: Accepted publickey for zuul from 38.102.83.114 port 36752 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:36 np0005532585.localdomain systemd-logind[761]: New session 79 of user zuul.
Nov 23 10:13:36 np0005532585.localdomain systemd[1]: Started Session 79 of User zuul.
Nov 23 10:13:36 np0005532585.localdomain sshd[338468]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:36 np0005532585.localdomain ceph-mon[300199]: pgmap v735: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 23 10:13:36 np0005532585.localdomain sudo[338472]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Nov 23 10:13:36 np0005532585.localdomain sudo[338472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:36 np0005532585.localdomain sudo[338472]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:36 np0005532585.localdomain sshd[338471]: Received disconnect from 38.102.83.114 port 36752:11: disconnected by user
Nov 23 10:13:36 np0005532585.localdomain sshd[338471]: Disconnected from user zuul 38.102.83.114 port 36752
Nov 23 10:13:36 np0005532585.localdomain sshd[338468]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:36 np0005532585.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Nov 23 10:13:36 np0005532585.localdomain systemd-logind[761]: Session 79 logged out. Waiting for processes to exit.
Nov 23 10:13:36 np0005532585.localdomain systemd-logind[761]: Removed session 79.
Nov 23 10:13:37 np0005532585.localdomain sshd[338491]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:37 np0005532585.localdomain sshd[338491]: Accepted publickey for zuul from 38.102.83.114 port 36760 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:37 np0005532585.localdomain systemd-logind[761]: New session 80 of user zuul.
Nov 23 10:13:37 np0005532585.localdomain systemd[1]: Started Session 80 of User zuul.
Nov 23 10:13:37 np0005532585.localdomain sshd[338491]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:37 np0005532585.localdomain sudo[338495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Nov 23 10:13:37 np0005532585.localdomain sudo[338495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:37 np0005532585.localdomain sudo[338495]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:37 np0005532585.localdomain sshd[338494]: Received disconnect from 38.102.83.114 port 36760:11: disconnected by user
Nov 23 10:13:37 np0005532585.localdomain sshd[338494]: Disconnected from user zuul 38.102.83.114 port 36760
Nov 23 10:13:37 np0005532585.localdomain sshd[338491]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:37 np0005532585.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Nov 23 10:13:37 np0005532585.localdomain systemd-logind[761]: Session 80 logged out. Waiting for processes to exit.
Nov 23 10:13:37 np0005532585.localdomain systemd-logind[761]: Removed session 80.
Nov 23 10:13:37 np0005532585.localdomain sshd[338513]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:37 np0005532585.localdomain sshd[338513]: Accepted publickey for zuul from 38.102.83.114 port 36766 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:37 np0005532585.localdomain systemd-logind[761]: New session 81 of user zuul.
Nov 23 10:13:37 np0005532585.localdomain systemd[1]: Started Session 81 of User zuul.
Nov 23 10:13:37 np0005532585.localdomain sshd[338513]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:38 np0005532585.localdomain sudo[338517]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Nov 23 10:13:38 np0005532585.localdomain sudo[338517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:38 np0005532585.localdomain sudo[338517]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:38 np0005532585.localdomain sshd[338516]: Received disconnect from 38.102.83.114 port 36766:11: disconnected by user
Nov 23 10:13:38 np0005532585.localdomain sshd[338516]: Disconnected from user zuul 38.102.83.114 port 36766
Nov 23 10:13:38 np0005532585.localdomain sshd[338513]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:38 np0005532585.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Nov 23 10:13:38 np0005532585.localdomain systemd-logind[761]: Session 81 logged out. Waiting for processes to exit.
Nov 23 10:13:38 np0005532585.localdomain systemd-logind[761]: Removed session 81.
Nov 23 10:13:38 np0005532585.localdomain sshd[338535]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:38 np0005532585.localdomain sshd[338535]: Accepted publickey for zuul from 38.102.83.114 port 36780 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:38 np0005532585.localdomain systemd-logind[761]: New session 82 of user zuul.
Nov 23 10:13:38 np0005532585.localdomain systemd[1]: Started Session 82 of User zuul.
Nov 23 10:13:38 np0005532585.localdomain sshd[338535]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:38 np0005532585.localdomain sudo[338539]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Nov 23 10:13:38 np0005532585.localdomain sudo[338539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:38 np0005532585.localdomain sudo[338539]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:38 np0005532585.localdomain sshd[338538]: Received disconnect from 38.102.83.114 port 36780:11: disconnected by user
Nov 23 10:13:38 np0005532585.localdomain sshd[338538]: Disconnected from user zuul 38.102.83.114 port 36780
Nov 23 10:13:38 np0005532585.localdomain sshd[338535]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:38 np0005532585.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Nov 23 10:13:38 np0005532585.localdomain systemd-logind[761]: Session 82 logged out. Waiting for processes to exit.
Nov 23 10:13:38 np0005532585.localdomain systemd-logind[761]: Removed session 82.
Nov 23 10:13:38 np0005532585.localdomain ceph-mon[300199]: pgmap v736: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 23 10:13:39 np0005532585.localdomain sshd[338557]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:13:39 np0005532585.localdomain sshd[338557]: Accepted publickey for zuul from 38.102.83.114 port 36786 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:13:39 np0005532585.localdomain systemd-logind[761]: New session 83 of user zuul.
Nov 23 10:13:39 np0005532585.localdomain systemd[1]: Started Session 83 of User zuul.
Nov 23 10:13:39 np0005532585.localdomain sshd[338557]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:13:39 np0005532585.localdomain sudo[338561]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Nov 23 10:13:39 np0005532585.localdomain sudo[338561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:13:39 np0005532585.localdomain sudo[338561]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:39 np0005532585.localdomain sshd[338560]: Received disconnect from 38.102.83.114 port 36786:11: disconnected by user
Nov 23 10:13:39 np0005532585.localdomain sshd[338560]: Disconnected from user zuul 38.102.83.114 port 36786
Nov 23 10:13:39 np0005532585.localdomain sshd[338557]: pam_unix(sshd:session): session closed for user zuul
Nov 23 10:13:39 np0005532585.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Nov 23 10:13:39 np0005532585.localdomain systemd-logind[761]: Session 83 logged out. Waiting for processes to exit.
Nov 23 10:13:39 np0005532585.localdomain systemd-logind[761]: Removed session 83.
Nov 23 10:13:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:40.119 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:40.123 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:40 np0005532585.localdomain ceph-mon[300199]: pgmap v737: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 23 10:13:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:13:40 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:13:41 np0005532585.localdomain podman[338579]: 2025-11-23 10:13:41.042389468 +0000 UTC m=+0.085943902 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 10:13:41 np0005532585.localdomain podman[338579]: 2025-11-23 10:13:41.057673057 +0000 UTC m=+0.101227481 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible)
Nov 23 10:13:41 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:13:41 np0005532585.localdomain podman[338580]: 2025-11-23 10:13:41.140951647 +0000 UTC m=+0.184456609 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:13:41 np0005532585.localdomain podman[338580]: 2025-11-23 10:13:41.180408489 +0000 UTC m=+0.223913421 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:13:41 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:13:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:13:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:13:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:13:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:13:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:13:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1"
Nov 23 10:13:42 np0005532585.localdomain ceph-mon[300199]: pgmap v738: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:43.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:43 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:43.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 23 10:13:44 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:44.224 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:13:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:44 np0005532585.localdomain ceph-mon[300199]: pgmap v739: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:45.121 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:46 np0005532585.localdomain ceph-mon[300199]: pgmap v740: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:48 np0005532585.localdomain ceph-mon[300199]: pgmap v741: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:50.125 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:13:50 np0005532585.localdomain ceph-mon[300199]: pgmap v742: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:52 np0005532585.localdomain ceph-mon[300199]: pgmap v743: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:13:53 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:13:54 np0005532585.localdomain systemd[1]: tmp-crun.pHfmfe.mount: Deactivated successfully.
Nov 23 10:13:54 np0005532585.localdomain podman[338624]: 2025-11-23 10:13:54.061681848 +0000 UTC m=+0.103423068 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 10:13:54 np0005532585.localdomain podman[338624]: 2025-11-23 10:13:54.069523669 +0000 UTC m=+0.111264859 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 23 10:13:54 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:13:54 np0005532585.localdomain podman[338623]: 2025-11-23 10:13:54.033609886 +0000 UTC m=+0.083945000 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 23 10:13:54 np0005532585.localdomain podman[338623]: 2025-11-23 10:13:54.117285537 +0000 UTC m=+0.167620651 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:13:54 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:13:54 np0005532585.localdomain sudo[338664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:13:54 np0005532585.localdomain sudo[338664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:13:54 np0005532585.localdomain sudo[338664]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:54 np0005532585.localdomain sudo[338682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:13:54 np0005532585.localdomain sudo[338682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:13:54 np0005532585.localdomain ceph-mon[300199]: pgmap v744: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:13:55.128 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:13:55 np0005532585.localdomain sudo[338682]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:55 np0005532585.localdomain sudo[338731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:13:55 np0005532585.localdomain sudo[338731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:13:55 np0005532585.localdomain sudo[338731]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:55 np0005532585.localdomain sudo[338749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 46550e70-79cb-5f55-bf6d-1204b97e083b -- inventory --format=json-pretty --filter-for-batch
Nov 23 10:13:55 np0005532585.localdomain sudo[338749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 2025-11-23 10:13:56.202343163 +0000 UTC m=+0.086228760 container create 6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_blackburn, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7)
Nov 23 10:13:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad.scope.
Nov 23 10:13:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 2025-11-23 10:13:56.162489038 +0000 UTC m=+0.046374705 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 2025-11-23 10:13:56.277075509 +0000 UTC m=+0.160961106 container init 6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_blackburn, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vendor=Red Hat, Inc., release=553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64)
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 2025-11-23 10:13:56.286673834 +0000 UTC m=+0.170559431 container start 6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_blackburn, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True)
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 2025-11-23 10:13:56.287329944 +0000 UTC m=+0.171215591 container attach 6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_blackburn, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=553)
Nov 23 10:13:56 np0005532585.localdomain nifty_blackburn[338822]: 167 167
Nov 23 10:13:56 np0005532585.localdomain systemd[1]: libpod-6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad.scope: Deactivated successfully.
Nov 23 10:13:56 np0005532585.localdomain podman[338807]: 2025-11-23 10:13:56.29469459 +0000 UTC m=+0.178580197 container died 6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_blackburn, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=553, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git)
Nov 23 10:13:56 np0005532585.localdomain podman[338827]: 2025-11-23 10:13:56.389317288 +0000 UTC m=+0.083236938 container remove 6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_blackburn, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 23 10:13:56 np0005532585.localdomain systemd[1]: libpod-conmon-6b2f1d210c71ee8f0f611d5e9861ab890e4f6fa6cf826545c7a3265425bb60ad.scope: Deactivated successfully.
Nov 23 10:13:56 np0005532585.localdomain podman[338849]: 
Nov 23 10:13:56 np0005532585.localdomain podman[338849]: 2025-11-23 10:13:56.632749847 +0000 UTC m=+0.082217966 container create 1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_bell, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 10:13:56 np0005532585.localdomain systemd[1]: Started libpod-conmon-1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d.scope.
Nov 23 10:13:56 np0005532585.localdomain systemd[1]: Started libcrun container.
Nov 23 10:13:56 np0005532585.localdomain podman[338849]: 2025-11-23 10:13:56.597699811 +0000 UTC m=+0.047167940 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 10:13:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afaa93d4df97ce60fc1a8c900b187f6a1392daed43c968916bc583aa713ef43a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 10:13:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afaa93d4df97ce60fc1a8c900b187f6a1392daed43c968916bc583aa713ef43a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 10:13:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afaa93d4df97ce60fc1a8c900b187f6a1392daed43c968916bc583aa713ef43a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 10:13:56 np0005532585.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afaa93d4df97ce60fc1a8c900b187f6a1392daed43c968916bc583aa713ef43a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 10:13:56 np0005532585.localdomain podman[338849]: 2025-11-23 10:13:56.707768033 +0000 UTC m=+0.157236142 container init 1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_bell, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, distribution-scope=public)
Nov 23 10:13:56 np0005532585.localdomain podman[338849]: 2025-11-23 10:13:56.719912075 +0000 UTC m=+0.169380194 container start 1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_bell, ceph=True, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, version=7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 10:13:56 np0005532585.localdomain podman[338849]: 2025-11-23 10:13:56.723870338 +0000 UTC m=+0.173338437 container attach 1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_bell, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 10:13:56 np0005532585.localdomain ceph-mon[300199]: pgmap v745: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-b31cd910a15ceadc9414b30ba8e05c55ef335bb7e50f81da68d43a31a3cf48c6-merged.mount: Deactivated successfully.
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]: [
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:     {
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "available": false,
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "ceph_device": false,
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "lsm_data": {},
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "lvs": [],
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "path": "/dev/sr0",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "rejected_reasons": [
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "Has a FileSystem",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "Insufficient space (<5GB)"
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         ],
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         "sys_api": {
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "actuators": null,
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "device_nodes": "sr0",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "human_readable_size": "482.00 KB",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "id_bus": "ata",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "model": "QEMU DVD-ROM",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "nr_requests": "2",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "partitions": {},
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "path": "/dev/sr0",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "removable": "1",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "rev": "2.5+",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "ro": "0",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "rotational": "1",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "sas_address": "",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "sas_device_handle": "",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "scheduler_mode": "mq-deadline",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "sectors": 0,
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "sectorsize": "2048",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "size": 493568.0,
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "support_discard": "0",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "type": "disk",
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:             "vendor": "QEMU"
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:         }
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]:     }
Nov 23 10:13:57 np0005532585.localdomain vibrant_bell[338864]: ]
Nov 23 10:13:57 np0005532585.localdomain systemd[1]: libpod-1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d.scope: Deactivated successfully.
Nov 23 10:13:57 np0005532585.localdomain systemd[1]: libpod-1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d.scope: Consumed 1.011s CPU time.
Nov 23 10:13:57 np0005532585.localdomain podman[338849]: 2025-11-23 10:13:57.749352617 +0000 UTC m=+1.198820696 container died 1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_bell, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Nov 23 10:13:57 np0005532585.localdomain systemd[1]: tmp-crun.GydDoc.mount: Deactivated successfully.
Nov 23 10:13:57 np0005532585.localdomain systemd[1]: var-lib-containers-storage-overlay-afaa93d4df97ce60fc1a8c900b187f6a1392daed43c968916bc583aa713ef43a-merged.mount: Deactivated successfully.
Nov 23 10:13:57 np0005532585.localdomain podman[340906]: 2025-11-23 10:13:57.850862605 +0000 UTC m=+0.088807160 container remove 1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_bell, description=Red Hat Ceph Storage 7, ceph=True, release=553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 10:13:57 np0005532585.localdomain systemd[1]: libpod-conmon-1bd615f73e106162624f25d3dc2983b3212f77a223ef4b38da881ca8683daa7d.scope: Deactivated successfully.
Nov 23 10:13:57 np0005532585.localdomain sudo[338749]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:58 np0005532585.localdomain sudo[340920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:13:58 np0005532585.localdomain sudo[340920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:13:58 np0005532585.localdomain sudo[340920]: pam_unix(sudo:session): session closed for user root
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: pgmap v746: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:58 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:13:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:13:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:13:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:13:59 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:14:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:00.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:00 np0005532585.localdomain ceph-mon[300199]: pgmap v747: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2425581679' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:14:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/2425581679' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:14:02 np0005532585.localdomain ceph-mon[300199]: pgmap v748: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:04 np0005532585.localdomain ceph-mon[300199]: pgmap v749: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:05.133 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:14:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:14:05 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:14:06 np0005532585.localdomain systemd[1]: tmp-crun.QOFndq.mount: Deactivated successfully.
Nov 23 10:14:06 np0005532585.localdomain podman[340939]: 2025-11-23 10:14:06.046117161 +0000 UTC m=+0.091011677 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 23 10:14:06 np0005532585.localdomain podman[340940]: 2025-11-23 10:14:06.09748933 +0000 UTC m=+0.140473497 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Nov 23 10:14:06 np0005532585.localdomain podman[340940]: 2025-11-23 10:14:06.1394664 +0000 UTC m=+0.182450567 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public)
Nov 23 10:14:06 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:14:06 np0005532585.localdomain podman[340938]: 2025-11-23 10:14:06.161783526 +0000 UTC m=+0.202969089 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller)
Nov 23 10:14:06 np0005532585.localdomain podman[340939]: 2025-11-23 10:14:06.181326016 +0000 UTC m=+0.226220602 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 10:14:06 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:14:06 np0005532585.localdomain podman[340938]: 2025-11-23 10:14:06.221679555 +0000 UTC m=+0.262865088 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 10:14:06 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:14:06 np0005532585.localdomain ceph-mon[300199]: pgmap v750: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:08 np0005532585.localdomain ceph-mon[300199]: pgmap v751: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:08.921 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:08.943 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 23 10:14:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:08.944 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:14:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:08.944 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:14:08 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:08.974 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:14:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:14:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:14:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:14:09.311 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:14:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:14:09.312 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:14:09 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:10.136 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:10.137 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:10.137 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:10.138 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:10.138 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:10 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:10.141 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:10 np0005532585.localdomain ceph-mon[300199]: pgmap v752: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:11 np0005532585.localdomain podman[240668]: time="2025-11-23T10:14:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:14:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:14:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:14:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:14:11 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:14:11 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:14:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18793 "" "Go-http-client/1.1"
Nov 23 10:14:12 np0005532585.localdomain systemd[1]: tmp-crun.JZs24w.mount: Deactivated successfully.
Nov 23 10:14:12 np0005532585.localdomain podman[341001]: 2025-11-23 10:14:12.019248121 +0000 UTC m=+0.092792353 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 10:14:12 np0005532585.localdomain podman[341000]: 2025-11-23 10:14:12.06185789 +0000 UTC m=+0.135498065 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 10:14:12 np0005532585.localdomain podman[341000]: 2025-11-23 10:14:12.077381586 +0000 UTC m=+0.151021771 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd)
Nov 23 10:14:12 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:14:12 np0005532585.localdomain podman[341001]: 2025-11-23 10:14:12.132029026 +0000 UTC m=+0.205573238 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:14:12 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:14:12 np0005532585.localdomain ceph-mon[300199]: pgmap v753: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:14 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:14 np0005532585.localdomain ceph-mon[300199]: pgmap v754: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:15 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:15.141 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:15 np0005532585.localdomain sshd[341042]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:14:16 np0005532585.localdomain sshd[341042]: Received disconnect from 207.154.194.2 port 45792:11: Bye Bye [preauth]
Nov 23 10:14:16 np0005532585.localdomain sshd[341042]: Disconnected from authenticating user root 207.154.194.2 port 45792 [preauth]
Nov 23 10:14:16 np0005532585.localdomain ceph-mon[300199]: pgmap v755: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:18 np0005532585.localdomain ceph-mon[300199]: pgmap v756: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:19 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:20.142 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:20.144 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:20.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:20.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:20.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:20 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:20.156 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:20 np0005532585.localdomain ceph-mon[300199]: pgmap v757: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:21 np0005532585.localdomain sshd[341044]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:14:21 np0005532585.localdomain sshd[341044]: Received disconnect from 107.172.15.139 port 60734:11: Bye Bye [preauth]
Nov 23 10:14:21 np0005532585.localdomain sshd[341044]: Disconnected from authenticating user root 107.172.15.139 port 60734 [preauth]
Nov 23 10:14:22 np0005532585.localdomain ceph-mon[300199]: pgmap v758: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:24 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:24 np0005532585.localdomain ceph-mon[300199]: pgmap v759: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:14:24 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:14:25 np0005532585.localdomain podman[341046]: 2025-11-23 10:14:25.02662104 +0000 UTC m=+0.075669776 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 10:14:25 np0005532585.localdomain podman[341046]: 2025-11-23 10:14:25.062291226 +0000 UTC m=+0.111339972 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 23 10:14:25 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:14:25 np0005532585.localdomain podman[341047]: 2025-11-23 10:14:25.137188507 +0000 UTC m=+0.182203550 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 10:14:25 np0005532585.localdomain podman[341047]: 2025-11-23 10:14:25.149187316 +0000 UTC m=+0.194202329 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 23 10:14:25 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:25.157 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:25 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.237 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.237 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.238 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 10:14:26 np0005532585.localdomain ceph-mon[300199]: pgmap v760: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.897 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.897 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.897 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 23 10:14:26 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:26.898 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 10:14:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:27.395 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 10:14:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:27.409 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 10:14:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:27.410 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 23 10:14:27 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:27.411 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:28 np0005532585.localdomain ceph-mon[300199]: pgmap v761: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:29 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:29.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:29 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:29 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2604027664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:14:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:14:29 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:14:30 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:30 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:14:30 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:14:30 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:30 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:14:30 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.159 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.160 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.160 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.160 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.161 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.237 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:14:30 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:14:30 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1161150632' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.653 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.715 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.716 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.896 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.897 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11037MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.897 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.897 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:14:30 np0005532585.localdomain ceph-mon[300199]: pgmap v762: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3265261699' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:30 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1161150632' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.964 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.965 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.965 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 10:14:30 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:30.989 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.007 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.008 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.019 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.043 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.089 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 10:14:31 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 10:14:31 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/213700595' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.539 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.545 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.559 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.561 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 10:14:31 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:31.562 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 10:14:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/213700595' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:31 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/550863230' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:32 np0005532585.localdomain ceph-mon[300199]: pgmap v763: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:32 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2052878976' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 10:14:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:33.558 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:33 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:33.559 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:34 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:34 np0005532585.localdomain ceph-mon[300199]: pgmap v764: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.181 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5019 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.184 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.185 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.188 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:35 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:35.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 10:14:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:14:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:14:36 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:14:36 np0005532585.localdomain ceph-mon[300199]: pgmap v765: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:37 np0005532585.localdomain podman[341133]: 2025-11-23 10:14:37.029101618 +0000 UTC m=+0.083945800 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 23 10:14:37 np0005532585.localdomain systemd[1]: tmp-crun.OHwxec.mount: Deactivated successfully.
Nov 23 10:14:37 np0005532585.localdomain podman[341135]: 2025-11-23 10:14:37.129819723 +0000 UTC m=+0.178133024 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 23 10:14:37 np0005532585.localdomain podman[341133]: 2025-11-23 10:14:37.135359333 +0000 UTC m=+0.190203495 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 10:14:37 np0005532585.localdomain podman[341135]: 2025-11-23 10:14:37.145307609 +0000 UTC m=+0.193620980 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Nov 23 10:14:37 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:14:37 np0005532585.localdomain podman[341134]: 2025-11-23 10:14:37.112517542 +0000 UTC m=+0.163794974 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 10:14:37 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:14:37 np0005532585.localdomain podman[341134]: 2025-11-23 10:14:37.194299304 +0000 UTC m=+0.245576806 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 10:14:37 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:14:39 np0005532585.localdomain ceph-mon[300199]: pgmap v766: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:39 np0005532585.localdomain sshd[341191]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:14:39 np0005532585.localdomain sshd[341191]: Accepted publickey for zuul from 192.168.122.10 port 41516 ssh2: RSA SHA256:nmzS+EtMlwQrsnciNxMD2sy0vA/srlgaJuLM+rQgnCY
Nov 23 10:14:39 np0005532585.localdomain systemd-logind[761]: New session 84 of user zuul.
Nov 23 10:14:39 np0005532585.localdomain systemd[1]: Started Session 84 of User zuul.
Nov 23 10:14:39 np0005532585.localdomain sshd[341191]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 23 10:14:39 np0005532585.localdomain sudo[341195]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Nov 23 10:14:39 np0005532585.localdomain sudo[341195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 23 10:14:39 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:40.189 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:40.191 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:40.192 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:40.192 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:40.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:40 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:40.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:41 np0005532585.localdomain ceph-mon[300199]: pgmap v767: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:41 np0005532585.localdomain podman[240668]: time="2025-11-23T10:14:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 10:14:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:14:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 10:14:41 np0005532585.localdomain podman[240668]: @ - - [23/Nov/2025:10:14:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1"
Nov 23 10:14:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 10:14:42 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 10:14:43 np0005532585.localdomain podman[341394]: 2025-11-23 10:14:43.040049641 +0000 UTC m=+0.086201340 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: pgmap v768: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: from='client.49416 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: from='client.58945 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: from='client.68921 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: from='client.49422 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1010526671' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 23 10:14:43 np0005532585.localdomain podman[341393]: 2025-11-23 10:14:43.095377541 +0000 UTC m=+0.141837830 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 10:14:43 np0005532585.localdomain podman[341393]: 2025-11-23 10:14:43.105810771 +0000 UTC m=+0.152271060 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 10:14:43 np0005532585.localdomain podman[341394]: 2025-11-23 10:14:43.121005238 +0000 UTC m=+0.167156977 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 23 10:14:43 np0005532585.localdomain systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 10:14:43 np0005532585.localdomain systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "status"} v 0)
Nov 23 10:14:43 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3168009464' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 23 10:14:44 np0005532585.localdomain ceph-mon[300199]: from='client.58951 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:44 np0005532585.localdomain ceph-mon[300199]: from='client.68927 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3168009464' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 23 10:14:44 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3351056879' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 23 10:14:44 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:45 np0005532585.localdomain ceph-mon[300199]: pgmap v769: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:45.210 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:45.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:45.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:45.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:45.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:45 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:45.245 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:45 np0005532585.localdomain ovs-vsctl[341484]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 23 10:14:46 np0005532585.localdomain virtqemud[203731]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 23 10:14:46 np0005532585.localdomain virtqemud[203731]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 23 10:14:46 np0005532585.localdomain virtqemud[203731]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 23 10:14:46 np0005532585.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 341639 (lsinitrd)
Nov 23 10:14:46 np0005532585.localdomain systemd[1]: Mounting EFI System Partition Automount...
Nov 23 10:14:46 np0005532585.localdomain systemd[1]: Mounted EFI System Partition Automount.
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: cache status {prefix=cache status} (starting...)
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: client ls {prefix=client ls} (starting...)
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:47 np0005532585.localdomain ceph-mon[300199]: pgmap v770: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:47 np0005532585.localdomain lvm[341719]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 10:14:47 np0005532585.localdomain lvm[341719]: VG ceph_vg1 finished
Nov 23 10:14:47 np0005532585.localdomain lvm[341734]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 10:14:47 np0005532585.localdomain lvm[341734]: VG ceph_vg0 finished
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: damage ls {prefix=damage ls} (starting...)
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: dump loads {prefix=dump loads} (starting...)
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 23 10:14:47 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: from='client.49434 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: from='client.49440 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: from='client.58966 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: from='client.68942 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3048889610' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3851928342' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "report"} v 0)
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3708486609' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1133661239' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 23 10:14:48 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Nov 23 10:14:48 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1343039448' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: ops {prefix=ops} (starting...)
Nov 23 10:14:49 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config log"} v 0)
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3996961862' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.58972 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: pgmap v771: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.68951 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.49458 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3708486609' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4238211147' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/555593730' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2641698963' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.58996 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.68966 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1133661239' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4160910027' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/828858147' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2120265198' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1343039448' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2941413893' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3996961862' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1899350493' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/41285028' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config-key dump"} v 0)
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3840425674' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:49 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: session ls {prefix=session ls} (starting...)
Nov 23 10:14:49 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl Can't run that command on an inactive MDS!
Nov 23 10:14:49 np0005532585.localdomain ceph-mds[287052]: mds.mds.np0005532585.jcltnl asok_command: status {prefix=status} (starting...)
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 23 10:14:49 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3037356796' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.49506 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/168170290' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2590722668' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/41285028' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3840425674' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3769118842' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4173593946' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3980943465' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3037356796' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2424329786' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/586164529' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:50.245 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:50.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:14:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:50.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:14:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:50.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/861124455' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:50.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:50 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:50.285 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "features"} v 0)
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1727928170' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 10:14:50 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2910384352' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr stat"} v 0)
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2910476656' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3797739684' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.49518 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: pgmap v772: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.59050 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.69017 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.59062 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2510610572' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.69029 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/861124455' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1687227166' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3032686292' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2095089999' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1727928170' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2153661629' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2910384352' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2650359476' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3781009230' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2910476656' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3237286252' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3797739684' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3791738503' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:51 np0005532585.localdomain sshd[342352]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Nov 23 10:14:51 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4029306894' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.49560 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/144733000' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3785737868' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.59101 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3791738503' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.49578 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.69077 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3433932238' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/4029306894' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1203579280' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Nov 23 10:14:52 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3388039984' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/469495303' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain sshd[342352]: Invalid user root11 from 175.126.166.172 port 48112
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:16.699333+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:17.699512+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:18.699734+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:19.700042+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient:  got monmap 14 from mon.np0005532586 (according to old e14)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: dump:
                                                          epoch 14
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:50.591476+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
                                                          3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:20.700219+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:21.700420+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:22.700550+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:23.700783+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:24.700950+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:25.701078+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:26.701217+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:27.701377+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:28.701532+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient:  got monmap 15 from mon.np0005532586 (according to old e15)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: dump:
                                                          epoch 15
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:59.410256+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:29.701656+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:30.701805+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:31.701991+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:32.702137+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:33.702281+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:34.702488+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:35.702696+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:36.702908+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:37.703131+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:38.703283+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:39.703429+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:40.703594+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:41.703785+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:42.703990+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:43.704124+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:44.704254+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:45.704424+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:46.704554+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:47.704705+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:48.704875+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:49.705067+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:50.705226+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:51.705424+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:52.705574+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91701248 unmapped: 2613248 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:53.705732+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient:  got monmap 16 from mon.np0005532586 (according to old e16)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: dump:
                                                          epoch 16
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:53:23.789795+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: mon.np0005532586 at [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] went away
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _reopen_session rank -1
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _add_conns ranks=[1,0]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): picked mon.np0005532585 con 0x56377d64fc00 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): picked mon.np0005532584 con 0x56377e56dc00 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): start opening mon connection
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): start opening mon connection
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _finish_auth 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): get_auth_request con 0x56377d64fc00 auth_method 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _init_auth method 2
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_done global_id 24221 payload 293
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _finish_hunting 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: found mon.np0005532585
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _finish_auth 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:53.812039+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: get_auth_request con 0x56377e56dc00 auth_method 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: ms_handle_reset current mon [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _reopen_session rank -1
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _add_conns ranks=[0,1]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): picked mon.np0005532584 con 0x56377e56dc00 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): picked mon.np0005532585 con 0x56377d64e000 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): start opening mon connection
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): start opening mon connection
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 ms_handle_reset con 0x56377d64fc00 session 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): get_auth_request con 0x56377d64e000 auth_method 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _init_auth method 2
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient(hunting): handle_auth_done global_id 24221 payload 293
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _finish_hunting 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: found mon.np0005532585
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _finish_auth 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:53.821480+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient:  got monmap 16 from mon.np0005532585 (according to old e16)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: dump:
                                                          epoch 16
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:53:23.789795+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_config config(7 keys)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: set_mon_vals no callback set
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 33
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:54.705883+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:55.706094+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:56.706293+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:57.706485+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:58.706629+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:59.706799+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:00.706961+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:01.707158+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:02.707326+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:03.707457+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:04.707577+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:05.707754+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:06.707973+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:07.708153+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:08.708343+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:09.708499+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient:  got monmap 17 from mon.np0005532585 (according to old e17)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: dump:
                                                          epoch 17
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:53:40.507961+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532586
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:10.708668+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:11.708936+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:12.709099+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:13.709320+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:14.709527+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 34
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:15.709698+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:16.709879+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:17.710073+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91889664 unmapped: 2424832 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:18.710227+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:19.710382+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:20.710582+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:21.710775+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:22.711002+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:23.711165+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:24.711339+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:25.711509+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:26.711636+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:27.711794+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:28.711953+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 985347 data_alloc: 184549376 data_used: 7110656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:29.712128+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91799552 unmapped: 2514944 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 85 handle_osd_map epochs [85,86], i have 85, src has [1,86]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 107.870407104s of 107.923583984s, submitted: 12
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 35
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now 
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/4027327596
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc reconnect No active mgr available yet
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a9a000/0x0/0x1bfc00000, data 0x2f771c9/0x2ff4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 ms_handle_reset con 0x56377b702000 session 0x56377d2fa960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:30.712271+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f103800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 36
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: get_auth_request con 0x56377e432c00 auth_method 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_configure stats_period=5
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:31.712452+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a96000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 92135424 unmapped: 2179072 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:32.712640+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 92135424 unmapped: 2179072 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:33.713150+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 37
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 92143616 unmapped: 2170880 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:34.718699+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91848704 unmapped: 2465792 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:35.718879+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91848704 unmapped: 2465792 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 38
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:36.719072+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:37.719246+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:38.719398+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:39.719569+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:40.719727+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:41.719922+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:42.720079+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:43.720251+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:44.721524+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:45.722101+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91832320 unmapped: 2482176 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:46.722234+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 39
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:47.723593+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:48.725142+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:49.725525+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:50.726049+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:51.726311+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:52.726506+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:53.726700+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:54.727034+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:55.727384+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:56.727594+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:57.727744+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:58.727946+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:59.728220+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:00.746280+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:01.746491+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:02.746695+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:03.746845+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:04.747139+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:05.747326+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:06.747481+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:07.747699+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:08.748109+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:09.748419+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:10.748652+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:11.748849+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:12.748997+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:13.749167+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:14.749487+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:15.749683+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:16.749817+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:17.750042+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:18.750295+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:19.750494+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:20.750667+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:21.750871+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:22.751025+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:23.751171+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:24.751325+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:25.751430+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:26.751591+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:27.751758+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:28.751843+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:29.751956+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:30.752100+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:31.752303+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:32.752462+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:33.752539+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:34.752716+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:35.752879+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:36.753079+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:37.753499+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:38.753664+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:39.753820+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:40.753957+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:41.754125+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:42.754265+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:43.754419+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:44.754557+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:45.754754+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:46.754948+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:47.755170+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:48.755318+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:49.755655+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988639 data_alloc: 184549376 data_used: 7118848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:50.756458+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:51.756700+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91840512 unmapped: 2473984 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:52.757308+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 82.577377319s of 82.633476257s, submitted: 12
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now 
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc reconnect Terminating session with v2:172.18.0.108:6810/335107178
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc reconnect No active mgr available yet
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 ms_handle_reset con 0x56377f103800 session 0x56377da79680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b702000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a97000/0x0/0x1bfc00000, data 0x2f79389/0x2ff7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:53.757600+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 41
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: get_auth_request con 0x56377ee57c00 auth_method 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_configure stats_period=5
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:54.758165+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:55.758384+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 42
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:56.758523+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:57.758669+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91987968 unmapped: 2326528 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:58.758866+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 43
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:59.759133+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:00.759379+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:01.759631+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:02.759761+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:03.759877+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:04.760153+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:05.760327+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:06.760572+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:07.760737+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:08.760881+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:09.761062+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:10.761222+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:11.761390+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:12.761528+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:13.761699+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:14.761824+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:15.761952+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:16.762169+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:17.762349+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:18.762482+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:19.762685+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:20.762824+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:21.762930+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:22.763071+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:23.763216+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:24.763365+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:25.763522+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:26.763721+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:27.763872+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:28.764030+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:29.764137+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:30.764268+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:31.764427+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:32.764620+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:33.764726+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:34.764855+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:35.764995+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:36.765096+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:37.765233+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:38.765374+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:39.765520+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:40.765637+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:41.765794+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:42.765880+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:43.766039+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:44.766240+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:45.766402+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:46.766548+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:47.766703+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:48.766842+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:49.766990+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:50.767122+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:51.767348+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:52.767487+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:53.767600+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:54.767759+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:55.767960+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:56.768105+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:57.768732+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:58.768869+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:59.769041+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:00.769192+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:01.769392+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:02.769540+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:03.769697+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:04.769840+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:05.769959+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:06.770094+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:07.770234+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:08.770371+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:09.770607+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:10.770746+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:11.770951+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:12.771113+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:13.771268+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:14.771431+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91774976 unmapped: 2539520 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 992839 data_alloc: 184549376 data_used: 7127040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b8a92000/0x0/0x1bfc00000, data 0x2f7b6f7/0x2ffb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 82.290542603s of 82.347732544s, submitted: 12
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:15.771504+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 91807744 unmapped: 2506752 heap: 94314496 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 heartbeat osd_stat(store_statfs(0x1b7e22000/0x0/0x1bfc00000, data 0x3beb71a/0x3c6c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:16.771638+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 92913664 unmapped: 18186240 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 88 handle_osd_map epochs [88,88], i have 88, src has [1,88]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 88 ms_handle_reset con 0x56377e56cc00 session 0x56377d2f30e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b7e1d000/0x0/0x1bfc00000, data 0x3beda82/0x3c70000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:17.771853+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 92651520 unmapped: 18448384 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56dc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:18.772001+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b761c000/0x0/0x1bfc00000, data 0x43edaa5/0x4471000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 97361920 unmapped: 13737984 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 handle_osd_map epochs [89,89], i have 89, src has [1,89]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 ms_handle_reset con 0x56377e56dc00 session 0x56377d2fa3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:19.772161+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93708288 unmapped: 17391616 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:20.772316+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93708288 unmapped: 17391616 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:21.772514+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93708288 unmapped: 17391616 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:22.772642+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93708288 unmapped: 17391616 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:23.772818+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93708288 unmapped: 17391616 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 44
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:24.772961+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:25.773368+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:26.773527+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:27.773712+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:28.773935+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:29.774120+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:30.774277+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:31.774452+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:32.774623+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:33.774778+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:34.774994+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:35.775189+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets getting new tickets!
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:36.775439+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _finish_auth 0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:36.776230+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:37.775610+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:38.775865+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:39.776024+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:40.776258+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:41.776450+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:42.776620+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:43.776814+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:44.776978+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:45.777126+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:46.777274+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:47.777399+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:48.777534+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:49.777677+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:50.777863+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:51.778121+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:52.778287+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:53.778422+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:54.778553+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:55.778723+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:56.778877+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:57.779085+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:58.779262+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:59.779400+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:00.779522+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:01.779690+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:02.779812+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:03.779953+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:04.780064+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:05.780208+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:06.780360+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:07.780499+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:08.780687+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:09.780819+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:10.780943+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:11.781095+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:12.781244+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:13.781386+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:14.781512+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:15.813398+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:16.813535+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:17.813705+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93855744 unmapped: 17244160 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:18.813934+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93863936 unmapped: 17235968 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:19.814075+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93863936 unmapped: 17235968 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1233075 data_alloc: 184549376 data_used: 7139328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:20.814237+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93863936 unmapped: 17235968 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:21.814429+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93863936 unmapped: 17235968 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b69a7000/0x0/0x1bfc00000, data 0x505fe0d/0x50e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:22.814604+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93863936 unmapped: 17235968 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:23.814807+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93863936 unmapped: 17235968 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:24.814942+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56e800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 69.081085205s of 69.388839722s, submitted: 53
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93888512 unmapped: 17211392 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 90 handle_osd_map epochs [90,90], i have 90, src has [1,90]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 90 ms_handle_reset con 0x56377e56e800 session 0x56377d2494a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1242522 data_alloc: 184549376 data_used: 7151616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:25.815086+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93896704 unmapped: 17203200 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b69a0000/0x0/0x1bfc00000, data 0x50625cb/0x50ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:26.815264+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93904896 unmapped: 17195008 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:27.815425+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93904896 unmapped: 17195008 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b69a0000/0x0/0x1bfc00000, data 0x50625cb/0x50ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:28.815733+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93904896 unmapped: 17195008 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:29.816000+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 93904896 unmapped: 17195008 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1244520 data_alloc: 184549376 data_used: 7163904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 handle_osd_map epochs [91,91], i have 91, src has [1,91]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:30.816231+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 ms_handle_reset con 0x56377e561800 session 0x56377d2f5a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94011392 unmapped: 17088512 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:31.816423+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94011392 unmapped: 17088512 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b699f000/0x0/0x1bfc00000, data 0x5064531/0x50ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:32.816588+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94011392 unmapped: 17088512 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b699f000/0x0/0x1bfc00000, data 0x5064531/0x50ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:33.816722+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94011392 unmapped: 17088512 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:34.817312+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b699f000/0x0/0x1bfc00000, data 0x5064531/0x50ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94011392 unmapped: 17088512 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b699f000/0x0/0x1bfc00000, data 0x5064531/0x50ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1243849 data_alloc: 184549376 data_used: 7163904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:35.817483+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94011392 unmapped: 17088512 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 ms_handle_reset con 0x56377e561800 session 0x56377d23da40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 ms_handle_reset con 0x56377e56cc00 session 0x56377d23cb40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 ms_handle_reset con 0x56377d4ec000 session 0x56377d23c780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:36.817669+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 94019584 unmapped: 17080320 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.251956940s of 12.451662064s, submitted: 61
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:37.818037+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56dc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e56dc00 session 0x56377d23e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 99909632 unmapped: 11190272 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b699c000/0x0/0x1bfc00000, data 0x506677f/0x50f1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f103800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377f103800 session 0x56377d23f0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:38.818212+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 104284160 unmapped: 6815744 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377d4ec000 session 0x56377c6f72c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:39.818760+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101679104 unmapped: 9420800 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1315847 data_alloc: 184549376 data_used: 12775424
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b63aa000/0x0/0x1bfc00000, data 0x56587e1/0x56e4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:40.818992+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101679104 unmapped: 9420800 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:41.819256+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101752832 unmapped: 9347072 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:42.819496+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101752832 unmapped: 9347072 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:43.819700+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101777408 unmapped: 9322496 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:44.819957+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101777408 unmapped: 9322496 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1317970 data_alloc: 184549376 data_used: 12775424
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:45.820129+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b63a9000/0x0/0x1bfc00000, data 0x56587f1/0x56e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101777408 unmapped: 9322496 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:46.820371+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 101777408 unmapped: 9322496 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:47.820539+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102023168 unmapped: 9076736 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:48.820763+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b63a9000/0x0/0x1bfc00000, data 0x56587f1/0x56e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102023168 unmapped: 9076736 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:49.821119+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102023168 unmapped: 9076736 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1322130 data_alloc: 184549376 data_used: 13332480
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:50.821273+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102031360 unmapped: 9068544 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:51.821575+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102031360 unmapped: 9068544 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:52.821760+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102031360 unmapped: 9068544 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:53.821969+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102031360 unmapped: 9068544 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:54.822250+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b63a9000/0x0/0x1bfc00000, data 0x56587f1/0x56e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102031360 unmapped: 9068544 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1322130 data_alloc: 184549376 data_used: 13332480
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:55.822465+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 18.444498062s of 18.671947479s, submitted: 66
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 103096320 unmapped: 8003584 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e56cc00 session 0x56377d53fa40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:56.822621+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b63a9000/0x0/0x1bfc00000, data 0x56587f1/0x56e5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102637568 unmapped: 8462336 heap: 111099904 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56dc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e56dc00 session 0x56377d53ed20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:57.822791+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102080512 unmapped: 14270464 heap: 116350976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:58.822960+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e4a3400 session 0x56377d53e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 102137856 unmapped: 14213120 heap: 116350976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb8c00 session 0x56377d53fe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e561800 session 0x56377c6f63c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:59.823167+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb8c00 session 0x56377d2f25a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e4a3400 session 0x56377d2f2780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 107102208 unmapped: 9248768 heap: 116350976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56dc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e56dc00 session 0x56377d620d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b4506000/0x0/0x1bfc00000, data 0x74f0886/0x7580000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb9000 session 0x56377d2f3e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1568369 data_alloc: 184549376 data_used: 13373440
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb9000 session 0x56377d9f90e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:00.823316+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 107798528 unmapped: 8552448 heap: 116350976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb8c00 session 0x56377ade74a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:01.823520+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab4000 session 0x56377c643a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 105832448 unmapped: 10518528 heap: 116350976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab5000 session 0x56377d4a21e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:02.823666+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e568c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e568c00 session 0x56377dac3e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 107094016 unmapped: 13459456 heap: 120553472 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:03.823872+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 107094016 unmapped: 13459456 heap: 120553472 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e568c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e568c00 session 0x56377d2cd4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb8c00 session 0x56377d9f94a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb9000 session 0x56377d9f90e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377b7a3c00 session 0x56377d2f2780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab4800 session 0x56377d2f3e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x806a8f8/0x80fc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:04.824117+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab4800 session 0x56377d53fa40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377b7a3c00 session 0x56377d53e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb9000 session 0x56377ade6d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb8c00 session 0x56377c642960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e568c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 108240896 unmapped: 21315584 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e568c00 session 0x56377c643860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1800770 data_alloc: 184549376 data_used: 15949824
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377d4ec000 session 0x56377d53f4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:05.824327+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.912508011s of 10.150933266s, submitted: 309
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 112386048 unmapped: 17170432 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb9000 session 0x56377d348b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:06.824564+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 115523584 unmapped: 14032896 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:07.825359+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab4800 session 0x56377d2cc000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b4453000/0x0/0x1bfc00000, data 0x759c92b/0x7630000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 117555200 unmapped: 12001280 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:08.825549+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab4000 session 0x56377d5f6000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 118931456 unmapped: 10625024 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab5000 session 0x56377d9b65a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:09.826004+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377d4ec000 session 0x56377d9b7a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 119144448 unmapped: 10412032 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b4439000/0x0/0x1bfc00000, data 0x75c094e/0x7655000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1702447 data_alloc: 201326592 data_used: 25255936
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:10.826279+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b4439000/0x0/0x1bfc00000, data 0x75c094e/0x7655000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 119226368 unmapped: 10330112 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b4439000/0x0/0x1bfc00000, data 0x75c094e/0x7655000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:11.826466+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 121282560 unmapped: 8273920 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:12.826677+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123494400 unmapped: 6062080 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:13.826855+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126230528 unmapped: 3325952 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:14.827180+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126246912 unmapped: 3309568 heap: 129556480 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1774287 data_alloc: 201326592 data_used: 32026624
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:15.827684+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.559051514s of 10.029386520s, submitted: 141
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 133234688 unmapped: 2154496 heap: 135389184 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:16.827873+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 130859008 unmapped: 4530176 heap: 135389184 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b3609000/0x0/0x1bfc00000, data 0x83e894e/0x847d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:17.828141+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 130859008 unmapped: 4530176 heap: 135389184 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:18.828355+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136781824 unmapped: 4349952 heap: 141131776 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb9000 session 0x56377d53e000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:19.828492+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135028736 unmapped: 6103040 heap: 141131776 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:20.828610+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2036431 data_alloc: 201326592 data_used: 33087488
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b236c000/0x0/0x1bfc00000, data 0x968694e/0x971b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136167424 unmapped: 4964352 heap: 141131776 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:21.828791+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136167424 unmapped: 4964352 heap: 141131776 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:22.828937+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136167424 unmapped: 4964352 heap: 141131776 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:23.829104+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139419648 unmapped: 5922816 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:24.829256+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138420224 unmapped: 6922240 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b13de000/0x0/0x1bfc00000, data 0xa61594e/0xa6aa000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:25.829392+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2152351 data_alloc: 201326592 data_used: 33505280
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377eab4000 session 0x56377c6cf4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.999533653s of 10.224999428s, submitted: 406
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139755520 unmapped: 5586944 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:26.829519+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377e56b000 session 0x56377d2f4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131612672 unmapped: 13729792 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:27.829693+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b2f9d000/0x0/0x1bfc00000, data 0x8a598b9/0x8aeb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131620864 unmapped: 13721600 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:28.829855+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131751936 unmapped: 13590528 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:29.829952+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131751936 unmapped: 13590528 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:30.830116+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 ms_handle_reset con 0x56377dfb8c00 session 0x56377dbfde00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1873491 data_alloc: 201326592 data_used: 22032384
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131792896 unmapped: 13549568 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:31.830298+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 92 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131907584 unmapped: 13434880 heap: 145342464 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 93 ms_handle_reset con 0x56377d4ec000 session 0x56377dbfda40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b2f96000/0x0/0x1bfc00000, data 0x8a668b9/0x8af8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 93 ms_handle_reset con 0x56377e56b000 session 0x56377c5c30e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:32.830432+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137773056 unmapped: 26492928 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 93 ms_handle_reset con 0x56377eab4000 session 0x56377dc87a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:33.830595+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137789440 unmapped: 26476544 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 94 ms_handle_reset con 0x56377ba89800 session 0x56377c6f72c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ed800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:34.830831+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137789440 unmapped: 26476544 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 94 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 ms_handle_reset con 0x56377d4ed800 session 0x56377c6f6b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:35.830987+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2130245 data_alloc: 201326592 data_used: 23134208
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 ms_handle_reset con 0x56377ba89800 session 0x56377ade81e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 ms_handle_reset con 0x56377d4ec000 session 0x56377ade94a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 ms_handle_reset con 0x56377e56b000 session 0x56377dc64000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137854976 unmapped: 26411008 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.416559219s of 10.335108757s, submitted: 229
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:36.831137+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 heartbeat osd_stat(store_statfs(0x1b0f4a000/0x0/0x1bfc00000, data 0xa6a8435/0xa742000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 ms_handle_reset con 0x56377eab4000 session 0x56377dc87680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126115840 unmapped: 38150144 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:37.831268+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f102c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125804544 unmapped: 38461440 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 ms_handle_reset con 0x56377f102c00 session 0x56377d53fa40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:38.831414+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 ms_handle_reset con 0x56377ba89800 session 0x56377d53f860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 ms_handle_reset con 0x56377d4ec000 session 0x56377d53ed20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125812736 unmapped: 38453248 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 ms_handle_reset con 0x56377b7a3c00 session 0x56377d6203c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 ms_handle_reset con 0x56377e56cc00 session 0x56377da78b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:39.831553+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377e56b000 session 0x56377d53e960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377ba89800 session 0x56377da79860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377d4ec000 session 0x56377d2f4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377e56b000 session 0x56377c6cf4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377e56cc00 session 0x56377d9b7c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377eab4000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377eab4000 session 0x56377c6f6f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377b7a3c00 session 0x56377d53e000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 ms_handle_reset con 0x56377ba89800 session 0x56377d5f6f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d4ec000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 122839040 unmapped: 41426944 heap: 164265984 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377e56cc00 session 0x56377d53e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377e56b000 session 0x56377d2cd0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:40.831674+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f103400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1588793 data_alloc: 184549376 data_used: 8998912
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377f103400 session 0x56377d348d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377b7a3c00 session 0x56377d3492c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377ba89800 session 0x56377d348b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377d4ec000 session 0x56377d2cde00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377e56b000 session 0x56377ade6d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56cc00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 133611520 unmapped: 34635776 heap: 168247296 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f102400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:41.831862+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 ms_handle_reset con 0x56377e56cc00 session 0x56377d505680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 heartbeat osd_stat(store_statfs(0x1b2b17000/0x0/0x1bfc00000, data 0x8adae92/0x8b77000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 98 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123461632 unmapped: 48988160 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:42.831982+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 99 ms_handle_reset con 0x56377f102400 session 0x56377c58f680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123510784 unmapped: 48939008 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:43.832124+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123551744 unmapped: 48898048 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:44.832328+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 100 ms_handle_reset con 0x56377b7a3c00 session 0x56377d23ed20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 120504320 unmapped: 51945472 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:45.832518+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1648424 data_alloc: 184549376 data_used: 7630848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 100 heartbeat osd_stat(store_statfs(0x1b4665000/0x0/0x1bfc00000, data 0x6f8455b/0x7021000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 120537088 unmapped: 51912704 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:46.832655+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.073532104s of 10.655554771s, submitted: 418
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123232256 unmapped: 49217536 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:47.832779+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123076608 unmapped: 49373184 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:48.832938+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123076608 unmapped: 49373184 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:49.833037+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123076608 unmapped: 49373184 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:50.833164+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1690930 data_alloc: 184549376 data_used: 14028800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123076608 unmapped: 49373184 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:51.833347+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 101 heartbeat osd_stat(store_statfs(0x1b4668000/0x0/0x1bfc00000, data 0x6f867c5/0x7025000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123092992 unmapped: 49356800 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:52.833485+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 101 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123174912 unmapped: 49274880 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:53.833651+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123174912 unmapped: 49274880 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:54.833825+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123183104 unmapped: 49266688 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:55.833972+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1694732 data_alloc: 184549376 data_used: 14049280
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124542976 unmapped: 47906816 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:56.834163+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 102 heartbeat osd_stat(store_statfs(0x1b465c000/0x0/0x1bfc00000, data 0x6f8ea13/0x702f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125329408 unmapped: 47120384 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:57.834358+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 102 heartbeat osd_stat(store_statfs(0x1b465a000/0x0/0x1bfc00000, data 0x6f90a13/0x7031000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125583360 unmapped: 46866432 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:58.834503+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.941456795s of 12.136711121s, submitted: 95
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 102 ms_handle_reset con 0x56377ba89800 session 0x56377c58e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125583360 unmapped: 46866432 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:59.834664+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125689856 unmapped: 46759936 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 103 ms_handle_reset con 0x56377e56b000 session 0x56377d23e780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b429000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:00.834832+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1760873 data_alloc: 201326592 data_used: 17420288
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 103 ms_handle_reset con 0x56377b429000 session 0x56377d2cd680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 132751360 unmapped: 39698432 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:01.834992+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 133873664 unmapped: 38576128 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:02.835194+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 104 heartbeat osd_stat(store_statfs(0x1b2841000/0x0/0x1bfc00000, data 0x8da61a9/0x8e4c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 104 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 ms_handle_reset con 0x56377b7a3c00 session 0x56377d6401e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 133947392 unmapped: 38502400 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:03.835365+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 ms_handle_reset con 0x56377ba89800 session 0x56377d2f4780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 ms_handle_reset con 0x56377e56b000 session 0x56377d5f6d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f102400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 ms_handle_reset con 0x56377f102400 session 0x56377d5d10e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 133963776 unmapped: 38486016 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 heartbeat osd_stat(store_statfs(0x1b2839000/0x0/0x1bfc00000, data 0x8da8c6a/0x8e52000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:04.835526+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a2400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123789312 unmapped: 48660480 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 106 ms_handle_reset con 0x56377e4a2400 session 0x56377d641680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:05.835664+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1656027 data_alloc: 184549376 data_used: 5365760
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123822080 unmapped: 48627712 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:06.836118+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123822080 unmapped: 48627712 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:07.836297+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123822080 unmapped: 48627712 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:08.836518+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 ms_handle_reset con 0x56377b7a3c00 session 0x56377c6ce5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 ms_handle_reset con 0x56377ba89800 session 0x56377ade7860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 ms_handle_reset con 0x56377e56b000 session 0x56377d2f5c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f102400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 ms_handle_reset con 0x56377f102400 session 0x56377d621a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b474d000/0x0/0x1bfc00000, data 0x6e98b2d/0x6f41000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56c400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.879722595s of 10.000221252s, submitted: 265
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56c800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 ms_handle_reset con 0x56377e56c400 session 0x56377d349860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123904000 unmapped: 48545792 heap: 172449792 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:09.836834+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 108 ms_handle_reset con 0x56377b7a3c00 session 0x56377dc87c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135774208 unmapped: 44376064 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 108 ms_handle_reset con 0x56377e56c800 session 0x56377dac2000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:10.837068+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1911321 data_alloc: 184549376 data_used: 13463552
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56b000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135790592 unmapped: 44359680 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b2bfb000/0x0/0x1bfc00000, data 0x89e4ef3/0x8a92000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:11.837288+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 109 ms_handle_reset con 0x56377e56b000 session 0x56377d2fa780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f102400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 109 ms_handle_reset con 0x56377f102400 session 0x56377d2f63c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135946240 unmapped: 44204032 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:12.837497+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba88800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135979008 unmapped: 44171264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:13.837672+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 111 ms_handle_reset con 0x56377ba89c00 session 0x56377fb0c000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b2bcd000/0x0/0x1bfc00000, data 0x8a0d4c8/0x8abd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125001728 unmapped: 55148544 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:14.837850+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125788160 unmapped: 54362112 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:15.838041+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1692532 data_alloc: 184549376 data_used: 9060352
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 111 ms_handle_reset con 0x56377ba88800 session 0x56377fb0c3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123715584 unmapped: 56434688 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:16.838241+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 111 ms_handle_reset con 0x56377b7a3c00 session 0x56377d9f9a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b6463000/0x0/0x1bfc00000, data 0x50b4812/0x5163000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:17.838487+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:18.838806+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:19.838962+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:20.839153+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:21.839381+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:22.839566+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:23.839722+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:24.839882+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:25.840101+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:26.840274+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:27.840406+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:28.840699+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:29.840865+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:30.841049+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:31.841288+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:32.841440+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:33.841574+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123609088 unmapped: 56541184 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:34.841763+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:35.841907+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:36.842088+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:37.842246+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:38.842443+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:39.842642+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:40.842827+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:41.843032+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:42.843168+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:43.843315+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:44.843462+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:45.843609+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:46.843773+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:47.843956+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:48.844099+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:49.844269+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:50.844405+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123617280 unmapped: 56532992 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:51.844559+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:52.844702+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:53.844854+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:54.845021+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:55.845192+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:56.845354+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:57.845507+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:58.845710+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123641856 unmapped: 56508416 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:59.845865+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:00.845989+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:01.846181+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:02.846384+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:03.846617+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:04.846767+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:05.846968+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123650048 unmapped: 56500224 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:06.847107+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123666432 unmapped: 56483840 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:07.847249+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:08.847412+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:09.848478+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:10.849437+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:11.858131+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:12.858671+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:13.858842+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123674624 unmapped: 56475648 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:14.859067+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:15.860184+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:16.862427+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:17.862603+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:18.862771+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:19.862941+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:20.863068+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:21.863210+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123682816 unmapped: 56467456 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:22.863360+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:23.863504+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:24.863647+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:25.864211+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:26.864395+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:27.864823+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:28.865011+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:29.865180+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123691008 unmapped: 56459264 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:30.865341+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123699200 unmapped: 56451072 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:31.865690+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123699200 unmapped: 56451072 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:32.865875+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123699200 unmapped: 56451072 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:33.866238+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123699200 unmapped: 56451072 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:34.866446+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123699200 unmapped: 56451072 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b654b000/0x0/0x1bfc00000, data 0x5092a4d/0x5141000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:35.866596+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 123699200 unmapped: 56451072 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1469911 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 86.252708435s of 87.206436157s, submitted: 292
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:36.866757+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124764160 unmapped: 55386112 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 113 heartbeat osd_stat(store_statfs(0x1b6547000/0x0/0x1bfc00000, data 0x5094dbd/0x5146000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:37.866914+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 ms_handle_reset con 0x56377ba89c00 session 0x56377da78960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124846080 unmapped: 55304192 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:38.867067+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124411904 unmapped: 55738368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:39.867217+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124420096 unmapped: 55730176 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6540000/0x0/0x1bfc00000, data 0x50971f6/0x514c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56c800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 ms_handle_reset con 0x56377e56c800 session 0x56377d5f6960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377f102400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 heartbeat osd_stat(store_statfs(0x1b6540000/0x0/0x1bfc00000, data 0x50971f6/0x514c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:40.867374+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124452864 unmapped: 55697408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 ms_handle_reset con 0x56377f102400 session 0x56377dbfc3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1480627 data_alloc: 184549376 data_used: 5390336
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:41.867545+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124452864 unmapped: 55697408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:42.867701+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124477440 unmapped: 55672832 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:43.867827+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124477440 unmapped: 55672832 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:44.867973+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124477440 unmapped: 55672832 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:45.868122+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124477440 unmapped: 55672832 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1484829 data_alloc: 184549376 data_used: 5402624
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b653f000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:46.868281+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124485632 unmapped: 55664640 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:47.868452+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124493824 unmapped: 55656448 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:48.868586+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124493824 unmapped: 55656448 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:49.868738+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124493824 unmapped: 55656448 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:50.868921+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 124502016 unmapped: 55648256 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 14.148326874s of 14.364180565s, submitted: 70
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1490961 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 ms_handle_reset con 0x56377b7a3c00 session 0x56377c6f6000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:51.869053+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b6540000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127229952 unmapped: 52920320 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:52.869219+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127229952 unmapped: 52920320 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:53.869410+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127229952 unmapped: 52920320 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:54.869525+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127238144 unmapped: 52912128 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b6540000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:55.869674+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127238144 unmapped: 52912128 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1495913 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b653e000/0x0/0x1bfc00000, data 0x5099444/0x5150000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:56.869801+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127238144 unmapped: 52912128 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b653e000/0x0/0x1bfc00000, data 0x5099444/0x5150000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:57.869926+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b653e000/0x0/0x1bfc00000, data 0x5099444/0x5150000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127246336 unmapped: 52903936 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba88800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:58.870054+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136011776 unmapped: 44138496 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 ms_handle_reset con 0x56377ba88800 session 0x56377d5d14a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 ms_handle_reset con 0x56377ba89c00 session 0x56377d2f4960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:59.870185+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126443520 unmapped: 53706752 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56c800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:00.870317+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 ms_handle_reset con 0x56377e56c800 session 0x56377d248d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1503443 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9f000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:01.870514+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9f000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:02.870652+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:03.870822+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:04.870978+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:05.871125+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1503443 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:06.871315+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9f000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:07.871564+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:08.871729+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:09.871955+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:10.872124+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1503443 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 20.070550919s of 20.558778763s, submitted: 120
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e560c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 ms_handle_reset con 0x56377e560c00 session 0x56377d23e3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:11.872304+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:12.872448+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 53690368 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9e000/0x0/0x1bfc00000, data 0x5099444/0x5150000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:13.872596+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 ms_handle_reset con 0x56377b7a3c00 session 0x56377ade74a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126492672 unmapped: 53657600 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:14.872802+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9e000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126492672 unmapped: 53657600 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:15.873343+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9e000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126492672 unmapped: 53657600 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1505284 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:16.873662+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126492672 unmapped: 53657600 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:17.873824+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126492672 unmapped: 53657600 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:18.873997+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:19.875093+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4fa0000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:20.875503+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1503187 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:21.875714+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:22.875886+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:23.876066+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:24.876219+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:25.876379+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4fa0000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1503187 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:26.876539+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:27.876688+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:28.876838+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:29.877029+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4fa0000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 18.518951416s of 18.583858490s, submitted: 14
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:30.877176+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1504969 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:31.877378+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:32.877527+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:33.877700+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126517248 unmapped: 53633024 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 2789 syncs, 3.74 writes per sync, written: 0.04 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 4513 writes, 15K keys, 4513 commit groups, 1.0 writes per commit group, ingest: 16.92 MB, 0.03 MB/s
                                                          Interval WAL: 4513 writes, 1925 syncs, 2.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:34.877999+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126550016 unmapped: 53600256 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4f9f000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:35.878118+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126550016 unmapped: 53600256 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1505001 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:36.878268+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b4fa0000/0x0/0x1bfc00000, data 0x50993d2/0x514e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126550016 unmapped: 53600256 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:37.878429+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126558208 unmapped: 53592064 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:38.878567+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126574592 unmapped: 53575680 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:39.878747+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126574592 unmapped: 53575680 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 heartbeat osd_stat(store_statfs(0x1b4f98000/0x0/0x1bfc00000, data 0x509bb80/0x5155000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.332384109s of 10.543567657s, submitted: 52
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:40.878929+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 heartbeat osd_stat(store_statfs(0x1b4f98000/0x0/0x1bfc00000, data 0x509bb80/0x5155000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126607360 unmapped: 53542912 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1520983 data_alloc: 184549376 data_used: 8142848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:41.879099+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126607360 unmapped: 53542912 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:42.879260+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135045120 unmapped: 45105152 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 heartbeat osd_stat(store_statfs(0x1b4797000/0x0/0x1bfc00000, data 0x589bbb3/0x5957000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:43.879427+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125632512 unmapped: 54517760 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:44.879758+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125640704 unmapped: 54509568 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:45.879904+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 134053888 unmapped: 46096384 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2123329 data_alloc: 184549376 data_used: 8142848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:46.880034+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 116 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125779968 unmapped: 54370304 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 117 ms_handle_reset con 0x56377ba89c00 session 0x56377c58e780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:47.880186+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56c800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125804544 unmapped: 54345728 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 117 heartbeat osd_stat(store_statfs(0x1aef93000/0x0/0x1bfc00000, data 0xb09df1e/0xb15b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:48.880359+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 117 handle_osd_map epochs [117,118], i have 117, src has [1,118]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125845504 unmapped: 54304768 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 118 ms_handle_reset con 0x56377e56c800 session 0x56377d23e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:49.880505+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 118 handle_osd_map epochs [118,119], i have 118, src has [1,119]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125935616 unmapped: 54214656 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:50.880639+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.174942970s of 10.016796112s, submitted: 182
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125943808 unmapped: 54206464 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 119 heartbeat osd_stat(store_statfs(0x1b4f8f000/0x0/0x1bfc00000, data 0x50a221a/0x515e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1539648 data_alloc: 184549376 data_used: 8167424
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:51.880815+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125943808 unmapped: 54206464 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:52.881028+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125943808 unmapped: 54206464 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:53.881200+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125943808 unmapped: 54206464 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:54.881351+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e569400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 119 handle_osd_map epochs [119,120], i have 119, src has [1,120]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 120 ms_handle_reset con 0x56377e569400 session 0x56377dac25a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125960192 unmapped: 54190080 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:55.881488+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 121 ms_handle_reset con 0x56377b7a3800 session 0x56377d6414a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 125976576 unmapped: 54173696 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1555908 data_alloc: 184549376 data_used: 8179712
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:56.881692+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 121 handle_osd_map epochs [121,122], i have 121, src has [1,122]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126025728 unmapped: 54124544 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 122 heartbeat osd_stat(store_statfs(0x1b4f7c000/0x0/0x1bfc00000, data 0x50a9ac9/0x5171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:57.881850+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 122 handle_osd_map epochs [122,123], i have 122, src has [1,123]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126083072 unmapped: 54067200 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:58.881985+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126132224 unmapped: 54018048 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:59.882187+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 124 handle_osd_map epochs [124,125], i have 124, src has [1,125]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 125 ms_handle_reset con 0x56377b7a3c00 session 0x56377d53e5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127180800 unmapped: 52969472 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 125 heartbeat osd_stat(store_statfs(0x1b4f65000/0x0/0x1bfc00000, data 0x50b2861/0x5188000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:00.882341+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.547062874s of 10.071820259s, submitted: 140
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 125 ms_handle_reset con 0x56377ba89c00 session 0x56377b72c1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127188992 unmapped: 52961280 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e569400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1600223 data_alloc: 184549376 data_used: 8138752
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:01.882488+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126500864 unmapped: 53649408 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 126 ms_handle_reset con 0x56377e569400 session 0x56377b72c5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:02.882646+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e56c800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 126509056 unmapped: 53641216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 126 ms_handle_reset con 0x56377e4a3400 session 0x56377c6ced20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:03.882872+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127533056 unmapped: 52617216 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 127 heartbeat osd_stat(store_statfs(0x1b4f5e000/0x0/0x1bfc00000, data 0x50b5d7f/0x518d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:04.883074+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781557000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 127696896 unmapped: 52453376 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:05.883215+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781556c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 128 heartbeat osd_stat(store_statfs(0x1b4f5a000/0x0/0x1bfc00000, data 0x50b7633/0x5190000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138371072 unmapped: 41779200 heap: 180150272 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 128 ms_handle_reset con 0x563781556c00 session 0x56377d2fda40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2085906 data_alloc: 184549376 data_used: 8155136
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:06.883337+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 128 ms_handle_reset con 0x56377b7a3c00 session 0x5637816e03c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 129 heartbeat osd_stat(store_statfs(0x1b135f000/0x0/0x1bfc00000, data 0x8cb7610/0x8d8f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 129187840 unmapped: 59359232 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:07.883509+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 131653632 unmapped: 56893440 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:08.883651+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 130 heartbeat osd_stat(store_statfs(0x1aa9b9000/0x0/0x1bfc00000, data 0xe4bbbb7/0xe593000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6caf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 130 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 131 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140410880 unmapped: 48136192 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:09.883827+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 133357568 unmapped: 55189504 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:10.883952+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 7.579258919s of 10.029663086s, submitted: 523
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137969664 unmapped: 50577408 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3694304 data_alloc: 184549376 data_used: 8192000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:11.884130+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 132 ms_handle_reset con 0x56377ba89c00 session 0x5637816e1860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 146759680 unmapped: 41787392 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:12.884295+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 132 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138690560 unmapped: 49856512 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:13.884466+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 133 heartbeat osd_stat(store_statfs(0x19d5b1000/0x0/0x1bfc00000, data 0x1b8c189d/0x1b99c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6caf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 143040512 unmapped: 45506560 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:14.884616+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136323072 unmapped: 52224000 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:15.884829+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 144965632 unmapped: 43581440 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:16.884937+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4984994 data_alloc: 184549376 data_used: 8138752
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 133 heartbeat osd_stat(store_statfs(0x196db2000/0x0/0x1bfc00000, data 0x220c189d/0x2219c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6caf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136863744 unmapped: 51683328 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:17.885075+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137306112 unmapped: 51240960 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:18.885218+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377e4a3400 session 0x56377b72cf00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x563781557000 session 0x56377c6ce000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377e56c800 session 0x56377da783c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137510912 unmapped: 51036160 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377b7a3c00 session 0x56377dc63680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377ba89c00 session 0x56377d4a32c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:19.885944+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 heartbeat osd_stat(store_statfs(0x1905ab000/0x0/0x1bfc00000, data 0x284c3bbf/0x285a3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x70af9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377e4a3400 session 0x56377d4a25a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137576448 unmapped: 50970624 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 heartbeat osd_stat(store_statfs(0x1905aa000/0x0/0x1bfc00000, data 0x284c3c21/0x285a4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x70af9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781557000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:20.886131+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 7.478157997s of 10.013821602s, submitted: 371
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x563781557000 session 0x56377dac3860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135839744 unmapped: 52707328 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:21.886375+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1786626 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 heartbeat osd_stat(store_statfs(0x1b0da9000/0x0/0x1bfc00000, data 0x50c3c21/0x51a4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x70af9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e569400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377e569400 session 0x56377da785a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135839744 unmapped: 52707328 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:22.886505+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 ms_handle_reset con 0x56377b7a3c00 session 0x56377dc65e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135880704 unmapped: 52666368 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:23.886647+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135864320 unmapped: 52682752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:24.886794+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135847936 unmapped: 52699136 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:25.886955+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 135 heartbeat osd_stat(store_statfs(0x1b39ad000/0x0/0x1bfc00000, data 0x50c5dc3/0x51a0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x70af9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 136 ms_handle_reset con 0x56377ba89c00 session 0x56377c6cfc20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135856128 unmapped: 52690944 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:26.887363+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1787561 data_alloc: 184549376 data_used: 8159232
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135864320 unmapped: 52682752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:27.888004+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 137 ms_handle_reset con 0x56377e4a3400 session 0x56377c5c3680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135864320 unmapped: 52682752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781557000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 137 ms_handle_reset con 0x563781557000 session 0x56377d4a3e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 137 heartbeat osd_stat(store_statfs(0x1b4983000/0x0/0x1bfc00000, data 0x50ca53b/0x51a8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:28.888162+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135872512 unmapped: 52674560 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781556800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:29.888329+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 ms_handle_reset con 0x563781556800 session 0x56377b72cb40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135888896 unmapped: 52658176 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:30.888936+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.405095100s of 10.333524704s, submitted: 314
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 ms_handle_reset con 0x56377b7a3c00 session 0x56377d249c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 ms_handle_reset con 0x56377ba89c00 session 0x56377dac2f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135921664 unmapped: 52625408 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 heartbeat osd_stat(store_statfs(0x1b497f000/0x0/0x1bfc00000, data 0x50cc931/0x51ae000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:31.889157+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1797372 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 ms_handle_reset con 0x56377e4a3400 session 0x56377ade6b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135913472 unmapped: 52633600 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781557000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 ms_handle_reset con 0x563781557000 session 0x56377ade6000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:32.889372+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 139 ms_handle_reset con 0x56377dfbac00 session 0x56377d641e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135938048 unmapped: 52609024 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:33.889615+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 139 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 140 ms_handle_reset con 0x56377b7a3c00 session 0x56377d349680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135946240 unmapped: 52600832 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 140 ms_handle_reset con 0x56377ba89c00 session 0x56377dc874a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:34.889783+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 140 ms_handle_reset con 0x56377dfbac00 session 0x56377d5f7680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135962624 unmapped: 52584448 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:35.889991+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 140 ms_handle_reset con 0x56377e4a3400 session 0x56377d248b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135962624 unmapped: 52584448 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:36.890144+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1804048 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 141 heartbeat osd_stat(store_statfs(0x1b4973000/0x0/0x1bfc00000, data 0x50d32bf/0x51ba000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135970816 unmapped: 52576256 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:37.890278+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 141 heartbeat osd_stat(store_statfs(0x1b4973000/0x0/0x1bfc00000, data 0x50d32bf/0x51ba000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135970816 unmapped: 52576256 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:38.890432+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135979008 unmapped: 52568064 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:39.890600+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135979008 unmapped: 52568064 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:40.890757+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 135987200 unmapped: 52559872 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:41.890951+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1810361 data_alloc: 184549376 data_used: 8134656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.330600739s of 10.653170586s, submitted: 97
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136052736 unmapped: 52494336 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:42.891147+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781557000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 ms_handle_reset con 0x563781557000 session 0x56377d249c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 heartbeat osd_stat(store_statfs(0x1b496e000/0x0/0x1bfc00000, data 0x50d56ed/0x51bf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 136069120 unmapped: 52477952 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:43.891332+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 ms_handle_reset con 0x56377ba89c00 session 0x56377da78960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 ms_handle_reset con 0x56377b7a3c00 session 0x56377b72cb40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137142272 unmapped: 51404800 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:44.891501+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 ms_handle_reset con 0x56377dfbac00 session 0x56377da78d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 ms_handle_reset con 0x56377e4a3400 session 0x56377c6cfc20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137150464 unmapped: 51396608 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:45.891667+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137150464 unmapped: 51396608 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:46.891805+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1817964 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 144 ms_handle_reset con 0x56377daf5000 session 0x56377d5052c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 137224192 unmapped: 51322880 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:47.891971+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138289152 unmapped: 50257920 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:48.892098+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 145 heartbeat osd_stat(store_statfs(0x1b4964000/0x0/0x1bfc00000, data 0x50dc00b/0x51c8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138289152 unmapped: 50257920 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:49.892236+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 145 ms_handle_reset con 0x56377b7a3c00 session 0x56377dbfc1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138166272 unmapped: 50380800 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:50.892377+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138190848 unmapped: 50356224 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:51.892576+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1818636 data_alloc: 184549376 data_used: 8159232
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ba89c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.765432358s of 10.430480003s, submitted: 202
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 145 ms_handle_reset con 0x56377ba89c00 session 0x5637816e1c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138190848 unmapped: 50356224 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 145 heartbeat osd_stat(store_statfs(0x1b4963000/0x0/0x1bfc00000, data 0x50dc08e/0x51cb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:52.892754+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138231808 unmapped: 50315264 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 ms_handle_reset con 0x56377daf5000 session 0x56377c6430e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:53.892939+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 ms_handle_reset con 0x56377dfbac00 session 0x56377d2fb4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e4a3400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 ms_handle_reset con 0x56377e4a3400 session 0x56377d9b7c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138993664 unmapped: 49553408 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:54.893096+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 138993664 unmapped: 49553408 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:55.893234+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139067392 unmapped: 49479680 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:56.893422+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1921194 data_alloc: 184549376 data_used: 8171520
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 146 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139132928 unmapped: 49414144 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:57.893613+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 147 ms_handle_reset con 0x56377daf5000 session 0x56377dc623c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 147 heartbeat osd_stat(store_statfs(0x1b335d000/0x0/0x1bfc00000, data 0x66db6b6/0x67d0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139796480 unmapped: 48750592 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:58.893805+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 148 ms_handle_reset con 0x56377dfbac00 session 0x5637816e0780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 148 ms_handle_reset con 0x56377b7a3c00 session 0x56377d5d1c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139624448 unmapped: 48922624 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:59.893986+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 149 ms_handle_reset con 0x56377daf4c00 session 0x563781d7e960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e059800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e059400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 149 ms_handle_reset con 0x56377e059400 session 0x563781d7f0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139657216 unmapped: 48889856 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:00.894182+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 ms_handle_reset con 0x56377e059800 session 0x563781d7ed20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139714560 unmapped: 48832512 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:01.894390+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2042482 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 45
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 ms_handle_reset con 0x56377daf4c00 session 0x563781d7f4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 heartbeat osd_stat(store_statfs(0x1b334a000/0x0/0x1bfc00000, data 0x66e2875/0x67e1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dd9d000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.381918907s of 10.044230461s, submitted: 160
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140115968 unmapped: 48431104 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 ms_handle_reset con 0x56377dfbac00 session 0x563781d7eb40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:02.894507+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 ms_handle_reset con 0x56377daf5000 session 0x563781d7f860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 ms_handle_reset con 0x56377dd9d000 session 0x56377d621c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 ms_handle_reset con 0x56377daf4c00 session 0x56377d2fb4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 ms_handle_reset con 0x56377b7a3c00 session 0x563781d7f2c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139870208 unmapped: 48676864 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:03.894684+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 heartbeat osd_stat(store_statfs(0x1b25d3000/0x0/0x1bfc00000, data 0x745abcf/0x755a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 152 ms_handle_reset con 0x56377daf5000 session 0x56377dc874a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 139927552 unmapped: 48619520 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:04.894860+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 ms_handle_reset con 0x56377dfbac00 session 0x56377dc87a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e059800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 ms_handle_reset con 0x56377e059800 session 0x56377dc87680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140009472 unmapped: 48537600 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:05.895011+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 ms_handle_reset con 0x56377b7a3c00 session 0x56377dbfc1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 ms_handle_reset con 0x56377daf4c00 session 0x56377c6f72c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 ms_handle_reset con 0x56377daf5000 session 0x56377d5f7680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140042240 unmapped: 48504832 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:06.895219+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 ms_handle_reset con 0x56377dfbac00 session 0x56377d2f30e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377c530400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2050950 data_alloc: 184549376 data_used: 8134656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 154 ms_handle_reset con 0x56377dfd7c00 session 0x56377dac2f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140181504 unmapped: 48365568 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:07.895370+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 154 handle_osd_map epochs [154,155], i have 154, src has [1,155]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140222464 unmapped: 48324608 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:08.895564+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 156 ms_handle_reset con 0x56377b7a3800 session 0x56377fcee3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 156 heartbeat osd_stat(store_statfs(0x1b3f57000/0x0/0x1bfc00000, data 0x5ad62be/0x5bd6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 156 ms_handle_reset con 0x56377b7a3c00 session 0x56377fcef0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 156 ms_handle_reset con 0x56377daf5000 session 0x5637814603c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140312576 unmapped: 48234496 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:09.895758+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 157 ms_handle_reset con 0x56377daf4c00 session 0x56377d9b6960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140320768 unmapped: 48226304 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:10.895987+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfbac00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 157 handle_osd_map epochs [157,158], i have 157, src has [1,158]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 158 ms_handle_reset con 0x56377dfbac00 session 0x56377d9b6d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140394496 unmapped: 48152576 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:11.896243+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1905194 data_alloc: 184549376 data_used: 8138752
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.661634445s of 10.080890656s, submitted: 416
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 46
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140566528 unmapped: 47980544 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:12.896418+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 158 handle_osd_map epochs [158,159], i have 158, src has [1,159]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 159 ms_handle_reset con 0x56377b7a3800 session 0x5637816e1e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140582912 unmapped: 47964160 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:13.896547+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 159 heartbeat osd_stat(store_statfs(0x1b4928000/0x0/0x1bfc00000, data 0x50fb3e6/0x5203000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 159 ms_handle_reset con 0x56377b7a3c00 session 0x5637814601e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140582912 unmapped: 47964160 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:14.896772+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140582912 unmapped: 47964160 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:15.896993+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 159 heartbeat osd_stat(store_statfs(0x1b492c000/0x0/0x1bfc00000, data 0x50fb384/0x5202000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140582912 unmapped: 47964160 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:16.897233+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1903938 data_alloc: 184549376 data_used: 8134656
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140582912 unmapped: 47964160 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:17.897407+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:18.897586+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 140591104 unmapped: 47955968 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:19.897730+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148504576 unmapped: 40042496 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 ms_handle_reset con 0x56377daf4c00 session 0x56377dc625a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 ms_handle_reset con 0x56377daf5000 session 0x56378243d680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd6c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 ms_handle_reset con 0x56377dfd6c00 session 0x56377d2f2b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 ms_handle_reset con 0x56377b7a3800 session 0x56377c6f6780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 ms_handle_reset con 0x56377b7a3c00 session 0x5637816e0d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:20.897927+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142188544 unmapped: 46358528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 heartbeat osd_stat(store_statfs(0x1b30be000/0x0/0x1bfc00000, data 0x656666c/0x6670000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:21.898107+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142188544 unmapped: 46358528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2073390 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.539269447s of 10.017318726s, submitted: 163
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 ms_handle_reset con 0x56377daf4c00 session 0x56377d5f6b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:22.898318+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142188544 unmapped: 46358528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf5000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:23.898548+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142188544 unmapped: 46358528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 161 ms_handle_reset con 0x56377daf5000 session 0x56377d2f7680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:24.898688+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142229504 unmapped: 46317568 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd6800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:25.898850+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142229504 unmapped: 46317568 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 161 handle_osd_map epochs [161,162], i have 161, src has [1,162]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 162 ms_handle_reset con 0x56377dfd6800 session 0x56377d4a30e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 162 heartbeat osd_stat(store_statfs(0x1b30b7000/0x0/0x1bfc00000, data 0x6568a36/0x6675000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:26.899005+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142229504 unmapped: 46317568 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2086503 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 162 ms_handle_reset con 0x56377b7a3c00 session 0x56377da78960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:27.899177+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 142254080 unmapped: 46292992 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 162 heartbeat osd_stat(store_statfs(0x1b30b3000/0x0/0x1bfc00000, data 0x656adc4/0x667a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:28.899330+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 143998976 unmapped: 44548096 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:29.899475+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148627456 unmapped: 39919616 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:30.899720+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148627456 unmapped: 39919616 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:31.899980+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148627456 unmapped: 39919616 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2162823 data_alloc: 201326592 data_used: 18997248
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 162 heartbeat osd_stat(store_statfs(0x1b30b3000/0x0/0x1bfc00000, data 0x656adc4/0x667a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:32.900187+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148692992 unmapped: 39854080 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 162 handle_osd_map epochs [162,163], i have 162, src has [1,163]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.265985489s of 10.451216698s, submitted: 51
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:33.900372+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148692992 unmapped: 39854080 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:34.900511+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 163 heartbeat osd_stat(store_statfs(0x1b30af000/0x0/0x1bfc00000, data 0x656d012/0x667e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148692992 unmapped: 39854080 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 163 heartbeat osd_stat(store_statfs(0x1b30af000/0x0/0x1bfc00000, data 0x656d012/0x667e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:35.900714+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148692992 unmapped: 39854080 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:36.900948+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148692992 unmapped: 39854080 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2165825 data_alloc: 201326592 data_used: 18997248
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:37.901156+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 148692992 unmapped: 39854080 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:38.901333+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157155328 unmapped: 31391744 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:39.901485+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155598848 unmapped: 32948224 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 163 heartbeat osd_stat(store_statfs(0x1b1d92000/0x0/0x1bfc00000, data 0x788b012/0x799c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:40.901663+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154755072 unmapped: 33792000 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:41.901816+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154755072 unmapped: 33792000 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2321025 data_alloc: 201326592 data_used: 19185664
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 163 heartbeat osd_stat(store_statfs(0x1b1d76000/0x0/0x1bfc00000, data 0x78a1012/0x79b2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:42.901974+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154771456 unmapped: 33775616 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:43.902169+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154771456 unmapped: 33775616 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:44.902421+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154820608 unmapped: 33726464 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.016279221s of 12.649553299s, submitted: 198
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:45.902598+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154959872 unmapped: 33587200 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 163 heartbeat osd_stat(store_statfs(0x1b1d57000/0x0/0x1bfc00000, data 0x78c6012/0x79d7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:46.902764+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154976256 unmapped: 33570816 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2324815 data_alloc: 201326592 data_used: 19189760
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 ms_handle_reset con 0x56377daf4c00 session 0x56377c6cf0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:47.902956+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156049408 unmapped: 32497664 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 heartbeat osd_stat(store_statfs(0x1b1d4e000/0x0/0x1bfc00000, data 0x78c87d1/0x79df000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,2])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 ms_handle_reset con 0x56377dfd7000 session 0x563781461a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:48.903092+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156655616 unmapped: 31891456 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 heartbeat osd_stat(store_statfs(0x1b1d4e000/0x0/0x1bfc00000, data 0x78c87d1/0x79df000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 ms_handle_reset con 0x56377e561800 session 0x56377d4a25a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:49.903271+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156704768 unmapped: 31842304 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 165 ms_handle_reset con 0x56377dfd7400 session 0x56377fcef860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:50.903484+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157794304 unmapped: 30752768 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 165 heartbeat osd_stat(store_statfs(0x1b1d4d000/0x0/0x1bfc00000, data 0x78ca746/0x79e0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 165 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 166 ms_handle_reset con 0x56377b7a3c00 session 0x56377d2f6b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 166 ms_handle_reset con 0x56377dfd7000 session 0x56377fcee960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:51.903708+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157810688 unmapped: 30736384 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 166 ms_handle_reset con 0x56377e561800 session 0x56377b72c000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2347685 data_alloc: 201326592 data_used: 19746816
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 heartbeat osd_stat(store_statfs(0x1b1d3f000/0x0/0x1bfc00000, data 0x78d2b86/0x79ed000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 ms_handle_reset con 0x56377b428800 session 0x56377d505c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377d64f800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:52.903841+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 ms_handle_reset con 0x56377d64f800 session 0x56377c58e1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 ms_handle_reset con 0x56377b428400 session 0x56377c58f680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157884416 unmapped: 30662656 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 ms_handle_reset con 0x56377b428800 session 0x56377d505c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:53.903976+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 ms_handle_reset con 0x56377b7a3c00 session 0x56377d5f7680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 ms_handle_reset con 0x56377daf4c00 session 0x5637816e01e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157917184 unmapped: 30629888 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 ms_handle_reset con 0x56377b428c00 session 0x56377d2f50e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:54.904123+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157925376 unmapped: 30621696 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 ms_handle_reset con 0x56377b428800 session 0x56377d5d0960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 ms_handle_reset con 0x56377b7a3c00 session 0x56377d23d4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.064233780s of 10.036801338s, submitted: 228
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 168 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:55.904240+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 169 ms_handle_reset con 0x56377b428400 session 0x56377d23e3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157974528 unmapped: 30572544 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 169 ms_handle_reset con 0x56377dfd7000 session 0x56377fcef860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 169 handle_osd_map epochs [169,170], i have 169, src has [1,170]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 170 heartbeat osd_stat(store_statfs(0x1b1d31000/0x0/0x1bfc00000, data 0x78d98d3/0x79fb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:56.904373+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157974528 unmapped: 30572544 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 170 ms_handle_reset con 0x56377daf4c00 session 0x56377fcee3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2378468 data_alloc: 201326592 data_used: 19705856
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 170 ms_handle_reset con 0x56377daf4c00 session 0x563781d7f680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 170 handle_osd_map epochs [170,171], i have 170, src has [1,171]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 171 ms_handle_reset con 0x56377b428400 session 0x56377d9f9a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:57.904518+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157974528 unmapped: 30572544 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 172 ms_handle_reset con 0x56377b428800 session 0x56377d2fa780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:58.904674+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158031872 unmapped: 30515200 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 172 handle_osd_map epochs [172,173], i have 172, src has [1,173]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 173 ms_handle_reset con 0x56377dfd7000 session 0x56377b72d2c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:59.904941+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158031872 unmapped: 30515200 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 47
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:00.905197+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158220288 unmapped: 30326784 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:01.905371+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158261248 unmapped: 30285824 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 174 heartbeat osd_stat(store_statfs(0x1b1d18000/0x0/0x1bfc00000, data 0x78e7c71/0x7a14000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 174 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 174 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2400038 data_alloc: 201326592 data_used: 19718144
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 175 ms_handle_reset con 0x56377b214000 session 0x56378243da40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 175 ms_handle_reset con 0x56377b214400 session 0x56377c6f6780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:02.905550+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158343168 unmapped: 30203904 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 176 ms_handle_reset con 0x56377e561800 session 0x56377c6f7c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:03.905876+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158359552 unmapped: 30187520 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 177 ms_handle_reset con 0x56377b214000 session 0x56377e31b0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:04.906066+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158400512 unmapped: 30146560 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 177 handle_osd_map epochs [177,178], i have 177, src has [1,178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 178 ms_handle_reset con 0x56377b428400 session 0x56377d349680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:05.906778+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158400512 unmapped: 30146560 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.110803604s of 10.813267708s, submitted: 241
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 178 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 179 ms_handle_reset con 0x56377dfd7800 session 0x563781461c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 179 heartbeat osd_stat(store_statfs(0x1b1d08000/0x0/0x1bfc00000, data 0x78f0f35/0x7a25000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:06.906963+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158425088 unmapped: 30121984 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2415981 data_alloc: 201326592 data_used: 19718144
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:07.907358+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 179 handle_osd_map epochs [179,180], i have 179, src has [1,180]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158441472 unmapped: 30105600 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 180 ms_handle_reset con 0x56377b7a3800 session 0x56377d4a2000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 180 heartbeat osd_stat(store_statfs(0x1b1cfd000/0x0/0x1bfc00000, data 0x78f55e9/0x7a2e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:08.907496+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151158784 unmapped: 37388288 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 180 ms_handle_reset con 0x56377dfd7800 session 0x56377ade6d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:09.907658+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151240704 unmapped: 37306368 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 181 heartbeat osd_stat(store_statfs(0x1b44c8000/0x0/0x1bfc00000, data 0x512c92e/0x5263000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:10.907840+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151265280 unmapped: 37281792 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 181 handle_osd_map epochs [181,182], i have 181, src has [1,182]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 182 ms_handle_reset con 0x56377b214000 session 0x56377db343c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:11.908116+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151265280 unmapped: 37281792 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2051034 data_alloc: 184549376 data_used: 8716288
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:12.908454+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151134208 unmapped: 37412864 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 183 ms_handle_reset con 0x56377b214400 session 0x56377dbfc1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:13.908681+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151150592 unmapped: 37396480 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:14.908824+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 183 heartbeat osd_stat(store_statfs(0x1b44c6000/0x0/0x1bfc00000, data 0x513105e/0x5268000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 183 handle_osd_map epochs [183,184], i have 183, src has [1,184]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150478848 unmapped: 38068224 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:15.908984+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151527424 unmapped: 37019648 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:16.909314+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150478848 unmapped: 38068224 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 185 handle_osd_map epochs [185,186], i have 185, src has [1,186]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.869583130s of 10.623068810s, submitted: 269
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 186 ms_handle_reset con 0x56377b428400 session 0x56377d620960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2052106 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 186 ms_handle_reset con 0x56377b214000 session 0x56377b72dc20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 186 heartbeat osd_stat(store_statfs(0x1b44bb000/0x0/0x1bfc00000, data 0x5137b1e/0x5272000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:17.909602+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 186 handle_osd_map epochs [186,187], i have 186, src has [1,187]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150552576 unmapped: 37994496 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:18.910018+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150552576 unmapped: 37994496 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:19.910279+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b44b8000/0x0/0x1bfc00000, data 0x5139d36/0x5274000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150446080 unmapped: 38100992 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:20.910413+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150446080 unmapped: 38100992 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b44b9000/0x0/0x1bfc00000, data 0x5139d65/0x5273000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:21.910660+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150446080 unmapped: 38100992 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2054866 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:22.910856+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 187 handle_osd_map epochs [187,188], i have 187, src has [1,188]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:23.911051+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:24.911226+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b44b5000/0x0/0x1bfc00000, data 0x513c129/0x5279000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b44b5000/0x0/0x1bfc00000, data 0x513c129/0x5279000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:25.911365+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:26.911487+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2057860 data_alloc: 184549376 data_used: 8159232
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b44b5000/0x0/0x1bfc00000, data 0x513c129/0x5279000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:27.911713+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:28.911876+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.508609772s of 11.772043228s, submitted: 90
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150388736 unmapped: 38158336 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 ms_handle_reset con 0x56377b214400 session 0x56377d249e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:29.912114+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150323200 unmapped: 38223872 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 ms_handle_reset con 0x56377b428400 session 0x56377c6cf2c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:30.912238+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150315008 unmapped: 38232064 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:31.912399+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150315008 unmapped: 38232064 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b44b3000/0x0/0x1bfc00000, data 0x513c2ca/0x527b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2063876 data_alloc: 184549376 data_used: 8159232
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 ms_handle_reset con 0x56377b7a3800 session 0x56377d2f7a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:32.912548+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 ms_handle_reset con 0x56377dfd7800 session 0x56377c6cfe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150380544 unmapped: 38166528 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b433b000/0x0/0x1bfc00000, data 0x52b6232/0x53f3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 ms_handle_reset con 0x56377b214000 session 0x56377c5c2780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:33.912706+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 36749312 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:34.912841+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 36749312 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:35.913064+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 36749312 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 188 handle_osd_map epochs [188,189], i have 188, src has [1,189]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 ms_handle_reset con 0x56377b214400 session 0x56377ade81e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:36.913214+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 36732928 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2167484 data_alloc: 184549376 data_used: 8171520
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:37.913368+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 heartbeat osd_stat(store_statfs(0x1b386a000/0x0/0x1bfc00000, data 0x5d8269d/0x5ec3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 ms_handle_reset con 0x56377b428400 session 0x563781d7fe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151822336 unmapped: 36724736 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:38.913509+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150880256 unmapped: 37666816 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 ms_handle_reset con 0x56377b7a3800 session 0x563781d7fa40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.106908798s of 10.540288925s, submitted: 101
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:39.913664+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150880256 unmapped: 37666816 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:40.913863+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 heartbeat osd_stat(store_statfs(0x1b386b000/0x0/0x1bfc00000, data 0x5d826c6/0x5ec2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150880256 unmapped: 37666816 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:41.914125+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150880256 unmapped: 37666816 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2168419 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 189 handle_osd_map epochs [189,190], i have 189, src has [1,190]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:42.914312+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 heartbeat osd_stat(store_statfs(0x1b3866000/0x0/0x1bfc00000, data 0x5d84af5/0x5ec7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150896640 unmapped: 37650432 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:43.914518+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150913024 unmapped: 37634048 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:44.914955+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e561800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377e561800 session 0x56377e31a780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150929408 unmapped: 37617664 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b214000 session 0x56377da785a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:45.915159+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150929408 unmapped: 37617664 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b214400 session 0x56377d620780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:46.915303+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b428400 session 0x56377fcefe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 37568512 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2175423 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:47.915407+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b7a3800 session 0x56377d2cdc20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 heartbeat osd_stat(store_statfs(0x1b3866000/0x0/0x1bfc00000, data 0x5d84acf/0x5ec7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 37568512 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:48.915539+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151011328 unmapped: 37535744 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.579089165s of 10.002539635s, submitted: 101
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b428800 session 0x56377d2f6f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 heartbeat osd_stat(store_statfs(0x1b3864000/0x0/0x1bfc00000, data 0x5d84c67/0x5eca000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:49.915693+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151011328 unmapped: 37535744 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:50.915846+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 heartbeat osd_stat(store_statfs(0x1b3865000/0x0/0x1bfc00000, data 0x5d84c44/0x5ec9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151027712 unmapped: 37519360 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:51.916092+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151027712 unmapped: 37519360 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2178839 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b214000 session 0x56378243cd20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:52.916244+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 ms_handle_reset con 0x56377b214400 session 0x56377c58e000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151035904 unmapped: 37511168 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:53.916374+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 handle_osd_map epochs [190,191], i have 190, src has [1,191]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 190 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151076864 unmapped: 37470208 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 191 ms_handle_reset con 0x56377b7a3800 session 0x56377db34780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 191 ms_handle_reset con 0x56377b428400 session 0x56377d2f3c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377daf4c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 191 ms_handle_reset con 0x56377daf4c00 session 0x56377c58e3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 191 heartbeat osd_stat(store_statfs(0x1b3864000/0x0/0x1bfc00000, data 0x5d84d9a/0x5eca000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:54.916502+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151085056 unmapped: 37462016 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 191 ms_handle_reset con 0x56377b214400 session 0x56377d2cd0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:55.916651+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 191 handle_osd_map epochs [191,192], i have 191, src has [1,192]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151101440 unmapped: 37445632 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 192 ms_handle_reset con 0x56377b214000 session 0x56377d4a3e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:56.916830+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 192 heartbeat osd_stat(store_statfs(0x1b345f000/0x0/0x1bfc00000, data 0x5d87261/0x5ecd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 192 ms_handle_reset con 0x56377b428400 session 0x56378243d860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151117824 unmapped: 37429248 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2191646 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 192 ms_handle_reset con 0x56377b7a3800 session 0x56378243da40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 192 ms_handle_reset con 0x56377dfd7000 session 0x56377d5d1680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:57.916963+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 193 ms_handle_reset con 0x56377dfd7000 session 0x56377d2f6780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 193 heartbeat osd_stat(store_statfs(0x1b3459000/0x0/0x1bfc00000, data 0x5d8b9e5/0x5ed4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151117824 unmapped: 37429248 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:58.917096+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152174592 unmapped: 36372480 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:59.917243+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.081675529s of 10.685091019s, submitted: 212
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 195 ms_handle_reset con 0x56377b214000 session 0x56377d23d4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152174592 unmapped: 36372480 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:00.917398+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152182784 unmapped: 36364288 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 195 ms_handle_reset con 0x56377b214400 session 0x56377fb0c780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:01.917585+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 196 ms_handle_reset con 0x56377b428400 session 0x56377fb0de00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152190976 unmapped: 36356096 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2201414 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:02.917768+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152190976 unmapped: 36356096 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:03.918341+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 197 heartbeat osd_stat(store_statfs(0x1b3450000/0x0/0x1bfc00000, data 0x5d9256e/0x5ede000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 198 ms_handle_reset con 0x56377b7a3800 session 0x56377d2f2780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152199168 unmapped: 36347904 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:04.918489+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 198 ms_handle_reset con 0x56377b214000 session 0x56377db35c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 199 ms_handle_reset con 0x56377b214400 session 0x563781d7e960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152199168 unmapped: 36347904 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:05.919017+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152199168 unmapped: 36347904 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 199 ms_handle_reset con 0x56377dfd7000 session 0x56377d2f54a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:06.919227+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 200 handle_osd_map epochs [199,200], i have 200, src has [1,200]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 200 ms_handle_reset con 0x56377b428400 session 0x56377fcee5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152199168 unmapped: 36347904 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2220018 data_alloc: 184549376 data_used: 8171520
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:07.919529+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152199168 unmapped: 36347904 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:08.919720+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377ee57400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 201 ms_handle_reset con 0x56377ee57400 session 0x5637816e0f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 202 ms_handle_reset con 0x56377b214000 session 0x56377e31b2c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 36691968 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:09.919950+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 202 heartbeat osd_stat(store_statfs(0x1b343a000/0x0/0x1bfc00000, data 0x5d9d662/0x5ef4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 36691968 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.829440117s of 10.352311134s, submitted: 179
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 202 ms_handle_reset con 0x56377b214400 session 0x56377d53fe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 202 heartbeat osd_stat(store_statfs(0x1b3434000/0x0/0x1bfc00000, data 0x5d9fb02/0x5ef9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:10.920070+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151887872 unmapped: 36659200 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 203 ms_handle_reset con 0x56377b428400 session 0x56377d53fa40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:11.920220+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 203 ms_handle_reset con 0x56377dfd7000 session 0x56378243de00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377df87400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151961600 unmapped: 36585472 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 204 ms_handle_reset con 0x56377df87400 session 0x56377c6425a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2236560 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 204 heartbeat osd_stat(store_statfs(0x1b3430000/0x0/0x1bfc00000, data 0x5da2083/0x5efe000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:12.920346+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 204 ms_handle_reset con 0x56377b214000 session 0x56377c643c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b428400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 204 ms_handle_reset con 0x56377b428400 session 0x56377dc87680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 205 ms_handle_reset con 0x56377b214400 session 0x56377d349680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151994368 unmapped: 36552704 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:13.920540+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 205 handle_osd_map epochs [205,206], i have 205, src has [1,206]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152010752 unmapped: 36536320 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377df87400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 206 ms_handle_reset con 0x56377df87400 session 0x5637816e1a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:14.920706+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152027136 unmapped: 36519936 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:15.920843+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd7000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 206 heartbeat osd_stat(store_statfs(0x1b3423000/0x0/0x1bfc00000, data 0x5da8d05/0x5f0b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152027136 unmapped: 36519936 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 207 ms_handle_reset con 0x56377dfd7000 session 0x56377d504000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:16.921069+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 151912448 unmapped: 36634624 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2259441 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:17.921218+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 208 handle_osd_map epochs [208,209], i have 208, src has [1,209]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152002560 unmapped: 36544512 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:18.921390+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 209 ms_handle_reset con 0x56377b214000 session 0x56377d23d0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 209 ms_handle_reset con 0x56377b214400 session 0x56377db35680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152043520 unmapped: 36503552 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:19.921550+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 209 heartbeat osd_stat(store_statfs(0x1b3412000/0x0/0x1bfc00000, data 0x5daf9cc/0x5f19000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152043520 unmapped: 36503552 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.115362167s of 10.153011322s, submitted: 344
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:20.921745+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152092672 unmapped: 36454400 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 209 heartbeat osd_stat(store_statfs(0x1b3411000/0x0/0x1bfc00000, data 0x5daf9fb/0x5f18000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:21.921952+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 152100864 unmapped: 36446208 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2263857 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:22.922163+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 210 ms_handle_reset con 0x56377b7a3c00 session 0x5637816e0d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153239552 unmapped: 35307520 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 210 heartbeat osd_stat(store_statfs(0x1b3411000/0x0/0x1bfc00000, data 0x5db1c87/0x5f1b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:23.922350+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 48
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153075712 unmapped: 35471360 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:24.922514+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153075712 unmapped: 35471360 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:25.922707+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 211 heartbeat osd_stat(store_statfs(0x1b340a000/0x0/0x1bfc00000, data 0x5db431e/0x5f22000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153083904 unmapped: 35463168 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:26.922821+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153092096 unmapped: 35454976 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2275157 data_alloc: 184549376 data_used: 8163328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 211 ms_handle_reset con 0x56377e058c00 session 0x56377d53ef00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:27.923010+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153092096 unmapped: 35454976 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:28.923164+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 211 handle_osd_map epochs [211,212], i have 211, src has [1,212]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153124864 unmapped: 35422208 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:29.923363+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377c6db800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153124864 unmapped: 35422208 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 212 heartbeat osd_stat(store_statfs(0x1b3407000/0x0/0x1bfc00000, data 0x5db67ed/0x5f25000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:30.923501+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.804990768s of 10.444905281s, submitted: 510
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b703000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 212 ms_handle_reset con 0x56377b703000 session 0x56377d2f3a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153124864 unmapped: 35422208 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:31.923679+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 212 ms_handle_reset con 0x56377b214000 session 0x563781461680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153133056 unmapped: 35414016 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 49
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 212 ms_handle_reset con 0x56377b214400 session 0x56377dbfd680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2281799 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:32.923837+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 153174016 unmapped: 35373056 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:33.924020+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154222592 unmapped: 34324480 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:34.924150+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 154230784 unmapped: 34316288 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:35.924278+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 213 ms_handle_reset con 0x56377b7a3c00 session 0x56377c6430e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155287552 unmapped: 33259520 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 213 heartbeat osd_stat(store_statfs(0x1b3407000/0x0/0x1bfc00000, data 0x5db8ebc/0x5f27000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:36.924392+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 213 heartbeat osd_stat(store_statfs(0x1b3405000/0x0/0x1bfc00000, data 0x5db8f94/0x5f29000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155287552 unmapped: 33259520 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2295325 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:37.924519+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 213 handle_osd_map epochs [213,214], i have 213, src has [1,214]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 heartbeat osd_stat(store_statfs(0x1b3403000/0x0/0x1bfc00000, data 0x5db9006/0x5f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377e058c00 session 0x56377d53e3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155312128 unmapped: 33234944 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b703c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377b703c00 session 0x563780d2c000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 heartbeat osd_stat(store_statfs(0x1b3403000/0x0/0x1bfc00000, data 0x5db9006/0x5f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:38.924682+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155320320 unmapped: 33226752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:39.924820+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155320320 unmapped: 33226752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:40.924958+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155320320 unmapped: 33226752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:41.925112+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155320320 unmapped: 33226752 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2292520 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:42.925252+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155328512 unmapped: 33218560 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 heartbeat osd_stat(store_statfs(0x1b3403000/0x0/0x1bfc00000, data 0x5dbb18f/0x5f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:43.925440+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155328512 unmapped: 33218560 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 heartbeat osd_stat(store_statfs(0x1b3403000/0x0/0x1bfc00000, data 0x5dbb18f/0x5f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 13.131130219s of 13.646506310s, submitted: 169
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:44.925613+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b703c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377b703c00 session 0x563780d2c1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155336704 unmapped: 33210368 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:45.925758+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377b214000 session 0x56377d3490e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377b214400 session 0x56377d6205a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155336704 unmapped: 33210368 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377e058000 session 0x563780d2c5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:46.925964+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155361280 unmapped: 33185792 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 heartbeat osd_stat(store_statfs(0x1b33fe000/0x0/0x1bfc00000, data 0x5dbb29d/0x5f2e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2302506 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:47.926116+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155361280 unmapped: 33185792 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:48.926241+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 ms_handle_reset con 0x56377e058c00 session 0x563780d2c780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155361280 unmapped: 33185792 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 215 ms_handle_reset con 0x56377e058000 session 0x563780e6a3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:49.926380+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 215 ms_handle_reset con 0x56377b214400 session 0x563780e6a5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155516928 unmapped: 33030144 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 ms_handle_reset con 0x56377b214000 session 0x563780e6b860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:50.926532+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 ms_handle_reset con 0x56377e058c00 session 0x563780d2c960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b703c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 ms_handle_reset con 0x56377b703c00 session 0x563780d2cb40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155590656 unmapped: 32956416 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b703c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 ms_handle_reset con 0x56377b703c00 session 0x563780d2cd20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:51.926695+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 heartbeat osd_stat(store_statfs(0x1b33d0000/0x0/0x1bfc00000, data 0x5de1944/0x5f5d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155688960 unmapped: 32858112 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2324663 data_alloc: 184549376 data_used: 8212480
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:52.926818+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 ms_handle_reset con 0x56377b214400 session 0x563780e6be00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 155926528 unmapped: 32620544 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 217 ms_handle_reset con 0x56377b214000 session 0x563780d2cf00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:53.926978+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 217 handle_osd_map epochs [217,218], i have 217, src has [1,218]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156164096 unmapped: 32382976 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.230495453s of 10.037485123s, submitted: 152
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:54.927110+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b7a3c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 218 ms_handle_reset con 0x56377dfb8c00 session 0x563780d8be00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156237824 unmapped: 32309248 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 219 ms_handle_reset con 0x56377b7a3c00 session 0x56377d53ed20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 219 ms_handle_reset con 0x56377e058c00 session 0x5637807fe780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 219 ms_handle_reset con 0x56377e058000 session 0x56377dc623c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:55.927285+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156401664 unmapped: 32145408 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:56.927435+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 156409856 unmapped: 32137216 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 220 ms_handle_reset con 0x56377b214000 session 0x563781752d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 220 ms_handle_reset con 0x56377b214400 session 0x5637807fed20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2351451 data_alloc: 184549376 data_used: 8212480
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b703c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:57.927601+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 220 ms_handle_reset con 0x56377b703c00 session 0x5637817530e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 220 heartbeat osd_stat(store_statfs(0x1b337d000/0x0/0x1bfc00000, data 0x5e2f815/0x5fb1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157007872 unmapped: 31539200 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 ms_handle_reset con 0x56377dfb8c00 session 0x5637807ff0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:58.927719+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 ms_handle_reset con 0x56377b214000 session 0x5637817532c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 ms_handle_reset con 0x56377b214400 session 0x5637807ff680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157057024 unmapped: 31490048 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:59.927881+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157302784 unmapped: 31244288 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 222 ms_handle_reset con 0x56377e058000 session 0x563781753c20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:00.928020+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 222 heartbeat osd_stat(store_statfs(0x1b2f28000/0x0/0x1bfc00000, data 0x5e88fdf/0x6006000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157351936 unmapped: 31195136 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 ms_handle_reset con 0x56377e058c00 session 0x563781753e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:01.928202+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 ms_handle_reset con 0x56377e058c00 session 0x563781752000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 heartbeat osd_stat(store_statfs(0x1b2f25000/0x0/0x1bfc00000, data 0x5e8b431/0x6008000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157384704 unmapped: 31162368 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2364158 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:02.928344+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 ms_handle_reset con 0x56377b214400 session 0x5637807ffc20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 ms_handle_reset con 0x56377b214000 session 0x56377d2f5860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157425664 unmapped: 31121408 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 224 ms_handle_reset con 0x56377e058000 session 0x5637807ffe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:03.928539+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 224 ms_handle_reset con 0x56377dfb9000 session 0x563781460000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain systemd-journald[48157]: Data hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Nov 23 10:14:53 np0005532585.localdomain systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 225 ms_handle_reset con 0x56377b214000 session 0x56377fd08000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 225 ms_handle_reset con 0x56377b214400 session 0x5637817521e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 225 ms_handle_reset con 0x56377e058000 session 0x56377d6414a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157515776 unmapped: 31031296 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:04.928754+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.261348724s of 10.448229790s, submitted: 398
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 226 ms_handle_reset con 0x56377dfb9000 session 0x56377fd081e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 226 ms_handle_reset con 0x56377dfb8c00 session 0x563780e6b0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157532160 unmapped: 31014912 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:05.929004+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157564928 unmapped: 30982144 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 226 heartbeat osd_stat(store_statfs(0x1b2f0b000/0x0/0x1bfc00000, data 0x5e952af/0x6020000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:06.929157+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 226 ms_handle_reset con 0x56377dfb8c00 session 0x5637817523c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 157630464 unmapped: 30916608 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 227 ms_handle_reset con 0x56377b214000 session 0x56377fd08b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2392533 data_alloc: 184549376 data_used: 8212480
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:07.929320+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 227 heartbeat osd_stat(store_statfs(0x1b2f0a000/0x0/0x1bfc00000, data 0x5e97625/0x6023000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 227 handle_osd_map epochs [227,228], i have 227, src has [1,228]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 158285824 unmapped: 30261248 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:08.929509+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 159440896 unmapped: 29106176 heap: 188547072 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 229 heartbeat osd_stat(store_statfs(0x1b26e2000/0x0/0x1bfc00000, data 0x66bd9dd/0x684c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 229 ms_handle_reset con 0x56377dfb9000 session 0x56377d2ab860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:09.929630+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 229 ms_handle_reset con 0x56377e058c00 session 0x56377da78d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 230 ms_handle_reset con 0x56377e058000 session 0x56377d2aba40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 159498240 unmapped: 37445632 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 230 heartbeat osd_stat(store_statfs(0x1b16df000/0x0/0x1bfc00000, data 0x76bfd44/0x784e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:10.929780+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 159416320 unmapped: 37527552 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:11.929979+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 159498240 unmapped: 37445632 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2891776 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:12.930128+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 169033728 unmapped: 27910144 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:13.930276+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 231 heartbeat osd_stat(store_statfs(0x1ad6b1000/0x0/0x1bfc00000, data 0xb6ed152/0xb87c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168927232 unmapped: 28016640 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 231 heartbeat osd_stat(store_statfs(0x1ac6b1000/0x0/0x1bfc00000, data 0xc6ed1b7/0xc87c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:14.930442+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.869328499s of 10.154787064s, submitted: 287
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 161439744 unmapped: 35504128 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:15.930817+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 161439744 unmapped: 35504128 heap: 196943872 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:16.930975+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 231 ms_handle_reset con 0x56377b214000 session 0x56377d2abe00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 169730048 unmapped: 35610624 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3609661 data_alloc: 184549376 data_used: 8204288
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:17.931140+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 161431552 unmapped: 43909120 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:18.931313+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 161447936 unmapped: 43892736 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 232 ms_handle_reset con 0x56377dfb9000 session 0x56377da790e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:19.931462+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 232 ms_handle_reset con 0x56377e058000 session 0x56377dac32c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563780065800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 233 heartbeat osd_stat(store_statfs(0x1a5663000/0x0/0x1bfc00000, data 0x137316b0/0x138c9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177299456 unmapped: 28041216 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 233 ms_handle_reset con 0x563780065800 session 0x56377b72c3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:20.931629+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563780065400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 234 ms_handle_reset con 0x563780065400 session 0x56377d4a2000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 234 ms_handle_reset con 0x56377dfb8c00 session 0x56377f9ce3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 160940032 unmapped: 44400640 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:21.931953+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 160948224 unmapped: 44392448 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4074628 data_alloc: 184549376 data_used: 8163328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:22.932090+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 234 ms_handle_reset con 0x56377b214000 session 0x56377f9cf0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 234 handle_osd_map epochs [234,235], i have 234, src has [1,235]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 170434560 unmapped: 34906112 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:23.932251+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 235 ms_handle_reset con 0x56377e058000 session 0x56377d5f6960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 236 ms_handle_reset con 0x56377dfb9000 session 0x56377f9cfa40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 171483136 unmapped: 33857536 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:24.932398+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563780065800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781963c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 236 ms_handle_reset con 0x563781963c00 session 0x56377c5c30e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.730299950s of 10.027003288s, submitted: 249
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 237 ms_handle_reset con 0x563780065800 session 0x56377f9cef00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 163176448 unmapped: 42164224 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:25.932527+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 237 ms_handle_reset con 0x56377b214000 session 0x56377ade9a40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 237 ms_handle_reset con 0x56377dfb8c00 session 0x56377d2ab2c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 237 heartbeat osd_stat(store_statfs(0x19e46d000/0x0/0x1bfc00000, data 0x197821ea/0x1991f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 163348480 unmapped: 41992192 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:26.932694+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 171712512 unmapped: 33628160 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4849052 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:27.932848+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 171794432 unmapped: 33546240 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:28.933017+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 ms_handle_reset con 0x56377dfb9000 session 0x56377dc641e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 heartbeat osd_stat(store_statfs(0x19ac20000/0x0/0x1bfc00000, data 0x1cfce06a/0x1d16d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 171802624 unmapped: 33538048 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:29.933177+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175964160 unmapped: 29376512 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e058000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 ms_handle_reset con 0x56377e058000 session 0x56377d620d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:30.933305+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 163282944 unmapped: 42057728 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:31.933472+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 ms_handle_reset con 0x56377b214000 session 0x56377d349860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb8c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 ms_handle_reset con 0x56377dfb8c00 session 0x56377d9f8d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 164380672 unmapped: 40960000 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5312956 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:32.933643+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 ms_handle_reset con 0x56377dfb9000 session 0x56377d249e00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563780065800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 238 handle_osd_map epochs [238,239], i have 238, src has [1,239]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 239 heartbeat osd_stat(store_statfs(0x197bd9000/0x0/0x1bfc00000, data 0x20010e4f/0x201b4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,1,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 239 ms_handle_reset con 0x563780065800 session 0x56377d3483c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 172834816 unmapped: 32505856 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:33.933786+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563781962800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 239 ms_handle_reset con 0x563781962800 session 0x56377dc86960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 172916736 unmapped: 32423936 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:34.935032+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.526556969s of 10.018631935s, submitted: 294
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 164741120 unmapped: 40599552 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:35.935229+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 164904960 unmapped: 40435712 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:36.935399+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 240 ms_handle_reset con 0x56377b214000 session 0x56377d2fc780
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 240 handle_osd_map epochs [240,241], i have 240, src has [1,241]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 173735936 unmapped: 31604736 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 5830338 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:37.935576+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 heartbeat osd_stat(store_statfs(0x194353000/0x0/0x1bfc00000, data 0x23892e7f/0x23a3a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x56377dfb9000 session 0x56377e31a3c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174104576 unmapped: 31236096 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 heartbeat osd_stat(store_statfs(0x192b36000/0x0/0x1bfc00000, data 0x250ae7a5/0x25256000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:38.935749+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563780065800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x563780065800 session 0x56377d6401e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e55f800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175480832 unmapped: 29859840 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:39.935914+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x56377e55f800 session 0x56377f9cf2c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e437400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175620096 unmapped: 29720576 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x56377e437400 session 0x56377d3492c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:40.936048+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 heartbeat osd_stat(store_statfs(0x190304000/0x0/0x1bfc00000, data 0x278e3e36/0x27a8a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x56377b214000 session 0x56377d2cdc20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 167575552 unmapped: 37765120 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:41.937103+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 167804928 unmapped: 37535744 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x56377dfb9000 session 0x563781de74a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e55f800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 6452968 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:42.937242+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 ms_handle_reset con 0x56377e55f800 session 0x56377dc872c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 167944192 unmapped: 37396480 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:43.937370+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 heartbeat osd_stat(store_statfs(0x18d2c6000/0x0/0x1bfc00000, data 0x2a91fd53/0x2aac7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168026112 unmapped: 37314560 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:44.937519+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x563780065800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 ms_handle_reset con 0x563780065800 session 0x563780d2d0e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.742905617s of 10.184217453s, submitted: 253
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168534016 unmapped: 36806656 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:45.937663+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e436c00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 ms_handle_reset con 0x56377e436c00 session 0x5637817534a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 169771008 unmapped: 35569664 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:46.937979+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178200576 unmapped: 27140096 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:47.938156+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 6822612 data_alloc: 184549376 data_used: 8163328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185360384 unmapped: 19980288 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:48.938291+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 ms_handle_reset con 0x56377b214000 session 0x56377d620960
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178020352 unmapped: 27320320 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 heartbeat osd_stat(store_statfs(0x189637000/0x0/0x1bfc00000, data 0x2e1adb05/0x2e357000, compress 0x0/0x0/0x0, omap 0x647, meta 0x826f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:49.938430+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 ms_handle_reset con 0x56377dfb9000 session 0x563781de6d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e55f800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 ms_handle_reset con 0x56377e55f800 session 0x5637814614a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 ms_handle_reset con 0x56377b214400 session 0x56377d2aa5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 170459136 unmapped: 34881536 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd8400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:50.938575+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 ms_handle_reset con 0x56377dfd8400 session 0x56377db34d20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 170778624 unmapped: 34562048 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:51.938739+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd8400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 ms_handle_reset con 0x56377b214000 session 0x56377ade9680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 170409984 unmapped: 34930688 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:52.938866+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 7031174 data_alloc: 184549376 data_used: 8183808
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377b214400 session 0x56377dbfd860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 heartbeat osd_stat(store_statfs(0x18837c000/0x0/0x1bfc00000, data 0x2f467ad1/0x2f612000, compress 0x0/0x0/0x0, omap 0x647, meta 0x826f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377dfd8400 session 0x563781de7860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377c6db800 session 0x56377e31b860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377e55f800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377dfb9000 session 0x56377d2fc5a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377e55f800 session 0x56377d2cc1e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168198144 unmapped: 37142528 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 heartbeat osd_stat(store_statfs(0x1afb66000/0x0/0x1bfc00000, data 0x6c7c241/0x6e26000, compress 0x0/0x0/0x0, omap 0x647, meta 0x826f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:53.938989+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377dfb9000 session 0x56377d23cb40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 heartbeat osd_stat(store_statfs(0x1afb66000/0x0/0x1bfc00000, data 0x6c7c241/0x6e26000, compress 0x0/0x0/0x0, omap 0x647, meta 0x826f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377b214000 session 0x56377d3481e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 50
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 ms_handle_reset con 0x56377b214400 session 0x56377b72cd20
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168427520 unmapped: 36913152 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:54.939134+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168517632 unmapped: 36823040 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:55.939291+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.485372543s of 10.900598526s, submitted: 608
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168517632 unmapped: 36823040 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:56.939429+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168632320 unmapped: 36708352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:57.939575+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2619801 data_alloc: 184549376 data_used: 8183808
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168632320 unmapped: 36708352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:58.939752+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b2585000/0x0/0x1bfc00000, data 0x625e322/0x6408000, compress 0x0/0x0/0x0, omap 0x647, meta 0x726f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168632320 unmapped: 36708352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:59.939946+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168632320 unmapped: 36708352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:00.940114+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377c6db800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 245 ms_handle_reset con 0x56377c6db800 session 0x56377d4a3680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168632320 unmapped: 36708352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:01.940316+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168771584 unmapped: 36569088 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:02.940474+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2724528 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b1d42000/0x0/0x1bfc00000, data 0x6a9f3f8/0x6c4c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x726f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 246 ms_handle_reset con 0x56377b214000 session 0x56377f9cf4a0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168435712 unmapped: 36904960 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:03.940611+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 246 ms_handle_reset con 0x56377b214400 session 0x56377ade90e0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377c6db800
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 246 heartbeat osd_stat(store_statfs(0x1b18b7000/0x0/0x1bfc00000, data 0x6f258c3/0x70d6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x726f9b9), peers [0,1,2,4,5] op hist [1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 246 ms_handle_reset con 0x56377c6db800 session 0x56377c6f6f00
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168452096 unmapped: 36888576 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:04.940760+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfb9000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 247 ms_handle_reset con 0x56377dfb9000 session 0x563781461680
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168574976 unmapped: 36765696 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:05.940944+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.175250053s of 10.007154465s, submitted: 207
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168763392 unmapped: 36577280 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:06.941094+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 247 heartbeat osd_stat(store_statfs(0x1b24b2000/0x0/0x1bfc00000, data 0x632ffc6/0x64dc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x726f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168779776 unmapped: 36560896 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:07.941238+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2648727 data_alloc: 184549376 data_used: 8142848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 247 heartbeat osd_stat(store_statfs(0x1b2498000/0x0/0x1bfc00000, data 0x6349fa9/0x64f6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x726f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 168902656 unmapped: 36438016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:08.941397+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 170254336 unmapped: 35086336 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:09.941548+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 172474368 unmapped: 32866304 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:10.941711+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 172531712 unmapped: 32808960 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:11.941958+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 247 heartbeat osd_stat(store_statfs(0x1b12b9000/0x0/0x1bfc00000, data 0x6386abb/0x6534000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 172613632 unmapped: 32727040 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:12.942300+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2653963 data_alloc: 184549376 data_used: 8142848
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 247 handle_osd_map epochs [247,248], i have 247, src has [1,248]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 248 heartbeat osd_stat(store_statfs(0x1b1294000/0x0/0x1bfc00000, data 0x63aa2f2/0x6559000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 173539328 unmapped: 31801344 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:13.942438+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 248 handle_osd_map epochs [248,249], i have 248, src has [1,249]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 173563904 unmapped: 31776768 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:14.942589+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 173752320 unmapped: 31588352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:15.942729+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 249 heartbeat osd_stat(store_statfs(0x1b1253000/0x0/0x1bfc00000, data 0x63e8d54/0x659a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 249 handle_osd_map epochs [249,250], i have 249, src has [1,250]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.532207489s of 10.021286964s, submitted: 120
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:16.942850+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 173760512 unmapped: 31580160 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:17.942997+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 173785088 unmapped: 31555584 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2671318 data_alloc: 184549376 data_used: 8167424
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:18.943117+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174080000 unmapped: 31260672 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 250 heartbeat osd_stat(store_statfs(0x1b11ff000/0x0/0x1bfc00000, data 0x643ca16/0x65ef000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:19.943311+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174080000 unmapped: 31260672 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:20.943464+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175136768 unmapped: 30203904 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:21.943724+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174571520 unmapped: 30769152 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:22.943960+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174571520 unmapped: 30769152 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2677746 data_alloc: 184549376 data_used: 8167424
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:23.944166+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174628864 unmapped: 30711808 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b1176000/0x0/0x1bfc00000, data 0x64c3beb/0x6678000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:24.944470+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174915584 unmapped: 30425088 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:25.944667+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174915584 unmapped: 30425088 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.461873055s of 10.002785683s, submitted: 133
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b1154000/0x0/0x1bfc00000, data 0x64e484d/0x669a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:26.944791+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 174923776 unmapped: 30416896 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b1123000/0x0/0x1bfc00000, data 0x6516301/0x66cb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:27.944960+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176185344 unmapped: 29155328 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2698794 data_alloc: 184549376 data_used: 8179712
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:28.945166+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176226304 unmapped: 29114368 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:29.945349+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176226304 unmapped: 29114368 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:30.945496+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175276032 unmapped: 30064640 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:31.945655+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175284224 unmapped: 30056448 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:32.945792+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 175284224 unmapped: 30056448 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2712014 data_alloc: 184549376 data_used: 8192000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b1083000/0x0/0x1bfc00000, data 0x65b30d4/0x676a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b1072000/0x0/0x1bfc00000, data 0x65c5217/0x677c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:33.945952+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176513024 unmapped: 28827648 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:34.946126+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176513024 unmapped: 28827648 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:35.946331+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176775168 unmapped: 28565504 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.293190002s of 10.002427101s, submitted: 193
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:36.946488+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 176996352 unmapped: 28344320 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b1026000/0x0/0x1bfc00000, data 0x66110c3/0x67c7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:37.946648+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2707640 data_alloc: 184549376 data_used: 8192000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:38.946827+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b1024000/0x0/0x1bfc00000, data 0x66132db/0x67ca000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:39.946987+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:40.947148+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:41.947304+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:42.947455+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2715876 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b101e000/0x0/0x1bfc00000, data 0x6615745/0x67cf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:43.947572+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b101e000/0x0/0x1bfc00000, data 0x6615745/0x67cf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:44.947732+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b1020000/0x0/0x1bfc00000, data 0x66156e4/0x67ce000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:45.947865+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.758735657s of 10.008411407s, submitted: 78
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:46.948025+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:47.948179+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177004544 unmapped: 28336128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b101f000/0x0/0x1bfc00000, data 0x6615849/0x67cf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2719450 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 254 handle_osd_map epochs [254,255], i have 254, src has [1,255]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:48.948810+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177012736 unmapped: 28327936 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b1018000/0x0/0x1bfc00000, data 0x6617bcd/0x67d5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:49.948986+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177012736 unmapped: 28327936 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:50.949159+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 28319744 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:51.949344+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177020928 unmapped: 28319744 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b101a000/0x0/0x1bfc00000, data 0x6617c2b/0x67d3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:52.949512+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177029120 unmapped: 28311552 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2720174 data_alloc: 184549376 data_used: 8163328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:53.949670+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177029120 unmapped: 28311552 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:54.949784+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177037312 unmapped: 28303360 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:55.953019+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177037312 unmapped: 28303360 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:56.953128+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177037312 unmapped: 28303360 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b1016000/0x0/0x1bfc00000, data 0x661a1f5/0x67d7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.736358643s of 11.010452271s, submitted: 79
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:57.953233+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177037312 unmapped: 28303360 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2727222 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:58.953355+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177045504 unmapped: 28295168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b1016000/0x0/0x1bfc00000, data 0x661a25a/0x67d7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:59.953507+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177045504 unmapped: 28295168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b1015000/0x0/0x1bfc00000, data 0x661a390/0x67d9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b1015000/0x0/0x1bfc00000, data 0x661a3f5/0x67d9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:00.953672+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177045504 unmapped: 28295168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:01.953885+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177045504 unmapped: 28295168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:02.954137+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177045504 unmapped: 28295168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2730380 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _renew_subs
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:03.954306+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177045504 unmapped: 28295168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1014000/0x0/0x1bfc00000, data 0x661a488/0x67d9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:04.954507+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177053696 unmapped: 28286976 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:05.954667+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177061888 unmapped: 28278784 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:06.954779+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177061888 unmapped: 28278784 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.885275841s of 10.010887146s, submitted: 39
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:07.954930+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177061888 unmapped: 28278784 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2734476 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1011000/0x0/0x1bfc00000, data 0x661c802/0x67dd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1011000/0x0/0x1bfc00000, data 0x661c802/0x67dd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:08.955088+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177061888 unmapped: 28278784 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:09.955199+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177061888 unmapped: 28278784 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:10.955353+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177078272 unmapped: 28262400 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:11.955533+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177078272 unmapped: 28262400 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:12.955699+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177086464 unmapped: 28254208 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2737984 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:13.955855+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177086464 unmapped: 28254208 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b100f000/0x0/0x1bfc00000, data 0x661ca3a/0x67de000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:14.955958+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:15.956143+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:16.956299+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.892723083s of 10.008543015s, submitted: 25
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:17.956453+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2737470 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:18.956607+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1014000/0x0/0x1bfc00000, data 0x661c92e/0x67da000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:19.956759+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:20.956958+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1014000/0x0/0x1bfc00000, data 0x661c92e/0x67da000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:21.957183+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1014000/0x0/0x1bfc00000, data 0x661c92e/0x67da000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:22.957357+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177094656 unmapped: 28246016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2736704 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1014000/0x0/0x1bfc00000, data 0x661c92e/0x67da000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:23.957518+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177111040 unmapped: 28229632 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:24.957654+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177111040 unmapped: 28229632 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:25.957844+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177111040 unmapped: 28229632 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:26.957978+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:27.958156+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2740176 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.949753761s of 11.009596825s, submitted: 15
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1010000/0x0/0x1bfc00000, data 0x661caf9/0x67dd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:28.958323+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b1010000/0x0/0x1bfc00000, data 0x661cc6d/0x67dd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:29.958549+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:30.959005+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:31.959199+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:32.959376+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2742190 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:33.959544+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177111040 unmapped: 28229632 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:34.959711+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b100e000/0x0/0x1bfc00000, data 0x661ceb8/0x67df000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:35.960120+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177119232 unmapped: 28221440 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:36.960294+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 177127424 unmapped: 28213248 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:37.960467+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178429952 unmapped: 26910720 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2765018 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:38.960639+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178569216 unmapped: 26771456 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.425615311s of 10.664380074s, submitted: 53
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0f7b000/0x0/0x1bfc00000, data 0x66b1b24/0x6873000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:39.960776+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178667520 unmapped: 26673152 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:40.960953+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178774016 unmapped: 26566656 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:41.962126+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178774016 unmapped: 26566656 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0af8000/0x0/0x1bfc00000, data 0x67341ca/0x68f6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:42.962306+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0af8000/0x0/0x1bfc00000, data 0x67341ca/0x68f6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 178954240 unmapped: 26386432 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2773306 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:43.962463+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 180027392 unmapped: 25313280 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:44.962612+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 180420608 unmapped: 24920064 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0a62000/0x0/0x1bfc00000, data 0x67c71ba/0x698a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:45.962767+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 180699136 unmapped: 24641536 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:46.962931+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 180723712 unmapped: 24616960 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:47.963963+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 181780480 unmapped: 23560192 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2777194 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:48.965276+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.420427322s of 10.002858162s, submitted: 128
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 181788672 unmapped: 23552000 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b09da000/0x0/0x1bfc00000, data 0x6852865/0x6a14000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:49.965414+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 181796864 unmapped: 23543808 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:50.965555+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 181846016 unmapped: 23494656 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:51.966076+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 182362112 unmapped: 22978560 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:52.966258+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0954000/0x0/0x1bfc00000, data 0x68d3cbd/0x6a97000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 182411264 unmapped: 22929408 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2789844 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:53.966671+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 183042048 unmapped: 22298624 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:54.967175+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 183050240 unmapped: 22290432 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:55.967340+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 183050240 unmapped: 22290432 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:56.967948+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b08cd000/0x0/0x1bfc00000, data 0x695e8f7/0x6b21000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 183173120 unmapped: 22167552 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0879000/0x0/0x1bfc00000, data 0x69b1cf4/0x6b74000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:57.968152+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0879000/0x0/0x1bfc00000, data 0x69b1cf4/0x6b74000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 182099968 unmapped: 23240704 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2808066 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:58.968354+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 182206464 unmapped: 23134208 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.874759674s of 10.357366562s, submitted: 112
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:59.968537+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 182206464 unmapped: 23134208 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:00.968713+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184410112 unmapped: 20930560 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:01.969017+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0811000/0x0/0x1bfc00000, data 0x6a1d536/0x6bdd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184410112 unmapped: 20930560 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:02.969292+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184434688 unmapped: 20905984 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2811616 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:03.969511+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184705024 unmapped: 20635648 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:04.970314+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184705024 unmapped: 20635648 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:05.970611+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184729600 unmapped: 20611072 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:06.970803+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 184991744 unmapped: 20348928 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b078b000/0x0/0x1bfc00000, data 0x6aa342a/0x6c63000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:07.971000+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 257 handle_osd_map epochs [257,258], i have 257, src has [1,258]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186114048 unmapped: 19226624 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2819210 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:08.971245+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186236928 unmapped: 19103744 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.534915924s of 10.008509636s, submitted: 105
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:09.971486+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185016320 unmapped: 20324352 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:10.971632+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185024512 unmapped: 20316160 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:11.971794+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185024512 unmapped: 20316160 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 258 heartbeat osd_stat(store_statfs(0x1b06f7000/0x0/0x1bfc00000, data 0x6b34e57/0x6cf7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:12.971950+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 258 handle_osd_map epochs [258,259], i have 258, src has [1,259]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185008128 unmapped: 20332544 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2832876 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:13.972211+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185008128 unmapped: 20332544 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:14.972368+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185008128 unmapped: 20332544 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:15.972522+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 185147392 unmapped: 20193280 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:16.972662+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 259 heartbeat osd_stat(store_statfs(0x1b067f000/0x0/0x1bfc00000, data 0x6baabfb/0x6d6e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186269696 unmapped: 19070976 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:17.972829+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186376192 unmapped: 18964480 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2840280 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:18.972988+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186621952 unmapped: 18718720 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.874903679s of 10.321949005s, submitted: 119
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:19.973120+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186630144 unmapped: 18710528 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 260 heartbeat osd_stat(store_statfs(0x1b0624000/0x0/0x1bfc00000, data 0x6c03856/0x6dc9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:20.973277+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186736640 unmapped: 18604032 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:21.973439+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186966016 unmapped: 18374656 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:22.973609+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186007552 unmapped: 19333120 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2852484 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:23.973791+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 261 heartbeat osd_stat(store_statfs(0x1b05af000/0x0/0x1bfc00000, data 0x6c761f8/0x6e3e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186007552 unmapped: 19333120 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:24.973956+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186122240 unmapped: 19218432 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:25.974126+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 186286080 unmapped: 19054592 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:26.974271+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 187342848 unmapped: 17997824 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:27.974411+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 261 handle_osd_map epochs [261,262], i have 261, src has [1,262]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188399616 unmapped: 16941056 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2861738 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:28.974563+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 262 heartbeat osd_stat(store_statfs(0x1b0533000/0x0/0x1bfc00000, data 0x6cf250a/0x6eba000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188506112 unmapped: 16834560 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:29.974758+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188506112 unmapped: 16834560 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.278877258s of 10.697354317s, submitted: 147
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:30.974949+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 262 heartbeat osd_stat(store_statfs(0x1b0511000/0x0/0x1bfc00000, data 0x6d1589c/0x6edd000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188719104 unmapped: 16621568 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:31.975141+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188719104 unmapped: 16621568 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:32.975316+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188817408 unmapped: 16523264 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2866978 data_alloc: 184549376 data_used: 8200192
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:33.975470+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 263 heartbeat osd_stat(store_statfs(0x1b04c2000/0x0/0x1bfc00000, data 0x6d615b2/0x6f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188817408 unmapped: 16523264 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:34.975635+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 263 heartbeat osd_stat(store_statfs(0x1b04c2000/0x0/0x1bfc00000, data 0x6d615b2/0x6f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188956672 unmapped: 16384000 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:35.975836+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 188956672 unmapped: 16384000 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:36.976034+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 187711488 unmapped: 17629184 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:37.976228+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 187801600 unmapped: 17539072 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2882196 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:38.976426+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 187998208 unmapped: 17342464 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 264 heartbeat osd_stat(store_statfs(0x1b0436000/0x0/0x1bfc00000, data 0x6de8b1b/0x6fb7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:39.976619+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 187998208 unmapped: 17342464 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.687466621s of 10.008725166s, submitted: 96
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:40.976853+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 187998208 unmapped: 17342464 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:41.977041+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 264 heartbeat osd_stat(store_statfs(0x1b0439000/0x0/0x1bfc00000, data 0x6de8b79/0x6fb5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 264 heartbeat osd_stat(store_statfs(0x1b0439000/0x0/0x1bfc00000, data 0x6de8b79/0x6fb5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 189317120 unmapped: 16023552 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:42.977165+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 189464576 unmapped: 15876096 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2890682 data_alloc: 184549376 data_used: 8146944
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:43.977626+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 189472768 unmapped: 15867904 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:44.977781+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 189841408 unmapped: 15499264 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:45.977950+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 189849600 unmapped: 15491072 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:46.978117+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 189980672 unmapped: 15360000 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 265 heartbeat osd_stat(store_statfs(0x1b031d000/0x0/0x1bfc00000, data 0x6f03017/0x70d1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:47.978254+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 265 handle_osd_map epochs [265,266], i have 265, src has [1,266]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190013440 unmapped: 15327232 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2902390 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:48.978468+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377dfd8400
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 266 heartbeat osd_stat(store_statfs(0x1b02e6000/0x0/0x1bfc00000, data 0x6f38b33/0x7107000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191266816 unmapped: 14073856 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:49.978707+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 51
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191414272 unmapped: 13926400 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:50.978846+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191430656 unmapped: 13910016 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.046366692s of 11.009084702s, submitted: 161
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:51.980022+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 266 heartbeat osd_stat(store_statfs(0x1b02ba000/0x0/0x1bfc00000, data 0x6f6399a/0x7134000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190906368 unmapped: 14434304 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:52.980268+0000)
Nov 23 10:14:53 np0005532585.localdomain sshd[342352]: Received disconnect from 175.126.166.172 port 48112:11: Bye Bye [preauth]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain sshd[342352]: Disconnected from invalid user root11 175.126.166.172 port 48112 [preauth]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b0256000/0x0/0x1bfc00000, data 0x6fc4aa4/0x7197000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191234048 unmapped: 14106624 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2923402 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:53.980774+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b0256000/0x0/0x1bfc00000, data 0x6fc4aa4/0x7197000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191234048 unmapped: 14106624 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:54.980933+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190308352 unmapped: 15032320 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:55.981196+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190316544 unmapped: 15024128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:56.981389+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b021e000/0x0/0x1bfc00000, data 0x6ffcc97/0x71d0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190316544 unmapped: 15024128 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:57.981560+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190496768 unmapped: 14843904 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2923636 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:58.981779+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190513152 unmapped: 14827520 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b0208000/0x0/0x1bfc00000, data 0x701335d/0x71e6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:59.981955+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190513152 unmapped: 14827520 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:00.982101+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 190464000 unmapped: 14876672 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:01.982286+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.248165131s of 10.422602654s, submitted: 54
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191619072 unmapped: 13721600 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b01b7000/0x0/0x1bfc00000, data 0x70628b1/0x7236000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:02.982602+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191676416 unmapped: 13664256 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2933760 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:03.982865+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 191856640 unmapped: 13484032 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:04.983090+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192077824 unmapped: 13262848 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:05.983332+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192077824 unmapped: 13262848 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:06.983500+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b0139000/0x0/0x1bfc00000, data 0x70e1595/0x72b5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192380928 unmapped: 12959744 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:07.983666+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192405504 unmapped: 12935168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2934468 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:08.983971+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b011c000/0x0/0x1bfc00000, data 0x70feb52/0x72d2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192405504 unmapped: 12935168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:09.984265+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192405504 unmapped: 12935168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b010b000/0x0/0x1bfc00000, data 0x710fee2/0x72e3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:10.984490+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 192405504 unmapped: 12935168 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:11.984736+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.118917465s of 10.314707756s, submitted: 41
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193470464 unmapped: 11870208 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:12.984983+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193568768 unmapped: 11771904 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2945512 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:13.985180+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193568768 unmapped: 11771904 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:14.985364+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193724416 unmapped: 11616256 heap: 205340672 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:15.985562+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b006f000/0x0/0x1bfc00000, data 0x71a878d/0x737f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195018752 unmapped: 11370496 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:16.985704+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195026944 unmapped: 11362304 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:17.985885+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195026944 unmapped: 11362304 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2952436 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:18.986101+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195026944 unmapped: 11362304 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:19.986357+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195026944 unmapped: 11362304 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:20.986521+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195026944 unmapped: 11362304 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:21.986720+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b0015000/0x0/0x1bfc00000, data 0x720244f/0x73d9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195026944 unmapped: 11362304 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:22.986880+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.700469971s of 10.902329445s, submitted: 41
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195256320 unmapped: 11132928 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2962426 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:23.987151+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 195256320 unmapped: 11132928 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:24.987361+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193830912 unmapped: 12558336 heap: 206389248 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:25.987520+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1aedd6000/0x0/0x1bfc00000, data 0x72a10ea/0x7477000, compress 0x0/0x0/0x0, omap 0x647, meta 0x99af9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193888256 unmapped: 13549568 heap: 207437824 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:26.987663+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1aedbb000/0x0/0x1bfc00000, data 0x72bb53b/0x7492000, compress 0x0/0x0/0x0, omap 0x647, meta 0x99af9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 193986560 unmapped: 13451264 heap: 207437824 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:27.987814+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196411392 unmapped: 12075008 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2965324 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1adbf9000/0x0/0x1bfc00000, data 0x72de482/0x74b5000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:28.988081+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196542464 unmapped: 11943936 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:29.989363+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196542464 unmapped: 11943936 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:30.989541+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196550656 unmapped: 11935744 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:31.989807+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196575232 unmapped: 11911168 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:32.990098+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.742299080s of 10.004232407s, submitted: 59
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196870144 unmapped: 11616256 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2978004 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:33.990238+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1adb5f000/0x0/0x1bfc00000, data 0x737ada7/0x754f000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196952064 unmapped: 11534336 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:34.990395+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196952064 unmapped: 11534336 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:35.990608+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197115904 unmapped: 11370496 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:36.990799+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197115904 unmapped: 11370496 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:37.990994+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1adb0c000/0x0/0x1bfc00000, data 0x73ce464/0x75a2000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197115904 unmapped: 11370496 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2976464 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:38.991198+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197263360 unmapped: 11223040 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:39.991480+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197263360 unmapped: 11223040 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:40.991687+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197263360 unmapped: 11223040 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:41.992008+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1adad1000/0x0/0x1bfc00000, data 0x740a978/0x75dd000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197263360 unmapped: 11223040 heap: 208486400 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:42.992155+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.809960365s of 10.002861023s, submitted: 36
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197353472 unmapped: 12181504 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2979444 data_alloc: 184549376 data_used: 8187904
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:43.992389+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197468160 unmapped: 12066816 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:44.992618+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197468160 unmapped: 12066816 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:45.992978+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197484544 unmapped: 12050432 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:46.993192+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197492736 unmapped: 12042240 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:47.993394+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1ada7d000/0x0/0x1bfc00000, data 0x7460441/0x7631000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [1])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196354048 unmapped: 13180928 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2984700 data_alloc: 184549376 data_used: 8192000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:48.993644+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196354048 unmapped: 13180928 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:49.993827+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 heartbeat osd_stat(store_statfs(0x1ada42000/0x0/0x1bfc00000, data 0x749ac8e/0x766c000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 196354048 unmapped: 13180928 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:50.994038+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197574656 unmapped: 11960320 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:51.994267+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197574656 unmapped: 11960320 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:52.994428+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.811488152s of 10.001764297s, submitted: 38
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 267 handle_osd_map epochs [267,268], i have 267, src has [1,268]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197795840 unmapped: 11739136 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2994860 data_alloc: 184549376 data_used: 8204288
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:53.994632+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 268 heartbeat osd_stat(store_statfs(0x1ad9c5000/0x0/0x1bfc00000, data 0x7515bd1/0x76e8000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 52
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197820416 unmapped: 11714560 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:54.994762+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 268 heartbeat osd_stat(store_statfs(0x1ad9c5000/0x0/0x1bfc00000, data 0x7515bd1/0x76e8000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 197984256 unmapped: 11550720 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:55.994975+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198123520 unmapped: 11411456 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:56.995139+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198123520 unmapped: 11411456 heap: 209534976 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:57.995283+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198066176 unmapped: 12517376 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2999602 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:58.995496+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 heartbeat osd_stat(store_statfs(0x1ad982000/0x0/0x1bfc00000, data 0x75568d7/0x772b000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198066176 unmapped: 12517376 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:59.996004+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198066176 unmapped: 12517376 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:00.996205+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:01.996367+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198066176 unmapped: 12517376 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:02.996543+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 198082560 unmapped: 12500992 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.747048378s of 10.001671791s, submitted: 78
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:03.996773+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199131136 unmapped: 11452416 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3006950 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 heartbeat osd_stat(store_statfs(0x1ad93d000/0x0/0x1bfc00000, data 0x759b105/0x7771000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:04.996939+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199221248 unmapped: 11362304 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:05.997150+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199327744 unmapped: 11255808 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:06.997312+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199327744 unmapped: 11255808 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 heartbeat osd_stat(store_statfs(0x1ad90a000/0x0/0x1bfc00000, data 0x75ce2be/0x77a4000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:07.997502+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199401472 unmapped: 11182080 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:08.997645+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199409664 unmapped: 11173888 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3008278 data_alloc: 184549376 data_used: 8151040
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 heartbeat osd_stat(store_statfs(0x1ad8e6000/0x0/0x1bfc00000, data 0x75f1c42/0x77c8000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:09.997828+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199409664 unmapped: 11173888 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:10.998050+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199434240 unmapped: 11149312 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:11.998281+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199442432 unmapped: 11141120 heap: 210583552 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 heartbeat osd_stat(store_statfs(0x1ad889000/0x0/0x1bfc00000, data 0x764f294/0x7825000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:12.998571+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200581120 unmapped: 12099584 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.822029114s of 10.004167557s, submitted: 35
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 269 handle_osd_map epochs [269,270], i have 269, src has [1,270]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 270 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 270 heartbeat osd_stat(store_statfs(0x1ad871000/0x0/0x1bfc00000, data 0x76649db/0x783c000, compress 0x0/0x0/0x0, omap 0x647, meta 0xab4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:14.016910+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200589312 unmapped: 12091392 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3019946 data_alloc: 184549376 data_used: 8163328
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:15.017054+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200589312 unmapped: 12091392 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:16.017218+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200884224 unmapped: 11796480 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:17.017378+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199393280 unmapped: 13287424 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 270 heartbeat osd_stat(store_statfs(0x1ad424000/0x0/0x1bfc00000, data 0x76b2f9f/0x788a000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:18.017549+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199401472 unmapped: 13279232 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 270 heartbeat osd_stat(store_statfs(0x1ad424000/0x0/0x1bfc00000, data 0x76b2f9f/0x788a000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 270 handle_osd_map epochs [270,271], i have 270, src has [1,271]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:19.067846+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199532544 unmapped: 13148160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3030898 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:20.068033+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199892992 unmapped: 12787712 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:21.068166+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199892992 unmapped: 12787712 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:22.068414+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199901184 unmapped: 12779520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 271 heartbeat osd_stat(store_statfs(0x1ad3d2000/0x0/0x1bfc00000, data 0x7703cb0/0x78dc000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:23.068587+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199901184 unmapped: 12779520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:24.068781+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 199991296 unmapped: 12689408 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3032352 data_alloc: 184549376 data_used: 8175616
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 271 heartbeat osd_stat(store_statfs(0x1ad39f000/0x0/0x1bfc00000, data 0x7737788/0x790f000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.673622131s of 10.983532906s, submitted: 95
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 271 ms_handle_reset con 0x56377dfd8400 session 0x5637814603c0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:25.068956+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200417280 unmapped: 12263424 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:26.069102+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200523776 unmapped: 12156928 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 53
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:27.069230+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200802304 unmapped: 11878400 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:28.069434+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200810496 unmapped: 11870208 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 272 heartbeat osd_stat(store_statfs(0x1ad354000/0x0/0x1bfc00000, data 0x7783b1c/0x795a000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:29.069740+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200826880 unmapped: 11853824 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3036892 data_alloc: 184549376 data_used: 8183808
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:30.070059+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200826880 unmapped: 11853824 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:31.070340+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200826880 unmapped: 11853824 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 272 heartbeat osd_stat(store_statfs(0x1ad350000/0x0/0x1bfc00000, data 0x7785f4c/0x795d000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 272 heartbeat osd_stat(store_statfs(0x1ad350000/0x0/0x1bfc00000, data 0x7785f4c/0x795d000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:32.070553+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200826880 unmapped: 11853824 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:33.070720+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200826880 unmapped: 11853824 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 272 handle_osd_map epochs [272,273], i have 272, src has [1,273]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:34.071057+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200892416 unmapped: 11788288 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 27K writes, 102K keys, 27K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.01 MB/s
                                                          Cumulative WAL: 27K writes, 9964 syncs, 2.79 writes per sync, written: 0.09 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 17K writes, 61K keys, 17K commit groups, 1.0 writes per commit group, ingest: 49.73 MB, 0.08 MB/s
                                                          Interval WAL: 17K writes, 7175 syncs, 2.42 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:35.071201+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200916992 unmapped: 11763712 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:36.071395+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:37.071659+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:38.071874+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:39.072167+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:40.072683+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:41.072917+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:42.073501+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200925184 unmapped: 11755520 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:43.073756+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:44.074044+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:45.074198+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:46.074292+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:47.074438+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:48.074665+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:49.075045+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:50.075221+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200949760 unmapped: 11730944 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:51.075582+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:52.075778+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:53.075987+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:54.076263+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:55.076655+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:56.076860+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:57.077213+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:58.077440+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200966144 unmapped: 11714560 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:59.077628+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200982528 unmapped: 11698176 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:00.078017+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:01.078199+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:02.078398+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:03.078580+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:04.078970+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:05.079199+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:06.079461+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200990720 unmapped: 11689984 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:07.079718+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:08.079930+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:09.080167+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:10.080437+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:11.080764+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:12.080946+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:13.081142+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:14.081353+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 200998912 unmapped: 11681792 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:15.081487+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201007104 unmapped: 11673600 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:16.081675+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201007104 unmapped: 11673600 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:17.081853+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201007104 unmapped: 11673600 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:18.082026+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201007104 unmapped: 11673600 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:19.082289+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201007104 unmapped: 11673600 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:20.082482+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201015296 unmapped: 11665408 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:21.082680+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201015296 unmapped: 11665408 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:22.082877+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201015296 unmapped: 11665408 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:23.083049+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:24.083192+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:25.083322+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:26.083484+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:27.083681+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:28.083886+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:29.084098+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:30.084303+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201031680 unmapped: 11649024 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:31.084488+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:32.084712+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:33.085030+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:34.085189+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:35.085483+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:36.085628+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:37.085822+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:38.086018+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201039872 unmapped: 11640832 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:39.086158+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:40.086290+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:41.086606+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:42.086846+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:43.087113+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:44.088019+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:45.088198+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:46.088352+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201056256 unmapped: 11624448 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:47.088548+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201064448 unmapped: 11616256 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:48.088756+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201064448 unmapped: 11616256 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:49.088939+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201064448 unmapped: 11616256 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3040542 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:50.089106+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad339000/0x0/0x1bfc00000, data 0x779a8b0/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201072640 unmapped: 11608064 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:51.089256+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201072640 unmapped: 11608064 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:52.089423+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 201072640 unmapped: 11608064 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:53.089561+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 88.794242859s of 89.085174561s, submitted: 386
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 ms_handle_reset con 0x56377c530400 session 0x56377ade6b40
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202776576 unmapped: 9904128 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:54.089782+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202514432 unmapped: 10166272 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Got map version 54
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:55.089979+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:56.090224+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:57.090399+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:58.090624+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:59.090817+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:00.091020+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:01.091213+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:02.091487+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202539008 unmapped: 10141696 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:03.091664+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202555392 unmapped: 10125312 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:04.091776+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202555392 unmapped: 10125312 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:05.091918+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202555392 unmapped: 10125312 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:06.092075+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202563584 unmapped: 10117120 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:07.092427+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202563584 unmapped: 10117120 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:08.092599+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202563584 unmapped: 10117120 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:09.092759+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202563584 unmapped: 10117120 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:10.092956+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202563584 unmapped: 10117120 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:11.093118+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:12.093448+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:13.093686+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:14.093832+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:15.094126+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:16.094450+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:17.094737+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:18.094993+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202596352 unmapped: 10084352 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:19.095142+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:20.095288+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:21.095455+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:22.095668+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:23.095827+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:24.095974+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:25.096119+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:26.096378+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202604544 unmapped: 10076160 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:27.096597+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202637312 unmapped: 10043392 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:28.096762+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202637312 unmapped: 10043392 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:29.096981+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202637312 unmapped: 10043392 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:30.097159+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202645504 unmapped: 10035200 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:31.097466+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202645504 unmapped: 10035200 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:32.098002+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202645504 unmapped: 10035200 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:33.098242+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202645504 unmapped: 10035200 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 ms_handle_reset con 0x56377daf5400 session 0x56377d9f9860
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: handle_auth_request added challenge on 0x56377b214000
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:34.098406+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202645504 unmapped: 10035200 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:35.098573+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:36.098718+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:37.098882+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:38.099063+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:39.099255+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:40.099453+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:41.099643+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:42.099853+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202653696 unmapped: 10027008 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:43.099996+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:44.100144+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:45.100265+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:46.100431+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:47.100562+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:48.100697+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:49.100875+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:50.101068+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202670080 unmapped: 10010624 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:51.101250+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202678272 unmapped: 10002432 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:52.101428+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202678272 unmapped: 10002432 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:53.101596+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202678272 unmapped: 10002432 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:54.101771+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202686464 unmapped: 9994240 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:55.101948+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202694656 unmapped: 9986048 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:56.102159+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202694656 unmapped: 9986048 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:57.102316+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202694656 unmapped: 9986048 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:58.102510+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202694656 unmapped: 9986048 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:59.102725+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:00.102962+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:01.103215+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:02.103456+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:03.103666+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:04.103871+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:05.104103+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:06.104326+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202711040 unmapped: 9969664 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:07.104468+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202719232 unmapped: 9961472 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:08.104671+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202719232 unmapped: 9961472 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:09.104829+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202719232 unmapped: 9961472 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:10.104922+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202719232 unmapped: 9961472 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:11.105091+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202727424 unmapped: 9953280 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:12.105306+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202727424 unmapped: 9953280 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:13.105425+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202727424 unmapped: 9953280 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:14.105578+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202727424 unmapped: 9953280 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:15.105734+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202752000 unmapped: 9928704 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:16.105852+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202752000 unmapped: 9928704 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:17.105928+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202752000 unmapped: 9928704 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:18.106067+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202752000 unmapped: 9928704 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:19.106183+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: bluestore.MempoolThread(0x563779ecfb60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3038622 data_alloc: 184549376 data_used: 8196096
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202752000 unmapped: 9928704 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:20.106288+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'config diff' '{prefix=config diff}'
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'config show' '{prefix=config show}'
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'counter dump' '{prefix=counter dump}'
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202735616 unmapped: 9945088 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'counter schema' '{prefix=counter schema}'
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:21.106439+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: osd.3 273 heartbeat osd_stat(store_statfs(0x1ad33a000/0x0/0x1bfc00000, data 0x779aac3/0x7974000, compress 0x0/0x0/0x0, omap 0x647, meta 0xaf4f9b9), peers [0,1,2,4,5] op hist [])
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202334208 unmapped: 10346496 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: tick
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_tickets
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:22.106586+0000)
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: prioritycache tune_memory target: 3561601228 mapped: 202530816 unmapped: 10149888 heap: 212680704 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:53 np0005532585.localdomain ceph-osd[32858]: do_command 'log dump' '{prefix=log dump}'
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: pgmap v773: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.69083 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.59110 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.49599 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.69104 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.59122 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.49605 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1655980376' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3388039984' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1724465177' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3868625184' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4266730576' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/469495303' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2333125060' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 23 10:14:53 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3479882488' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3629738133' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.69122 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.59137 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.49617 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.69134 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.59149 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.49629 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2023594198' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.69140 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/959013144' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2333125060' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.59164 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.49641 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3365955498' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3479882488' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4235157921' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3326372094' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3629738133' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:54 np0005532585.localdomain crontab[342743]: (root) LIST (root)
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 23 10:14:54 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3882673784' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon stat"} v 0)
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3530510980' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:14:55.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: pgmap v774: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.69155 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.59176 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.49653 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.69167 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.59191 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3028301708' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.49665 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3882673784' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3420049058' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1806367899' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3530510980' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "node ls"} v 0)
Nov 23 10:14:55 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2158521499' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 23 10:14:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 10:14:55 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 10:14:56 np0005532585.localdomain systemd[1]: tmp-crun.EbJaM6.mount: Deactivated successfully.
Nov 23 10:14:56 np0005532585.localdomain podman[342944]: 2025-11-23 10:14:56.031068231 +0000 UTC m=+0.080220895 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute)
Nov 23 10:14:56 np0005532585.localdomain podman[342944]: 2025-11-23 10:14:56.069206994 +0000 UTC m=+0.118359638 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:14:56 np0005532585.localdomain systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "node ls"} v 0)
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3568888528' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain podman[342943]: 2025-11-23 10:14:56.077572611 +0000 UTC m=+0.129043536 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 10:14:56 np0005532585.localdomain podman[342943]: 2025-11-23 10:14:56.16119161 +0000 UTC m=+0.212662525 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 10:14:56 np0005532585.localdomain systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.69182 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.59203 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.49677 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.69194 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.59215 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3213237305' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2158521499' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3880682301' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3568888528' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3046286260' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/973500516' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1217552270' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4251710309' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1526207643' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Nov 23 10:14:56 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4217072223' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/545459147' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/411863729' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:52.692428+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:53.692593+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:54.692779+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:55.693015+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:56.693146+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:57.693507+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:58.693629+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:50:59.693778+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:00.693940+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:01.694077+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:02.694257+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:03.694472+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:04.694636+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:05.694766+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:06.695042+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:07.695243+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 11 from mon.np0005532585 (according to old e11)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 11
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:51:38.635516+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: mon.np0005532585 at [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] went away
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _reopen_session rank -1
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _add_conns ranks=[0,2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532583 con 0x55d7bb67d400 addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532584 con 0x55d7bbb0a400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532586 con 0x55d7bc2ec800 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7bb67d400 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7bc2ec800 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7bbb0a400 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_done global_id 14298 payload 293
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_hunting 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: found mon.np0005532583
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532583 at v2:172.18.0.105:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:08.660100+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532583 at v2:172.18.0.105:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 11 from mon.np0005532583 (according to old e11)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 11
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:51:38.635516+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_config config(7 keys)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: set_mon_vals no callback set
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 28
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2632531473,v1:172.18.0.106:6811/2632531473]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 11 from mon.np0005532583 (according to old e11)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 11
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:51:38.635516+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:08.695367+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:09.695546+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:10.695740+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:11.695880+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:12.696059+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:13.696246+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:14.696382+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:15.696552+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:16.696702+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:17.696875+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:18.697048+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:19.697232+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:20.697382+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:21.697551+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:22.697724+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:23.697856+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:24.698027+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:25.698217+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:26.698335+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:27.698503+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:28.698670+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:29.698840+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:30.699013+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 4944 writes, 22K keys, 4944 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4944 writes, 665 syncs, 7.43 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 100 writes, 292 keys, 100 commit groups, 1.0 writes per commit group, ingest: 0.32 MB, 0.00 MB/s
                                                          Interval WAL: 100 writes, 47 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:31.699159+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:32.699330+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:33.699487+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:34.699607+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:35.699762+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:36.699984+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:37.700135+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:38.700280+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:39.700469+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:40.700661+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 heartbeat osd_stat(store_statfs(0x1bb973000/0x0/0x1bfc00000, data 0x16412cb/0x16ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 703408 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:41.700826+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 73383936 unmapped: 3743744 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532583 at v2:172.18.0.105:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 29
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now 
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/2632531473
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect No active mgr available yet
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 81.605445862s of 81.630111694s, submitted: 7
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:42.700998+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 ms_handle_reset con 0x55d7bc168800 session 0x55d7b911b0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb0a400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74637312 unmapped: 2490368 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:43.701187+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 30
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: get_auth_request con 0x55d7bb67ac00 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_configure stats_period=5
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74833920 unmapped: 2293760 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:44.701366+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 31
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74956800 unmapped: 2170880 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:45.701747+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74956800 unmapped: 2170880 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:46.701964+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74956800 unmapped: 2170880 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 32
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:47.702180+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75104256 unmapped: 2023424 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 33
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:48.702374+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:49.702546+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:50.702698+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:51.702879+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:52.703096+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:53.703280+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:54.703457+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:55.703595+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:56.703761+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:57.703978+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:58.704225+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:51:59.704382+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:00.704547+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:01.704700+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:02.704822+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:03.704994+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:04.705140+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:05.705292+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:06.705466+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:07.705683+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:08.705851+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:09.706024+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:10.706174+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74866688 unmapped: 2260992 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 12 from mon.np0005532583 (according to old e12)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 12
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:41.580241+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: mon.np0005532583 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] went away
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _reopen_session rank -1
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _add_conns ranks=[0,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532586 con 0x55d7b9740c00 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532584 con 0x55d7b9741000 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9740c00 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9741000 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_done global_id 14298 payload 293
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_hunting 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: found mon.np0005532584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532584 at v2:172.18.0.103:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:11.603240+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:11.706360+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: renew subs? -- yes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs - empty
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: get_auth_request con 0x55d7b9741000 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: --2- [v2:172.18.0.107:6800/1293390152,v1:172.18.0.107:6801/1293390152] >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55d7b9741000 0x55d7bb6a2100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: ms_handle_reset current mon [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _reopen_session rank -1
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _add_conns ranks=[0,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532586 con 0x55d7b9740c00 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532584 con 0x55d7b9741400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 ms_handle_reset con 0x55d7b9741000 session 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9740c00 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9741400 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_done global_id 14298 payload 293
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_hunting 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: found mon.np0005532586
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532586 at v2:172.18.0.108:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:11.810738+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532586 at v2:172.18.0.108:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 12 from mon.np0005532586 (according to old e12)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 12
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:41.580241+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_config config(7 keys)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: set_mon_vals no callback set
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 33
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 13 from mon.np0005532586 (according to old e13)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 13
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:42.566799+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:12.706534+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:13.706694+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:14.706870+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:15.707062+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:16.707205+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:17.707405+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:18.707561+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:19.707796+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 14 from mon.np0005532586 (according to old e14)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 14
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:50.591476+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
                                                          3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532583
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:20.707965+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:21.708124+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:22.708278+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:23.708452+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:24.708603+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:25.708747+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:26.709014+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:27.709242+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:28.709439+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 15 from mon.np0005532586 (according to old e15)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 15
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:52:59.410256+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005532586
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:29.709594+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:30.709755+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:31.709946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:32.710130+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:33.710261+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:34.710417+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:35.710559+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:36.710711+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:37.710999+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:38.711209+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:39.711353+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:40.711472+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:41.711628+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:42.711744+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:43.711907+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:44.712043+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:45.712144+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:46.712300+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:47.712507+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:48.712856+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:49.713026+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:50.713235+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:51.713412+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:52.713572+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74891264 unmapped: 2236416 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:53.713711+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 16 from mon.np0005532586 (according to old e16)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 16
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:53:23.789795+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: mon.np0005532586 at [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] went away
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _reopen_session rank -1
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _add_conns ranks=[0,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532584 con 0x55d7b9741400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532585 con 0x55d7b9741000 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9741400 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9741000 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_done global_id 14298 payload 293
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_hunting 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: found mon.np0005532585
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:53.815876+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: ms_handle_reset current mon [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _reopen_session rank -1
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _add_conns ranks=[0,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532584 con 0x55d7b9741400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): picked mon.np0005532585 con 0x55d7b9741800 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): start opening mon connection
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 ms_handle_reset con 0x55d7b9741000 session 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request con 0x55d7b9741800 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth method 2
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): _init_auth already have auth, reseting
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more payload_len 9
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient(hunting): handle_auth_done global_id 14298 payload 293
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_hunting 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: found mon.np0005532585
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:53.824646+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 16 from mon.np0005532585 (according to old e16)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 16
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:53:23.789795+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_config config(7 keys)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: set_mon_vals no callback set
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 33
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:54.713875+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:55.714123+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:56.714330+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:57.714529+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:58.714686+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:52:59.714841+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:00.715003+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:01.715170+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:02.715342+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:03.715581+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:04.715758+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:05.715877+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:06.716096+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:07.716389+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb96f000/0x0/0x1bfc00000, data 0x1643843/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:08.716563+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:09.716710+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_monmap mon_map magic: 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient:  got monmap 17 from mon.np0005532585 (according to old e17)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: dump:
                                                          epoch 17
                                                          fsid 46550e70-79cb-5f55-bf6d-1204b97e083b
                                                          last_changed 2025-11-23T09:53:40.507961+0000
                                                          created 2025-11-23T07:39:05.590972+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005532584
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005532585
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005532586
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:10.716869+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 706608 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:11.717091+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74702848 unmapped: 2424832 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:12.717261+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.189353943s of 90.214775085s, submitted: 7
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:13.717420+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb970000/0x0/0x1bfc00000, data 0x164395d/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:14.717579+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 34
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/4027327596,v1:172.18.0.107:6811/4027327596]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:15.717735+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb970000/0x0/0x1bfc00000, data 0x164395d/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 705520 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:16.718015+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:17.718197+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb970000/0x0/0x1bfc00000, data 0x164395d/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:18.718356+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:19.718493+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:20.718629+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 705520 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:21.718782+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb970000/0x0/0x1bfc00000, data 0x164395d/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:22.718957+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:23.719124+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:24.719300+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:25.719458+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb970000/0x0/0x1bfc00000, data 0x164395d/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 705520 data_alloc: 184549376 data_used: 167936
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:26.719652+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:27.719870+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:28.720057+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:29.721162+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 heartbeat osd_stat(store_statfs(0x1bb970000/0x0/0x1bfc00000, data 0x164395d/0x16be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74752000 unmapped: 2375680 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 35
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now 
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/4027327596
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect No active mgr available yet
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.723566055s of 17.734361649s, submitted: 3
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 ms_handle_reset con 0x55d7bbb0a400 session 0x55d7bb7c7680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:30.721313+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74792960 unmapped: 2334720 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 36
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: get_auth_request con 0x55d7b9741000 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_configure stats_period=5
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:31.721488+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74801152 unmapped: 2326528 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:32.721642+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74801152 unmapped: 2326528 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:33.721779+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 37
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74792960 unmapped: 2334720 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:34.721943+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74792960 unmapped: 2334720 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:35.722082+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74792960 unmapped: 2334720 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 38
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:36.722247+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74981376 unmapped: 2146304 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:37.722428+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:38.722573+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:39.722722+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:40.722882+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:41.723071+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:42.723210+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:43.723339+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:44.723452+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:45.723639+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74645504 unmapped: 2482176 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:46.723803+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 39
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/335107178,v1:172.18.0.108:6811/335107178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:47.724436+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:48.724565+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:49.724947+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:50.725339+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:51.725577+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:52.725957+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:53.726234+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:54.726512+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:55.726818+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:56.726998+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:57.727313+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:58.727580+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:53:59.727827+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:00.728053+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:01.728252+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:02.728499+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:03.728674+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:04.728879+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:05.729123+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:06.729305+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:07.729546+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:08.729729+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:09.729967+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:10.730142+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:11.730295+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:12.730477+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:13.730622+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:14.730778+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:15.730960+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:16.731049+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:17.731245+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:18.731415+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:19.731580+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:20.731789+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:21.731940+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:22.732080+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:23.732217+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:24.732341+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:25.732491+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:26.732617+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:27.732772+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:28.732918+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:29.733108+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:30.733196+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:31.733337+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:32.733467+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:33.733652+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:34.733843+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:35.734033+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:36.734196+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:37.734366+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:38.734513+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:39.734710+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:40.734884+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:41.735125+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:42.735307+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:43.735475+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:44.735660+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:45.735851+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:46.736020+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:47.736244+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 heartbeat osd_stat(store_statfs(0x1bb96c000/0x0/0x1bfc00000, data 0x1645e67/0x16c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:48.736378+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:49.736538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:50.737092+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:51.737461+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 709692 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74530816 unmapped: 2596864 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:52.737739+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now 
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect Terminating session with v2:172.18.0.108:6810/335107178
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect No active mgr available yet
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 82.573966980s of 82.597618103s, submitted: 5
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb6952c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74743808 unmapped: 2383872 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:53.737885+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 41
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: get_auth_request con 0x55d7bbc00400 auth_method 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_configure stats_period=5
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74743808 unmapped: 2383872 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:54.738055+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74743808 unmapped: 2383872 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:55.738228+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 42
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74743808 unmapped: 2383872 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:56.738830+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:57.739066+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74743808 unmapped: 2383872 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:58.739248+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74743808 unmapped: 2383872 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 43
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:54:59.739388+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:00.739532+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:01.739687+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:02.739860+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:03.740009+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:04.740150+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:05.740306+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:06.740437+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:07.740648+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:08.740802+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:09.740973+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:10.741119+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74899456 unmapped: 2228224 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:11.741262+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:12.741405+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:13.741535+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:14.741651+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:15.741818+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:16.741958+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:17.742124+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:18.742313+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74907648 unmapped: 2220032 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:19.742808+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:20.742917+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:21.743050+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:22.743180+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:23.743340+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:24.743436+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:25.743535+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:26.743672+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:27.743841+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:28.744007+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:29.744170+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:30.744300+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:31.744408+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:32.744534+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:33.744660+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:34.744775+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74915840 unmapped: 2211840 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:35.744958+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:36.745137+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:37.745306+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:38.745443+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:39.745603+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:40.745760+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:41.745928+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:42.746088+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:43.746179+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:44.746340+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:45.746481+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74924032 unmapped: 2203648 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:46.746632+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:47.746827+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:48.746990+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:49.747172+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:50.747327+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:51.747488+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:52.747655+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:53.747800+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:54.747967+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:55.748099+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:56.748233+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:57.748364+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:58.748514+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74932224 unmapped: 2195456 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:55:59.748639+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:00.748722+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:01.748858+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:02.749027+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:03.749179+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:04.749330+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:05.749518+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:06.749657+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:07.749840+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:08.750001+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:09.750167+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:10.750312+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:11.750439+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 712724 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:12.750611+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:13.750767+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74940416 unmapped: 2187264 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:14.750935+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 heartbeat osd_stat(store_statfs(0x1bb968000/0x0/0x1bfc00000, data 0x1648639/0x16c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2bcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 74948608 unmapped: 2179072 heap: 77127680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 82.322250366s of 82.345703125s, submitted: 6
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:15.751043+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75030528 unmapped: 18882560 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:16.751166+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75030528 unmapped: 18882560 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 768106 data_alloc: 184549376 data_used: 176128
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 88 ms_handle_reset con 0x55d7bbd18400 session 0x55d7bb71c5a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:17.751392+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75046912 unmapped: 18866176 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:18.751527+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75046912 unmapped: 18866176 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 ms_handle_reset con 0x55d7bbd18800 session 0x55d7bb9454a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:19.751661+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75087872 unmapped: 18825216 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5c000/0x0/0x1bfc00000, data 0x1e4cd6f/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:20.751804+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75087872 unmapped: 18825216 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:21.751946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75087872 unmapped: 18825216 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 780224 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:22.752076+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 75087872 unmapped: 18825216 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5c000/0x0/0x1bfc00000, data 0x1e4cd6f/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:23.752215+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 44
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:24.752383+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:25.752503+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:26.752663+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:27.752848+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:28.753002+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:29.753151+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76144640 unmapped: 17768448 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:30.753307+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:31.753447+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets getting new tickets!
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:32.753729+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _finish_auth 0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:32.768663+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:33.753877+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:34.754090+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:35.754231+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:36.754372+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:37.754547+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:38.754690+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:39.754838+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:40.754978+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:41.755173+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:42.755364+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:43.755516+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:44.755659+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:45.755786+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:46.755904+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:47.756071+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:48.756180+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:49.756346+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:50.756486+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:51.756640+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:52.756816+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:53.756973+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76152832 unmapped: 17760256 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:54.757146+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:55.757301+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:56.757432+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:57.757591+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:58.757721+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:56:59.757873+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:00.758030+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:01.758123+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:02.758194+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:03.758330+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:04.758470+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:05.758612+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:06.758752+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:07.758925+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:08.759037+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:09.759228+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76161024 unmapped: 17752064 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:10.759370+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:11.759537+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:12.759707+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:13.759857+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:14.759991+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:15.760134+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:16.760282+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:17.760467+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:18.760604+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:19.760758+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:20.760948+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:21.761106+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76169216 unmapped: 17743872 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 779536 data_alloc: 184549376 data_used: 188416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:22.761244+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76177408 unmapped: 17735680 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:23.761413+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 76177408 unmapped: 17735680 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:24.761580+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 69.336410522s of 69.405174255s, submitted: 13
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78356480 unmapped: 15556608 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:25.761732+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bad5d000/0x0/0x1bfc00000, data 0x1e4ce89/0x1ed1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78356480 unmapped: 15556608 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:26.761939+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78356480 unmapped: 15556608 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 784865 data_alloc: 184549376 data_used: 200704
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:27.762109+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78356480 unmapped: 15556608 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 90 heartbeat osd_stat(store_statfs(0x1bad58000/0x0/0x1bfc00000, data 0x1e4f1f1/0x1ed5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:28.762384+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78356480 unmapped: 15556608 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:29.762538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78364672 unmapped: 15548416 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 91 handle_osd_map epochs [91,91], i have 91, src has [1,91]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:30.762669+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 91 ms_handle_reset con 0x55d7bbd18c00 session 0x55d7ba5b1860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:31.762939+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 787593 data_alloc: 184549376 data_used: 212992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:32.763070+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bad54000/0x0/0x1bfc00000, data 0x1e515ad/0x1ed9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:33.763286+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:34.763521+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:35.763706+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:36.763858+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 91 heartbeat osd_stat(store_statfs(0x1bad54000/0x0/0x1bfc00000, data 0x1e515ad/0x1ed9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 78454784 unmapped: 15458304 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 787593 data_alloc: 184549376 data_used: 212992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.269268036s of 12.462445259s, submitted: 50
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba5b0780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:37.764057+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9741000 session 0x55d7ba5b12c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84680704 unmapped: 9232384 heap: 93913088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:38.764236+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 85049344 unmapped: 14639104 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:39.764390+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84615168 unmapped: 15073280 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:40.764483+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84615168 unmapped: 15073280 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:41.764678+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b9e04000/0x0/0x1bfc00000, data 0x2d9f80b/0x2e2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84615168 unmapped: 15073280 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 932870 data_alloc: 184549376 data_used: 5988352
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:42.764951+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:43.765126+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:44.765333+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:45.765471+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b9ddf000/0x0/0x1bfc00000, data 0x2dc382e/0x2e4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:46.765632+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 937747 data_alloc: 184549376 data_used: 5988352
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:47.765804+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:48.765946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b9ddf000/0x0/0x1bfc00000, data 0x2dc382e/0x2e4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:49.766109+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b9ddf000/0x0/0x1bfc00000, data 0x2dc382e/0x2e4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:50.766253+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:51.766387+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84688896 unmapped: 14999552 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 937747 data_alloc: 184549376 data_used: 5988352
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:52.766521+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84697088 unmapped: 14991360 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:53.766650+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b9ddf000/0x0/0x1bfc00000, data 0x2dc382e/0x2e4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84697088 unmapped: 14991360 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:54.766790+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84697088 unmapped: 14991360 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b9ddf000/0x0/0x1bfc00000, data 0x2dc382e/0x2e4f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x2fcf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:55.766958+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd18400 session 0x55d7ba56f680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84852736 unmapped: 14835712 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd18800 session 0x55d7b9119e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:56.767104+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.531969070s of 19.757734299s, submitted: 53
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 87228416 unmapped: 12460032 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 974320 data_alloc: 184549376 data_used: 5992448
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd19000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd19000 session 0x55d7b91190e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:57.767275+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84959232 unmapped: 14729216 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:58.767478+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 84967424 unmapped: 14721024 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:57:59.767653+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b7dcc000/0x0/0x1bfc00000, data 0x3c3383e/0x3cc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 90554368 unmapped: 9134080 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:00.767819+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba41eb40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 90800128 unmapped: 8888320 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b7d02000/0x0/0x1bfc00000, data 0x3cfe83e/0x3d8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,0,2])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9741000 session 0x55d7b9a47e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:01.768073+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 91168768 unmapped: 8519680 heap: 99688448 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1067082 data_alloc: 184549376 data_used: 5992448
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:02.768220+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 91406336 unmapped: 11960320 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:03.768516+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 91545600 unmapped: 11821056 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd18400 session 0x55d7bb6d32c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:04.768683+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b7298000/0x0/0x1bfc00000, data 0x476083e/0x47ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd19400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd18800 session 0x55d7bb6d3c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 91766784 unmapped: 11599872 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:05.768878+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd19800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b6a12000/0x0/0x1bfc00000, data 0x4fef83e/0x507c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 92364800 unmapped: 11001856 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:06.769159+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95305728 unmapped: 8060928 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1071963 data_alloc: 184549376 data_used: 9965568
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:07.769499+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd19c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd19c00 session 0x55d7ba5ae3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95305728 unmapped: 8060928 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:08.769703+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd19c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd19c00 session 0x55d7b9b292c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95330304 unmapped: 8036352 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb9441e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:09.769911+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.855277061s of 12.674122810s, submitted: 206
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb90e960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95379456 unmapped: 7987200 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:10.770127+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b80dd000/0x0/0x1bfc00000, data 0x392581b/0x39b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95428608 unmapped: 7938048 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:11.770281+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd18400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95903744 unmapped: 7462912 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1077465 data_alloc: 184549376 data_used: 10489856
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:12.770457+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 95903744 unmapped: 7462912 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b80dd000/0x0/0x1bfc00000, data 0x392581b/0x39b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:13.770633+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b80dd000/0x0/0x1bfc00000, data 0x392581b/0x39b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 96002048 unmapped: 7364608 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:14.770946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 96002048 unmapped: 7364608 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:15.771152+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 97771520 unmapped: 5595136 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:16.771357+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b76b3000/0x0/0x1bfc00000, data 0x431581b/0x43a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 98516992 unmapped: 4849664 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1168685 data_alloc: 184549376 data_used: 10489856
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:17.771569+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 98516992 unmapped: 4849664 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:18.771721+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd18400 session 0x55d7ba94d0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd18800 session 0x55d7ba56f0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99549184 unmapped: 3817472 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:19.771839+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99598336 unmapped: 3768320 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.329719543s of 10.826125145s, submitted: 130
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:20.771979+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99737600 unmapped: 3629056 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:21.772124+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b700d000/0x0/0x1bfc00000, data 0x49f581b/0x4a81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1228389 data_alloc: 184549376 data_used: 10600448
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99885056 unmapped: 3481600 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:22.772255+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99336192 unmapped: 4030464 heap: 103366656 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:23.772408+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b700a000/0x0/0x1bfc00000, data 0x49f881b/0x4a84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [1,0,0,0,2])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99614720 unmapped: 7512064 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:24.772568+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99483648 unmapped: 7643136 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:25.772735+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb90e780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7ba5afe00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100057088 unmapped: 7069696 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:26.772928+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7bb90f4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1161032 data_alloc: 184549376 data_used: 10072064
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99655680 unmapped: 7471104 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:27.773107+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99680256 unmapped: 7446528 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:28.773290+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 heartbeat osd_stat(store_statfs(0x1b70f0000/0x0/0x1bfc00000, data 0x418f80b/0x421a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99680256 unmapped: 7446528 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:29.773442+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99680256 unmapped: 7446528 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:30.773605+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb7c7a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.640479088s of 10.146912575s, submitted: 129
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 ms_handle_reset con 0x55d7bbd19400 session 0x55d7ba41ef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 98680832 unmapped: 8445952 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:31.773780+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1163236 data_alloc: 184549376 data_used: 10084352
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 98680832 unmapped: 8445952 heap: 107126784 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:32.773970+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b99c21e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100728832 unmapped: 26345472 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:33.774124+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb71cd20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b662f000/0x0/0x1bfc00000, data 0x53d2b73/0x545f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100737024 unmapped: 26337280 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:34.774273+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100737024 unmapped: 26337280 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:35.774372+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100737024 unmapped: 26337280 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:36.774508+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7ba94c3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1184154 data_alloc: 184549376 data_used: 9986048
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100679680 unmapped: 26394624 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:37.774683+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7bb1fa780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100679680 unmapped: 26394624 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:38.774810+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 heartbeat osd_stat(store_statfs(0x1b7708000/0x0/0x1bfc00000, data 0x42f46c3/0x4386000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 95 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc04800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 96 ms_handle_reset con 0x55d7bbc04800 session 0x55d7bcf441e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 96 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcf445a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 96 ms_handle_reset con 0x55d7bbd19800 session 0x55d7bb6d2780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100704256 unmapped: 26370048 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:39.774958+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 97 ms_handle_reset con 0x55d7b9741000 session 0x55d7bcf44780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 97 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7b9b29a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc04400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 97 ms_handle_reset con 0x55d7bbc04400 session 0x55d7ba41e1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 97 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7b9118b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 97 heartbeat osd_stat(store_statfs(0x1b7702000/0x0/0x1bfc00000, data 0x42f6a65/0x438b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99131392 unmapped: 27942912 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:40.775079+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.454657555s of 10.002099037s, submitted: 142
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 98 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9118000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116850688 unmapped: 10223616 heap: 127074304 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:41.775201+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 98 ms_handle_reset con 0x55d7b9741000 session 0x55d7ba41eb40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1189556 data_alloc: 201326592 data_used: 15974400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:42.775313+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115466240 unmapped: 16130048 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 99 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7bbade960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:43.775427+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115113984 unmapped: 16482304 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbd19800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:44.775557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115113984 unmapped: 16482304 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 100 ms_handle_reset con 0x55d7bbd19800 session 0x55d7ba41e000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b79c1000/0x0/0x1bfc00000, data 0x4033533/0x40cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:45.775661+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106373120 unmapped: 25223168 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:46.775796+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106373120 unmapped: 25223168 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1020935 data_alloc: 184549376 data_used: 6033408
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:47.775936+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100974592 unmapped: 30621696 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:48.776063+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100974592 unmapped: 30621696 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 101 heartbeat osd_stat(store_statfs(0x1b8bfd000/0x0/0x1bfc00000, data 0x2df679d/0x2e90000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:49.776196+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100974592 unmapped: 30621696 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:50.776335+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100974592 unmapped: 30621696 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:51.776482+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100974592 unmapped: 30621696 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.213771820s of 11.627817154s, submitted: 102
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:52.776630+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1023937 data_alloc: 184549376 data_used: 6033408
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 102 heartbeat osd_stat(store_statfs(0x1b8bf9000/0x0/0x1bfc00000, data 0x2df89eb/0x2e94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:53.776756+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:54.776931+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 102 heartbeat osd_stat(store_statfs(0x1b8bf9000/0x0/0x1bfc00000, data 0x2df89eb/0x2e94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:55.777124+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:56.777323+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:57.777520+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1024097 data_alloc: 184549376 data_used: 6037504
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:58.777675+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 102 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9b29a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:58:59.777826+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 102 heartbeat osd_stat(store_statfs(0x1b8bf9000/0x0/0x1bfc00000, data 0x2df89eb/0x2e94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 103 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb1fa780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:00.777960+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101040128 unmapped: 30556160 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 103 heartbeat osd_stat(store_statfs(0x1b8bf4000/0x0/0x1bfc00000, data 0x2dfadb5/0x2e99000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 103 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7ba94d0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:01.778119+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101687296 unmapped: 29908992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.881555557s of 10.089842796s, submitted: 42
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 104 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7b99c21e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc2ec400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:02.778254+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1179296 data_alloc: 184549376 data_used: 6053888
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101801984 unmapped: 29794304 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 104 handle_osd_map epochs [104,105], i have 104, src has [1,105]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 ms_handle_reset con 0x55d7bc2ec400 session 0x55d7b99c25a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 ms_handle_reset con 0x55d7bbc05400 session 0x55d7b911a780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:03.778430+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 heartbeat osd_stat(store_statfs(0x1b7b7c000/0x0/0x1bfc00000, data 0x3e6d4c6/0x3f10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101924864 unmapped: 29671424 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9119860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb944d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7ba56f680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:04.778599+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101998592 unmapped: 29597696 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc2ec400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 105 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:05.778767+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 106 ms_handle_reset con 0x55d7bc2ec400 session 0x55d7b911b0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99188736 unmapped: 32407552 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 106 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7ba56e3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:06.778959+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99221504 unmapped: 32374784 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 106 heartbeat osd_stat(store_statfs(0x1b8b09000/0x0/0x1bfc00000, data 0x2ee086b/0x2f83000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:07.779197+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1067758 data_alloc: 184549376 data_used: 6057984
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99229696 unmapped: 32366592 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:08.779355+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99229696 unmapped: 32366592 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb90e780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb6d3c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 ms_handle_reset con 0x55d7bbc05400 session 0x55d7be1c0d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:09.779523+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2c000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 ms_handle_reset con 0x55d7bbc2c000 session 0x55d7be1c0960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb6945a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 98795520 unmapped: 32800768 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 ms_handle_reset con 0x55d7b9741000 session 0x55d7ba56eb40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 108 ms_handle_reset con 0x55d7bbc05400 session 0x55d7b911a1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:10.779703+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106184704 unmapped: 25411584 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:11.779830+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106250240 unmapped: 25346048 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 109 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7b91192c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.440273285s of 10.165362358s, submitted: 166
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:12.779970+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 110 handle_osd_map epochs [109,110], i have 110, src has [1,110]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1270196 data_alloc: 184549376 data_used: 12386304
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106315776 unmapped: 25280512 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 110 heartbeat osd_stat(store_statfs(0x1b77c6000/0x0/0x1bfc00000, data 0x421c287/0x42c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:13.780106+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106283008 unmapped: 25313280 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 111 ms_handle_reset con 0x55d7ba944400 session 0x55d7b9119860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:14.780239+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99409920 unmapped: 32186368 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:15.780538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 99409920 unmapped: 32186368 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:16.780707+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:17.780935+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:18.781077+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:19.781227+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:20.781397+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:21.781557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:22.781706+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:23.781831+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:24.781965+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:25.782106+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:26.782251+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:27.782441+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:28.782605+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:29.782751+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:30.782932+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:31.783040+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:32.784424+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:33.784598+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:34.784772+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:35.784980+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:36.785145+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:37.785322+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:38.785608+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:39.785746+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:40.785931+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:41.786111+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:42.786263+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:43.786405+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:44.786653+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:45.786814+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:46.787234+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:47.787451+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:48.787597+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:49.787744+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:50.787931+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:51.788050+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:52.788175+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:53.788310+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:54.788488+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:55.788638+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:56.788784+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:57.788971+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:58.789097+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T09:59:59.789241+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:00.789397+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:01.789557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:02.789745+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:03.789909+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:04.790056+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:05.790178+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:06.790333+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:07.790507+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:08.790645+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:09.790767+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:10.791485+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:11.791586+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:12.792194+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:13.792600+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:14.793025+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:15.793356+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:16.793504+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:17.793689+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:18.794557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:19.794748+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:20.795058+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:21.795881+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:22.796232+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:23.796394+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:24.796693+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:25.796856+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:26.796976+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:27.797286+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:28.797446+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:29.797603+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100474880 unmapped: 31121408 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:30.797715+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:31.797983+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:32.798136+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 973363 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:33.798272+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:34.798446+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9b60000/0x0/0x1bfc00000, data 0x1e7fac9/0x1f2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:35.798582+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:36.798738+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100483072 unmapped: 31113216 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 84.207000732s of 84.567947388s, submitted: 133
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:37.798962+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 113 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9b28000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 977037 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100491264 unmapped: 31105024 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:38.799118+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100548608 unmapped: 31047680 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b9b5b000/0x0/0x1bfc00000, data 0x1e81e31/0x1f31000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:39.799267+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100548608 unmapped: 31047680 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:40.799454+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 114 ms_handle_reset con 0x55d7b9741000 session 0x55d7b9b29680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100581376 unmapped: 31014912 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:41.799620+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100581376 unmapped: 31014912 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:42.799791+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 980813 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100597760 unmapped: 30998528 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:43.799938+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:44.800073+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b54000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:45.800210+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:46.800339+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:47.800510+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 983815 data_alloc: 184549376 data_used: 3203072
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:49.154764+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:50.154910+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b54000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100605952 unmapped: 30990336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:51.155076+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7ba944400 session 0x55d7b9b28d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102645760 unmapped: 28950528 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b54000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bbc05400 session 0x55d7b9b28780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:52.155207+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101597184 unmapped: 29999104 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:53.155340+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 989575 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101597184 unmapped: 29999104 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:54.155534+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b54000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101597184 unmapped: 29999104 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:55.155705+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101613568 unmapped: 29982720 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc2cc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bbc2cc00 session 0x55d7be1c0f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:56.155860+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101613568 unmapped: 29982720 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:57.156034+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101613568 unmapped: 29982720 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:58.156234+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.794036865s of 20.860162735s, submitted: 36
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 988695 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba56e960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101629952 unmapped: 29966336 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b55000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:00:59.156381+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7b9741000 session 0x55d7b911bc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101023744 unmapped: 30572544 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:00.156501+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101023744 unmapped: 30572544 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba3c0b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:01.156917+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:02.157071+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b55000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:03.157227+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 994561 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:04.157363+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:05.157512+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:06.157654+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b55000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:07.157802+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:08.171455+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 994561 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:09.171599+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b55000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:10.171732+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b55000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:11.171867+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100663296 unmapped: 30932992 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b55000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.461134911s of 13.599196434s, submitted: 30
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bbc05400 session 0x55d7b99c3e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:12.171971+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100679680 unmapped: 30916608 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:13.172129+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 996035 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100679680 unmapped: 30916608 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc2d6400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bc2d6400 session 0x55d7ba94de00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba94c960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:14.172285+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100753408 unmapped: 30842880 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:15.172420+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7b9741000 session 0x55d7ba3c0f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100794368 unmapped: 30801920 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b54000/0x0/0x1bfc00000, data 0x1e8644b/0x1f3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:16.172571+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100794368 unmapped: 30801920 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7ba944400 session 0x55d7b9abd4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bbc05400 session 0x55d7bcf44960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:17.174572+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100900864 unmapped: 30695424 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:18.174781+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 998468 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100900864 unmapped: 30695424 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b53000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:19.175003+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:20.175979+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b53000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:21.176473+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:22.176857+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:23.177022+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 998468 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b53000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:24.177215+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:25.177368+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:26.177694+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100892672 unmapped: 30703616 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:27.177861+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100900864 unmapped: 30695424 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:28.178056+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 998468 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100900864 unmapped: 30695424 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b53000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:29.178213+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100900864 unmapped: 30695424 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:30.178363+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34e800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.344186783s of 18.573247910s, submitted: 54
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bc34e800 session 0x55d7b9b28d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100933632 unmapped: 30662656 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b53000/0x0/0x1bfc00000, data 0x1e8643b/0x1f39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 7241 writes, 30K keys, 7241 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 7241 writes, 1653 syncs, 4.38 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 2297 writes, 8467 keys, 2297 commit groups, 1.0 writes per commit group, ingest: 8.52 MB, 0.01 MB/s
                                                          Interval WAL: 2297 writes, 988 syncs, 2.32 writes per sync, written: 0.01 GB, 0.01 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:31.178497+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100933632 unmapped: 30662656 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:32.178645+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9b28780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100966400 unmapped: 30629888 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7b9741000 session 0x55d7b9118780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:33.178967+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 997303 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb1fbe00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:34.179225+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 100982784 unmapped: 30613504 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b54000/0x0/0x1bfc00000, data 0x1e8649d/0x1f3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:35.179394+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 ms_handle_reset con 0x55d7bbc05400 session 0x55d7bbaded20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101064704 unmapped: 30531584 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:36.179520+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101064704 unmapped: 30531584 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:37.179675+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101064704 unmapped: 30531584 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:38.179850+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1000154 data_alloc: 184549376 data_used: 6090752
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101064704 unmapped: 30531584 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:39.180089+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101064704 unmapped: 30531584 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:40.180316+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b9b53000/0x0/0x1bfc00000, data 0x1e8644b/0x1f3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101081088 unmapped: 30515200 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 116 heartbeat osd_stat(store_statfs(0x1b9b4f000/0x0/0x1bfc00000, data 0x1e887b3/0x1f3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:41.180469+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101081088 unmapped: 30515200 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 116 heartbeat osd_stat(store_statfs(0x1b9b4f000/0x0/0x1bfc00000, data 0x1e887b3/0x1f3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:42.180631+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101081088 unmapped: 30515200 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:43.180793+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1003319 data_alloc: 184549376 data_used: 6103040
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 116 heartbeat osd_stat(store_statfs(0x1b9b4f000/0x0/0x1bfc00000, data 0x1e887b3/0x1f3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101089280 unmapped: 30507008 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:44.180915+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101089280 unmapped: 30507008 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:45.181067+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101089280 unmapped: 30507008 heap: 131596288 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34e400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.234522820s of 15.398621559s, submitted: 39
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:46.181224+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 109535232 unmapped: 30457856 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:47.181381+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 117 ms_handle_reset con 0x55d7bc34e400 session 0x55d7bcf45680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101203968 unmapped: 38789120 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:48.181601+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1117215 data_alloc: 184549376 data_used: 6115328
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101203968 unmapped: 38789120 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 117 heartbeat osd_stat(store_statfs(0x1b8b4a000/0x0/0x1bfc00000, data 0x2e8ab1b/0x2f42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:49.181759+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 118 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba5af0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101212160 unmapped: 38780928 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:50.181958+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101212160 unmapped: 38780928 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 119 ms_handle_reset con 0x55d7b9741000 session 0x55d7b9a47a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:51.182082+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101253120 unmapped: 38739968 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:52.182183+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101253120 unmapped: 38739968 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 119 heartbeat osd_stat(store_statfs(0x1b9744000/0x0/0x1bfc00000, data 0x1e8f283/0x1f49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:53.182308+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1014300 data_alloc: 184549376 data_used: 6115328
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101253120 unmapped: 38739968 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:54.182441+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101253120 unmapped: 38739968 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 119 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:55.183022+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 120 ms_handle_reset con 0x55d7ba944400 session 0x55d7bcf443c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101285888 unmapped: 38707200 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.199265480s of 10.425443649s, submitted: 52
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:56.183145+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 121 ms_handle_reset con 0x55d7bbc05400 session 0x55d7bb90f4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101376000 unmapped: 38617088 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:57.183271+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 122 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7b9abda40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 122 heartbeat osd_stat(store_statfs(0x1b9731000/0x0/0x1bfc00000, data 0x1e976b4/0x1f5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101425152 unmapped: 38567936 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:58.183423+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1045961 data_alloc: 184549376 data_used: 6139904
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101466112 unmapped: 38526976 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:01:59.183557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 124 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd3e4d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101474304 unmapped: 38518784 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:00.183736+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 124 ms_handle_reset con 0x55d7b9741000 session 0x55d7bd3e50e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101539840 unmapped: 38453248 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 125 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd3e54a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 125 heartbeat osd_stat(store_statfs(0x1b9725000/0x0/0x1bfc00000, data 0x1e9bf2e/0x1f66000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:01.183953+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 101539840 unmapped: 38453248 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:02.184092+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 125 heartbeat osd_stat(store_statfs(0x1b9722000/0x0/0x1bfc00000, data 0x1e9e2f8/0x1f6b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 126 ms_handle_reset con 0x55d7bbc05400 session 0x55d7bd3e5680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102596608 unmapped: 37396480 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:03.184278+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 126 heartbeat osd_stat(store_statfs(0x1b9720000/0x0/0x1bfc00000, data 0x1ea06a4/0x1f6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1055578 data_alloc: 184549376 data_used: 6078464
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102596608 unmapped: 37396480 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:04.184442+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 127 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7bd3e5a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102694912 unmapped: 37298176 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 128 heartbeat osd_stat(store_statfs(0x1b971b000/0x0/0x1bfc00000, data 0x1ea2bb1/0x1f72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:05.184587+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 128 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd3e5c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119529472 unmapped: 20463616 heap: 139993088 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.402566910s of 10.012523651s, submitted: 138
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:06.184764+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34f400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102948864 unmapped: 41246720 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 128 ms_handle_reset con 0x55d7bc34f400 session 0x55d7bb694960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:07.184977+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110837760 unmapped: 33357824 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:08.185185+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 130 ms_handle_reset con 0x55d7bbc05400 session 0x55d7ba5b10e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1826473 data_alloc: 184549376 data_used: 6098944
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102653952 unmapped: 41541632 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34f000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:09.185297+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 102883328 unmapped: 41312256 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:10.185463+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 131 ms_handle_reset con 0x55d7bc34f000 session 0x55d7bcdb3860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 131 heartbeat osd_stat(store_statfs(0x1afb0f000/0x0/0x1bfc00000, data 0xbaaacbd/0xbb7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34fc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111689728 unmapped: 32505856 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:11.185579+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 131 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 103727104 unmapped: 40468480 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 132 heartbeat osd_stat(store_statfs(0x1ad710000/0x0/0x1bfc00000, data 0xdeaac9a/0xdf7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:12.185706+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 132 ms_handle_reset con 0x55d7bc34fc00 session 0x55d7bcdb3a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 112287744 unmapped: 31907840 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:13.185862+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2897117 data_alloc: 184549376 data_used: 6123520
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 108511232 unmapped: 35684352 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:14.186005+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 108863488 unmapped: 35332096 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:15.186174+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 104841216 unmapped: 39354368 heap: 144195584 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:16.186352+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.497295380s of 10.080521584s, submitted: 213
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 105275392 unmapped: 43122688 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:17.186475+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 133 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcdb3c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 133 heartbeat osd_stat(store_statfs(0x1a170a000/0x0/0x1bfc00000, data 0x19eae82e/0x19f84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 109723648 unmapped: 38674432 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:18.186633+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4147917 data_alloc: 184549376 data_used: 6135808
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122806272 unmapped: 25591808 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7bbc05400 session 0x55d7bcdb3e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:19.186793+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd3e5e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7b9741000 session 0x55d7b9b29680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106217472 unmapped: 42180608 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34f000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:20.186967+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7bc34f000 session 0x55d7bcf44b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 heartbeat osd_stat(store_statfs(0x19cb05000/0x0/0x1bfc00000, data 0x1eab0a7c/0x1eb88000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106258432 unmapped: 42139648 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:21.187094+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 heartbeat osd_stat(store_statfs(0x1abb06000/0x0/0x1bfc00000, data 0xeeb0f18/0xef88000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [1,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7b9741000 session 0x55d7bcf445a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106389504 unmapped: 42008576 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:22.187339+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106430464 unmapped: 41967616 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7ba944400 session 0x55d7be1c0960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcdb2f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7ba944000 session 0x55d7bda6e3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:23.187595+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc05400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 ms_handle_reset con 0x55d7bbc05400 session 0x55d7b9118960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1209663 data_alloc: 184549376 data_used: 6135808
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106471424 unmapped: 41926656 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 134 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 135 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bda6f0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:24.187754+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106536960 unmapped: 41861120 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 135 ms_handle_reset con 0x55d7b9741000 session 0x55d7b99c32c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:25.187926+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 135 ms_handle_reset con 0x55d7ba944000 session 0x55d7bcf44d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106569728 unmapped: 41828352 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:26.188249+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106569728 unmapped: 41828352 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:27.188387+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 136 heartbeat osd_stat(store_statfs(0x1b96fe000/0x0/0x1bfc00000, data 0x1eb514a/0x1f8e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 136 ms_handle_reset con 0x55d7ba944400 session 0x55d7b9b29680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106569728 unmapped: 41828352 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.254441261s of 11.698754311s, submitted: 270
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:28.188588+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34f400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 137 ms_handle_reset con 0x55d7bc34f400 session 0x55d7bd3e4d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1219793 data_alloc: 184549376 data_used: 6148096
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106627072 unmapped: 41771008 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:29.188735+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106627072 unmapped: 41771008 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:30.188914+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd3e5680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106635264 unmapped: 41762816 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:31.189054+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 heartbeat osd_stat(store_statfs(0x1b96f6000/0x0/0x1bfc00000, data 0x1eb9924/0x1f97000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 ms_handle_reset con 0x55d7b9741000 session 0x55d7bcdb3860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 ms_handle_reset con 0x55d7ba944000 session 0x55d7bcdb3e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106651648 unmapped: 41746432 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:32.189205+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 ms_handle_reset con 0x55d7ba944400 session 0x55d7b9118780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc438c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 ms_handle_reset con 0x55d7bc438c00 session 0x55d7ba3c0f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106692608 unmapped: 41705472 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:33.189325+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 139 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd1a90e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1226773 data_alloc: 184549376 data_used: 6152192
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106733568 unmapped: 41664512 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 140 ms_handle_reset con 0x55d7b9741000 session 0x55d7bd1a92c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:34.189460+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 140 heartbeat osd_stat(store_statfs(0x1b96ed000/0x0/0x1bfc00000, data 0x1ebdf2e/0x1f9f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106733568 unmapped: 41664512 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 140 ms_handle_reset con 0x55d7ba944000 session 0x55d7bd1a9680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:35.189645+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106741760 unmapped: 41656320 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:36.189834+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 140 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd1a9e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106741760 unmapped: 41656320 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:37.189975+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb0a800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7bbb0a800 session 0x55d7ba41fc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 heartbeat osd_stat(store_statfs(0x1b96ee000/0x0/0x1bfc00000, data 0x1ebdecc/0x1f9e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106766336 unmapped: 41631744 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:38.190155+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.404092789s of 10.818413734s, submitted: 131
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1238768 data_alloc: 184549376 data_used: 6160384
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb6950e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd3e4960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106815488 unmapped: 41582592 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:39.190295+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7ba944000 session 0x55d7bb71c5a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106823680 unmapped: 41574400 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:40.190501+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7ba944400 session 0x55d7bcdb3e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106872832 unmapped: 41525248 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bb7c7680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:41.190657+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 ms_handle_reset con 0x55d7bbc02000 session 0x55d7ba92d2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 heartbeat osd_stat(store_statfs(0x1b96eb000/0x0/0x1bfc00000, data 0x1ec02b2/0x1fa3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106930176 unmapped: 41467904 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 142 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd1a9860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:42.190806+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106954752 unmapped: 41443328 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7b9741000 session 0x55d7be1c10e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:43.191018+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7ba944000 session 0x55d7be1c1c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb695680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1244277 data_alloc: 184549376 data_used: 6189056
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd1a94a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106962944 unmapped: 41435136 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7bbc02400 session 0x55d7ba92d4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:44.191174+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7b9741000 session 0x55d7bd3e41e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106962944 unmapped: 41435136 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:45.191360+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7ba944000 session 0x55d7b9b292c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 heartbeat osd_stat(store_statfs(0x1b96e3000/0x0/0x1bfc00000, data 0x1ec49d9/0x1faa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106979328 unmapped: 41418752 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:46.191538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7bbc02000 session 0x55d7b9b29c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 106987520 unmapped: 41410560 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:47.191686+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9b294a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107053056 unmapped: 41345024 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 ms_handle_reset con 0x55d7b9741000 session 0x55d7bb945c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:48.191941+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 handle_osd_map epochs [144,145], i have 145, src has [1,145]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1248441 data_alloc: 184549376 data_used: 6205440
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107061248 unmapped: 41336832 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:49.192071+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.103990555s of 10.520611763s, submitted: 137
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 ms_handle_reset con 0x55d7bbc02400 session 0x55d7ba92d0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 ms_handle_reset con 0x55d7ba944000 session 0x55d7bdb36000
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: pgmap v775: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.49704 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.69230 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.59245 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3626344404' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1217552270' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4147579157' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/4251710309' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/37439561' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2776723438' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1526207643' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4004221687' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1303286139' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/4217072223' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2474890893' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/78207861' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/545459147' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/847359979' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/840506094' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/411863729' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcdc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107143168 unmapped: 41254912 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:50.192226+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 ms_handle_reset con 0x55d7bcfcdc00 session 0x55d7bdb365a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107143168 unmapped: 41254912 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:51.192457+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 heartbeat osd_stat(store_statfs(0x1b96dd000/0x0/0x1bfc00000, data 0x1ec8fff/0x1fb1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107143168 unmapped: 41254912 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:52.192611+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107151360 unmapped: 41246720 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:53.211113+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 heartbeat osd_stat(store_statfs(0x1b96d8000/0x0/0x1bfc00000, data 0x1ecb24d/0x1fb5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1253763 data_alloc: 184549376 data_used: 6283264
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bdb36960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107167744 unmapped: 41230336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:54.211305+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 ms_handle_reset con 0x55d7b9741000 session 0x55d7bdb36d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 ms_handle_reset con 0x55d7ba944000 session 0x55d7bdb374a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107487232 unmapped: 40910848 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:55.211478+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107487232 unmapped: 40910848 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:56.214104+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 heartbeat osd_stat(store_statfs(0x1b8b7e000/0x0/0x1bfc00000, data 0x2a252af/0x2b10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bdb37860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcd800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 107601920 unmapped: 40796160 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:57.214274+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcd400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcd000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111329280 unmapped: 37068800 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:58.214434+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 147 ms_handle_reset con 0x55d7bcfcd400 session 0x55d7bdb36b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1424993 data_alloc: 184549376 data_used: 6295552
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 108331008 unmapped: 40067072 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 148 ms_handle_reset con 0x55d7bcfcd000 session 0x55d7bb9454a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:02:59.214581+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 148 ms_handle_reset con 0x55d7bcfcd800 session 0x55d7bdb37a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 108331008 unmapped: 40067072 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.968194962s of 10.854300499s, submitted: 102
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:00.214734+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 108339200 unmapped: 40058880 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 150 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bdb37e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 150 heartbeat osd_stat(store_statfs(0x1b8246000/0x0/0x1bfc00000, data 0x3352d59/0x3445000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:01.214912+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 109428736 unmapped: 38969344 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 45
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9741000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 150 ms_handle_reset con 0x55d7ba944000 session 0x55d7be1c1680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:02.215070+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 150 ms_handle_reset con 0x55d7b9741000 session 0x55d7b911a960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 150 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bbadeb40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111149056 unmapped: 37249024 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 151 ms_handle_reset con 0x55d7ba944000 session 0x55d7bd1a8000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:03.215230+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcc400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 151 ms_handle_reset con 0x55d7bcfcc400 session 0x55d7bd1a8d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc2d6000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 151 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bcf454a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1508821 data_alloc: 184549376 data_used: 6295552
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110411776 unmapped: 37986304 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 151 ms_handle_reset con 0x55d7bc2d6000 session 0x55d7ba56e000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:04.215470+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110411776 unmapped: 37986304 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:05.215645+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 152 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba5b01e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 handle_osd_map epochs [151,153], i have 153, src has [1,153]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 ms_handle_reset con 0x55d7ba944000 session 0x55d7bda6e000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110510080 unmapped: 37888000 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:06.215783+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bcdb23c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcc400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc2d6c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 ms_handle_reset con 0x55d7bcfcc400 session 0x55d7b967d680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b822d000/0x0/0x1bfc00000, data 0x3362b6f/0x345d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110354432 unmapped: 38043648 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:07.215941+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 154 ms_handle_reset con 0x55d7bc2d6c00 session 0x55d7b9b28f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110436352 unmapped: 37961728 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:08.216126+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1389891 data_alloc: 184549376 data_used: 6307840
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110632960 unmapped: 37765120 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 156 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba3c01e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:09.216274+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 156 heartbeat osd_stat(store_statfs(0x1b8b3c000/0x0/0x1bfc00000, data 0x2a4fbf1/0x2b4d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 156 ms_handle_reset con 0x55d7ba944000 session 0x55d7ba5b03c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 156 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bcdb2b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110665728 unmapped: 37732352 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.269522667s of 10.002008438s, submitted: 240
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:10.216410+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcc400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110731264 unmapped: 37666816 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:11.216647+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 157 heartbeat osd_stat(store_statfs(0x1b968d000/0x0/0x1bfc00000, data 0x1f002de/0x1fff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 158 ms_handle_reset con 0x55d7bcfcc400 session 0x55d7ba94de00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110739456 unmapped: 37658624 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 158 heartbeat osd_stat(store_statfs(0x1b968d000/0x0/0x1bfc00000, data 0x1f002de/0x1fff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:12.216846+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7d800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 158 ms_handle_reset con 0x55d7bbe7d800 session 0x55d7be1c1680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 46
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110714880 unmapped: 37683200 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 159 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b967cf00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:13.217015+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 159 ms_handle_reset con 0x55d7ba944000 session 0x55d7b911a960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1323694 data_alloc: 184549376 data_used: 6336512
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110764032 unmapped: 37634048 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:14.217160+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 159 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bdb36d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110780416 unmapped: 37617664 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:15.217339+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110780416 unmapped: 37617664 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:16.217540+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110780416 unmapped: 37617664 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:17.217724+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 159 heartbeat osd_stat(store_statfs(0x1b9686000/0x0/0x1bfc00000, data 0x1f0595f/0x2007000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 159 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110788608 unmapped: 37609472 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:18.217936+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1325234 data_alloc: 184549376 data_used: 6332416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110862336 unmapped: 37535744 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:19.218129+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111017984 unmapped: 37380096 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:20.218288+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111017984 unmapped: 37380096 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:21.218448+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 160 heartbeat osd_stat(store_statfs(0x1b91aa000/0x0/0x1bfc00000, data 0x1fdfe38/0x20e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111017984 unmapped: 37380096 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:22.218597+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcc400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.013717651s of 12.430390358s, submitted: 155
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 160 ms_handle_reset con 0x55d7bcfcc400 session 0x55d7b99c21e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110993408 unmapped: 37404672 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:23.218775+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1ea400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1338868 data_alloc: 184549376 data_used: 6332416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 160 handle_osd_map epochs [160,161], i have 160, src has [1,161]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 161 ms_handle_reset con 0x55d7be1ea400 session 0x55d7ba56fc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110993408 unmapped: 37404672 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:24.218940+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 161 heartbeat osd_stat(store_statfs(0x1b91a4000/0x0/0x1bfc00000, data 0x1fe21b0/0x20e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 110993408 unmapped: 37404672 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:25.219083+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 161 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bd1a9c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 162 ms_handle_reset con 0x55d7be1eb000 session 0x55d7ba56e780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111214592 unmapped: 37183488 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:26.219218+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eac00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 162 ms_handle_reset con 0x55d7be1eac00 session 0x55d7bd1a8b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 162 ms_handle_reset con 0x55d7ba944400 session 0x55d7b99c2960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:27.219366+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:28.219570+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1348421 data_alloc: 184549376 data_used: 6356992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:29.219712+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:30.219860+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 162 heartbeat osd_stat(store_statfs(0x1b9176000/0x0/0x1bfc00000, data 0x200f33c/0x2117000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:31.220003+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:32.220133+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 162 heartbeat osd_stat(store_statfs(0x1b9176000/0x0/0x1bfc00000, data 0x200f33c/0x2117000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111255552 unmapped: 37142528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:33.220274+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.516347885s of 10.761432648s, submitted: 61
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1351071 data_alloc: 184549376 data_used: 6356992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111263744 unmapped: 37134336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:34.220400+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b9172000/0x0/0x1bfc00000, data 0x201158a/0x211b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:35.220586+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111263744 unmapped: 37134336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:36.220719+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111263744 unmapped: 37134336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:37.220854+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111263744 unmapped: 37134336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:38.221103+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 111263744 unmapped: 37134336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1387085 data_alloc: 184549376 data_used: 6356992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:39.221259+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 114704384 unmapped: 33693696 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b8b9d000/0x0/0x1bfc00000, data 0x25e758a/0x26f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:40.221433+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115351552 unmapped: 33046528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:41.221629+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113942528 unmapped: 34455552 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:42.221756+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113942528 unmapped: 34455552 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:43.221950+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113958912 unmapped: 34439168 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b8b91000/0x0/0x1bfc00000, data 0x25f358a/0x26fd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1397065 data_alloc: 184549376 data_used: 6356992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:44.222108+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113958912 unmapped: 34439168 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:45.222280+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113958912 unmapped: 34439168 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.107888222s of 12.339326859s, submitted: 66
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 ms_handle_reset con 0x55d7ba944000 session 0x55d7bb90f2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b8b91000/0x0/0x1bfc00000, data 0x25f358a/0x26fd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:46.222480+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113188864 unmapped: 35209216 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 heartbeat osd_stat(store_statfs(0x1b8b8f000/0x0/0x1bfc00000, data 0x25f35fc/0x26ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:47.223185+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113188864 unmapped: 35209216 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 164 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb6be1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eac00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:48.223344+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113516544 unmapped: 34881536 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 164 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba92da40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 164 heartbeat osd_stat(store_statfs(0x1b8b8a000/0x0/0x1bfc00000, data 0x25f5964/0x2703000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1403823 data_alloc: 184549376 data_used: 6369280
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:49.223525+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113532928 unmapped: 34865152 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 164 handle_osd_map epochs [164,165], i have 164, src has [1,165]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 165 ms_handle_reset con 0x55d7be1eb000 session 0x55d7bb6954a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 165 ms_handle_reset con 0x55d7be1eac00 session 0x55d7ba56f0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:50.223681+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113541120 unmapped: 34856960 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 165 heartbeat osd_stat(store_statfs(0x1b8b85000/0x0/0x1bfc00000, data 0x25f7d82/0x2708000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:51.223817+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113557504 unmapped: 34840576 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 165 handle_osd_map epochs [165,166], i have 165, src has [1,166]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 166 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bbadfe00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 166 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bb6d2d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 166 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba5b0780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:52.223989+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 113623040 unmapped: 34775040 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 167 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb71da40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eac00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 167 ms_handle_reset con 0x55d7be1eac00 session 0x55d7bd1a85a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:53.224162+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115531776 unmapped: 32866304 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfcc800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 167 ms_handle_reset con 0x55d7bcfcc800 session 0x55d7bd3e4f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 167 handle_osd_map epochs [167,168], i have 167, src has [1,168]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 ms_handle_reset con 0x55d7be1eb000 session 0x55d7bb7c7680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1540319 data_alloc: 184549376 data_used: 6369280
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bb6d2780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:54.224307+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115654656 unmapped: 32743424 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:55.224472+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115687424 unmapped: 32710656 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bd1a85a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eac00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba94d4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7b9b28d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.158756256s of 10.003108025s, submitted: 195
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 169 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba41fc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 169 ms_handle_reset con 0x55d7be1eac00 session 0x55d7bb71da40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 169 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bcdb25a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:56.224701+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115802112 unmapped: 32595968 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 169 heartbeat osd_stat(store_statfs(0x1b8b23000/0x0/0x1bfc00000, data 0x26091e6/0x2721000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 170 ms_handle_reset con 0x55d7ba944400 session 0x55d7bcdb2f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:57.225065+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115802112 unmapped: 32595968 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 170 ms_handle_reset con 0x55d7bbc02400 session 0x55d7ba5b1860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 171 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba3c01e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:58.225260+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 115867648 unmapped: 32530432 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 172 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bb6bfe00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 172 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1463211 data_alloc: 184549376 data_used: 6393856
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 173 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba41ef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 173 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bb7c7680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:03:59.225445+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116056064 unmapped: 32342016 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 173 heartbeat osd_stat(store_statfs(0x1b8b43000/0x0/0x1bfc00000, data 0x2625e17/0x2747000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 47
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:00.225636+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116080640 unmapped: 32317440 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eac00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 173 ms_handle_reset con 0x55d7be1eac00 session 0x55d7ba94c960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 173 heartbeat osd_stat(store_statfs(0x1b8b43000/0x0/0x1bfc00000, data 0x2625e17/0x2747000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 173 handle_osd_map epochs [173,174], i have 173, src has [1,174]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:01.225851+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116080640 unmapped: 32317440 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 174 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b9b29c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:02.226075+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116137984 unmapped: 32260096 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 175 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bce06f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 175 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bb6d32c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 175 heartbeat osd_stat(store_statfs(0x1b8b3d000/0x0/0x1bfc00000, data 0x262a5af/0x274f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 175 handle_osd_map epochs [175,176], i have 175, src has [1,176]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 176 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba92d0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:03.226206+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 176 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7bdb361e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116170752 unmapped: 32227328 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1473852 data_alloc: 184549376 data_used: 6406144
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 176 handle_osd_map epochs [176,177], i have 176, src has [1,177]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:04.226308+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 177 ms_handle_reset con 0x55d7ba944400 session 0x55d7bce07a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116236288 unmapped: 32161792 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 177 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bcdcc1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:05.226443+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116293632 unmapped: 32104448 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 178 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bcdcc3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.843595505s of 10.029370308s, submitted: 344
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:06.226599+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 178 handle_osd_map epochs [178,179], i have 178, src has [1,179]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116375552 unmapped: 32022528 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:07.226772+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116498432 unmapped: 31899648 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 179 heartbeat osd_stat(store_statfs(0x1b8b1b000/0x0/0x1bfc00000, data 0x264a159/0x2773000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:08.227074+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116547584 unmapped: 31850496 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1435862 data_alloc: 184549376 data_used: 6434816
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:09.227232+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 180 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7bd4585a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 116613120 unmapped: 31784960 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:10.227376+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 181 ms_handle_reset con 0x55d7be1eb000 session 0x55d7bd458780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 117694464 unmapped: 30703616 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:11.227586+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 117694464 unmapped: 30703616 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 182 heartbeat osd_stat(store_statfs(0x1b91e9000/0x0/0x1bfc00000, data 0x1f79be7/0x20a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:12.227946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118743040 unmapped: 29655040 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 182 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd458960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:13.228138+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 117710848 unmapped: 30687232 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1440813 data_alloc: 184549376 data_used: 6459392
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:14.228279+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118759424 unmapped: 29638656 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 183 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bd458b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:15.228425+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118775808 unmapped: 29622272 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.918422699s of 10.464485168s, submitted: 200
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:16.228557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118824960 unmapped: 29573120 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 185 heartbeat osd_stat(store_statfs(0x1b91d6000/0x0/0x1bfc00000, data 0x1f86cea/0x20b7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:17.228907+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118865920 unmapped: 29532160 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 186 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bd458f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:18.229079+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118964224 unmapped: 29433856 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1450936 data_alloc: 184549376 data_used: 6459392
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:19.229226+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118964224 unmapped: 29433856 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:20.229466+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b91c3000/0x0/0x1bfc00000, data 0x1f97181/0x20cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118964224 unmapped: 29433856 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:21.229607+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118996992 unmapped: 29401088 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:22.229806+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 118996992 unmapped: 29401088 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:23.229956+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119046144 unmapped: 29351936 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1454828 data_alloc: 184549376 data_used: 6471680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:24.230145+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119046144 unmapped: 29351936 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:25.230373+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119046144 unmapped: 29351936 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 heartbeat osd_stat(store_statfs(0x1b91ad000/0x0/0x1bfc00000, data 0x1faa08d/0x20e0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:26.230534+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119046144 unmapped: 29351936 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:27.230695+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119046144 unmapped: 29351936 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.066532135s of 11.310238838s, submitted: 91
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:28.230988+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119087104 unmapped: 29310976 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 heartbeat osd_stat(store_statfs(0x1b91a1000/0x0/0x1bfc00000, data 0x1fb7052/0x20ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7bd4592c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7be1eb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1456372 data_alloc: 184549376 data_used: 6471680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:29.231165+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119103488 unmapped: 29294592 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 ms_handle_reset con 0x55d7be1eb000 session 0x55d7bd4594a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:30.231356+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119111680 unmapped: 29286400 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:31.231554+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119111680 unmapped: 29286400 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:32.231713+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119111680 unmapped: 29286400 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd459860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:33.231837+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122724352 unmapped: 25673728 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bd459c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bd458780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 heartbeat osd_stat(store_statfs(0x1b9032000/0x0/0x1bfc00000, data 0x21254da/0x225c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1537538 data_alloc: 184549376 data_used: 6471680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:34.231986+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119545856 unmapped: 28852224 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:35.232140+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119545856 unmapped: 28852224 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:36.232331+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119545856 unmapped: 28852224 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7bd458960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:37.232493+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 heartbeat osd_stat(store_statfs(0x1b8889000/0x0/0x1bfc00000, data 0x28cbb6a/0x2a04000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119455744 unmapped: 28942336 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:38.232805+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbcbd000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.372467995s of 10.800517082s, submitted: 99
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119480320 unmapped: 28917760 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 ms_handle_reset con 0x55d7bbcbd000 session 0x55d7bd3e54a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:39.232974+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1543572 data_alloc: 184549376 data_used: 6483968
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119488512 unmapped: 28909568 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd3e5c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:40.233125+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119488512 unmapped: 28909568 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 heartbeat osd_stat(store_statfs(0x1b887d000/0x0/0x1bfc00000, data 0x28d7355/0x2a10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:41.233273+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 heartbeat osd_stat(store_statfs(0x1b887d000/0x0/0x1bfc00000, data 0x28d7355/0x2a10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119488512 unmapped: 28909568 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:42.233440+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119488512 unmapped: 28909568 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b8875000/0x0/0x1bfc00000, data 0x28df70d/0x2a19000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bd3e50e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bd3e4d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:43.233621+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119504896 unmapped: 28893184 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7bcdb3860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:44.233750+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9540c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1552539 data_alloc: 184549376 data_used: 6496256
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7b9540c00 session 0x55d7bcf44d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119513088 unmapped: 28884992 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7ba944400 session 0x55d7bce07a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:45.234012+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119521280 unmapped: 28876800 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:46.234213+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119521280 unmapped: 28876800 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bbc02000 session 0x55d7b967de00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:47.234474+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 119529472 unmapped: 28868608 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:48.234717+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.741504669s of 10.000480652s, submitted: 68
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120619008 unmapped: 27779072 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b8867000/0x0/0x1bfc00000, data 0x28ea59d/0x2a27000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:49.234875+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1550996 data_alloc: 184549376 data_used: 6496256
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bbc02400 session 0x55d7ba94c1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120619008 unmapped: 27779072 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b885d000/0x0/0x1bfc00000, data 0x28f4dbc/0x2a31000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:50.234993+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120627200 unmapped: 27770880 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7ba41fc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:51.235166+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120676352 unmapped: 27721728 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9541800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b8857000/0x0/0x1bfc00000, data 0x28fbfdd/0x2a37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7b9541800 session 0x55d7ba41ef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7ba944400 session 0x55d7bcdb2f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:52.235314+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b8857000/0x0/0x1bfc00000, data 0x28fbfdd/0x2a37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120741888 unmapped: 27656192 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 ms_handle_reset con 0x55d7bbc02000 session 0x55d7be1c0000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:53.235440+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120774656 unmapped: 27623424 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:54.235561+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1564439 data_alloc: 184549376 data_used: 6508544
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 191 ms_handle_reset con 0x55d7bbc02400 session 0x55d7be1c14a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 191 heartbeat osd_stat(store_statfs(0x1b8843000/0x0/0x1bfc00000, data 0x290d63b/0x2a4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120774656 unmapped: 27623424 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 191 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7be1c0f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:55.235751+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120733696 unmapped: 27664384 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc06000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 191 ms_handle_reset con 0x55d7bbc06000 session 0x55d7bbadf2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:56.235934+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 192 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bcf44f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120758272 unmapped: 27639808 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:57.236100+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc06000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120766464 unmapped: 27631616 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 192 ms_handle_reset con 0x55d7bbc06000 session 0x55d7bcdcc960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 192 ms_handle_reset con 0x55d7ba944400 session 0x55d7bbadf860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 192 heartbeat osd_stat(store_statfs(0x1b882c000/0x0/0x1bfc00000, data 0x2920617/0x2a61000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 192 handle_osd_map epochs [192,193], i have 192, src has [1,193]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 193 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bb945c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:58.236289+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.405726433s of 10.003749847s, submitted: 180
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120823808 unmapped: 27574272 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:04:59.236433+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1573790 data_alloc: 184549376 data_used: 6537216
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120881152 unmapped: 27516928 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:00.236584+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 195 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bb945e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 195 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb944960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120889344 unmapped: 27508736 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 195 heartbeat osd_stat(store_statfs(0x1b8414000/0x0/0x1bfc00000, data 0x2931177/0x2a77000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4d6f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:01.236781+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 195 ms_handle_reset con 0x55d7bb67b400 session 0x55d7b99c3680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120897536 unmapped: 27500544 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 196 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bce065a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:02.236963+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 196 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bd4594a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc06000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120938496 unmapped: 27459584 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 196 heartbeat osd_stat(store_statfs(0x1b840d000/0x0/0x1bfc00000, data 0x2937714/0x2a80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4d6f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 196 ms_handle_reset con 0x55d7bbc06000 session 0x55d7bd459860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:03.237100+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120971264 unmapped: 27426816 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:04.237255+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 198 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd458960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1589162 data_alloc: 184549376 data_used: 6549504
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 120979456 unmapped: 27418624 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 198 ms_handle_reset con 0x55d7bb67b400 session 0x55d7b99c3680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:05.237395+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 199 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bb945e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 121020416 unmapped: 27377664 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:06.237539+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 121044992 unmapped: 27353088 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 199 heartbeat osd_stat(store_statfs(0x1b724c000/0x0/0x1bfc00000, data 0x2951c3a/0x2aa0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:07.237766+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 200 ms_handle_reset con 0x55d7bbc02400 session 0x55d7b9b294a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 121053184 unmapped: 27344896 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc06000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 200 ms_handle_reset con 0x55d7bbc06000 session 0x55d7bd1a9e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:08.238116+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.437504768s of 10.000068665s, submitted: 177
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 121135104 unmapped: 27262976 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 201 ms_handle_reset con 0x55d7ba944400 session 0x55d7bbadf0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:09.238272+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 202 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bcdcc960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1611766 data_alloc: 184549376 data_used: 6549504
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122208256 unmapped: 26189824 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:10.238464+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 202 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bcf44f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122273792 unmapped: 26124288 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bcfccc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:11.238705+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 203 ms_handle_reset con 0x55d7bbc02400 session 0x55d7bbadf2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 203 heartbeat osd_stat(store_statfs(0x1b7225000/0x0/0x1bfc00000, data 0x29733ba/0x2ac9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122388480 unmapped: 26009600 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc34f000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:12.238848+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 203 handle_osd_map epochs [203,204], i have 203, src has [1,204]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 204 ms_handle_reset con 0x55d7bc34f000 session 0x55d7be1c0f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 204 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7bd3e4000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 204 ms_handle_reset con 0x55d7ba944400 session 0x55d7bcdb2f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122445824 unmapped: 25952256 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 204 ms_handle_reset con 0x55d7bbc02000 session 0x55d7b9119680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 204 ms_handle_reset con 0x55d7bb67b400 session 0x55d7ba94c1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:13.238988+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 205 ms_handle_reset con 0x55d7bbc02400 session 0x55d7ba3c0780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122421248 unmapped: 25976832 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:14.239123+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1620310 data_alloc: 184549376 data_used: 6569984
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122454016 unmapped: 25944064 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 206 ms_handle_reset con 0x55d7ba944400 session 0x55d7bef6fe00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:15.239402+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 206 heartbeat osd_stat(store_statfs(0x1b81ad000/0x0/0x1bfc00000, data 0x29c5762/0x2b1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122494976 unmapped: 25903104 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 206 heartbeat osd_stat(store_statfs(0x1b819a000/0x0/0x1bfc00000, data 0x29d8d25/0x2b31000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:16.239558+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122798080 unmapped: 25600000 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 207 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bda6fe00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 207 heartbeat osd_stat(store_statfs(0x1b818b000/0x0/0x1bfc00000, data 0x29e64df/0x2b42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:17.239990+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 208 ms_handle_reset con 0x55d7bbc02000 session 0x55d7bd3e43c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 122871808 unmapped: 25526272 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 208 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7ba3c0b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:18.240215+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.058491707s of 10.002174377s, submitted: 315
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 124076032 unmapped: 24322048 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fa000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 209 ms_handle_reset con 0x55d7bd9fa000 session 0x55d7ba92c960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 209 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba94c3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:19.240395+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1640345 data_alloc: 184549376 data_used: 6582272
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 209 heartbeat osd_stat(store_statfs(0x1b813b000/0x0/0x1bfc00000, data 0x2a2f68f/0x2b8e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 123330560 unmapped: 25067520 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:20.240623+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 123412480 unmapped: 24985600 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:21.240767+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 123469824 unmapped: 24928256 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 209 heartbeat osd_stat(store_statfs(0x1b810d000/0x0/0x1bfc00000, data 0x2a605a7/0x2bc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [0,0,1,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:22.240972+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 123592704 unmapped: 24805376 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:23.241142+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 210 ms_handle_reset con 0x55d7b9740c00 session 0x55d7b99c25a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 210 heartbeat osd_stat(store_statfs(0x1b80f6000/0x0/0x1bfc00000, data 0x2a778bd/0x2bd8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 125288448 unmapped: 23109632 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 48
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:24.241277+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1662591 data_alloc: 184549376 data_used: 6594560
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 125313024 unmapped: 23085056 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:25.241488+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126402560 unmapped: 21995520 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 211 heartbeat osd_stat(store_statfs(0x1b809f000/0x0/0x1bfc00000, data 0x2aca1a9/0x2c2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:26.241690+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126459904 unmapped: 21938176 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:27.242007+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126484480 unmapped: 21913600 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 211 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bcdcd2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:28.242267+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.442548752s of 10.001214027s, submitted: 363
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126468096 unmapped: 21929984 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:29.242455+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1676937 data_alloc: 184549376 data_used: 6606848
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126484480 unmapped: 21913600 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:30.242558+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126730240 unmapped: 21667840 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:31.242761+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 212 heartbeat osd_stat(store_statfs(0x1b8013000/0x0/0x1bfc00000, data 0x2b54f7a/0x2cba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126238720 unmapped: 22159360 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:32.242966+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 49
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 212 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7be1c12c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 126238720 unmapped: 22159360 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 212 heartbeat osd_stat(store_statfs(0x1b8015000/0x0/0x1bfc00000, data 0x2b54eb3/0x2cb9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:33.243131+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fa400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 213 ms_handle_reset con 0x55d7bd9fa400 session 0x55d7b911a5a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127565824 unmapped: 20832256 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:34.243288+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1686145 data_alloc: 184549376 data_used: 6619136
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127680512 unmapped: 20717568 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:35.243488+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 213 heartbeat osd_stat(store_statfs(0x1b7faa000/0x0/0x1bfc00000, data 0x2bbca2c/0x2d24000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 213 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb6bfc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127713280 unmapped: 20684800 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 213 ms_handle_reset con 0x55d7ba944400 session 0x55d7bd3e4780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:36.243624+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127721472 unmapped: 20676608 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 213 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bda6e3c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:37.243745+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127868928 unmapped: 20529152 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:38.243943+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.511963844s of 10.003737450s, submitted: 170
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fa800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7bcdcc000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 ms_handle_reset con 0x55d7bd9fa800 session 0x55d7bb6d2d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 heartbeat osd_stat(store_statfs(0x1b7fad000/0x0/0x1bfc00000, data 0x2bbc857/0x2d20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127926272 unmapped: 20471808 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba92dc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:39.244088+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1691602 data_alloc: 184549376 data_used: 6631424
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127959040 unmapped: 20439040 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:40.244236+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 128065536 unmapped: 20332544 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:41.244369+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 128065536 unmapped: 20332544 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:42.244528+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127328256 unmapped: 21069824 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:43.244690+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 127328256 unmapped: 21069824 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:44.244871+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 heartbeat osd_stat(store_statfs(0x1b7f6e000/0x0/0x1bfc00000, data 0x2bf91ad/0x2d60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1689074 data_alloc: 184549376 data_used: 6631424
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129531904 unmapped: 18866176 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:45.245054+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129531904 unmapped: 18866176 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:46.245222+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129531904 unmapped: 18866176 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 ms_handle_reset con 0x55d7ba944400 session 0x55d7bef6e1e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:47.245516+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129531904 unmapped: 18866176 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 heartbeat osd_stat(store_statfs(0x1b6dbd000/0x0/0x1bfc00000, data 0x2c08a0d/0x2d71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:48.245736+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.753311157s of 10.000692368s, submitted: 59
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 ms_handle_reset con 0x55d7bb67b400 session 0x55d7ba5b14a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129671168 unmapped: 18726912 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7bcdcc780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:49.245881+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1702249 data_alloc: 184549376 data_used: 6631424
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129843200 unmapped: 18554880 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fac00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 215 ms_handle_reset con 0x55d7bd9fac00 session 0x55d7bdb372c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 215 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcdccb40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:50.246077+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129908736 unmapped: 18489344 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 ms_handle_reset con 0x55d7bd9fb000 session 0x55d7bb6d2f00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 ms_handle_reset con 0x55d7ba944400 session 0x55d7b9abd4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:51.246223+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 handle_osd_map epochs [215,216], i have 216, src has [1,216]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bcdb32c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130990080 unmapped: 17408000 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:52.246386+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129990656 unmapped: 18407424 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7bda6e780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:53.246557+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 heartbeat osd_stat(store_statfs(0x1b6d5e000/0x0/0x1bfc00000, data 0x2c5f1eb/0x2dcb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 129990656 unmapped: 18407424 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:54.246733+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1714008 data_alloc: 184549376 data_used: 6656000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130007040 unmapped: 18391040 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 218 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bcf44b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 218 ms_handle_reset con 0x55d7ba944400 session 0x55d7bbadf680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 218 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7bb1fa780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 218 handle_osd_map epochs [217,218], i have 218, src has [1,218]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:55.247084+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 heartbeat osd_stat(store_statfs(0x1b6d59000/0x0/0x1bfc00000, data 0x2c638eb/0x2dd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 ms_handle_reset con 0x55d7bd9fb000 session 0x55d7bbadef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 ms_handle_reset con 0x55d7bd9fb400 session 0x55d7ba5af0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 ms_handle_reset con 0x55d7ba944400 session 0x55d7bcdcdc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130064384 unmapped: 18333696 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb6d3c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:56.247250+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130064384 unmapped: 18333696 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 ms_handle_reset con 0x55d7bb67b400 session 0x55d7ba5b1c20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbe7c400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:57.247450+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 ms_handle_reset con 0x55d7bd9fb000 session 0x55d7bcdcc5a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 ms_handle_reset con 0x55d7bbe7c400 session 0x55d7ba5b0780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 ms_handle_reset con 0x55d7bd9fb800 session 0x55d7ba5b0d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130088960 unmapped: 18309120 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb694b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb6d2000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:58.247658+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.071632385s of 10.064979553s, submitted: 218
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 heartbeat osd_stat(store_statfs(0x1b6d52000/0x0/0x1bfc00000, data 0x2c680d9/0x2ddc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 221 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bdb36000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130179072 unmapped: 18219008 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fbc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 221 ms_handle_reset con 0x55d7bd9fb000 session 0x55d7ba87f680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 221 ms_handle_reset con 0x55d7bd9fbc00 session 0x55d7bdb37860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:05:59.247843+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1728901 data_alloc: 184549376 data_used: 6680576
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130211840 unmapped: 18186240 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:00.247978+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130244608 unmapped: 18153472 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 222 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bdb363c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:01.248126+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130244608 unmapped: 18153472 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 223 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb1fa5a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:02.248274+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130244608 unmapped: 18153472 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 223 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bcdcd2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 224 heartbeat osd_stat(store_statfs(0x1b6d49000/0x0/0x1bfc00000, data 0x2c6ee57/0x2de4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:03.248423+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fb800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 224 ms_handle_reset con 0x55d7bd9fb800 session 0x55d7bd1a8d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130301952 unmapped: 18096128 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 224 ms_handle_reset con 0x55d7ba944400 session 0x55d7bda6f0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 224 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bda6ef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fbc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:04.248569+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bf048000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 225 ms_handle_reset con 0x55d7bd9fbc00 session 0x55d7b99c3680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7a400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 225 ms_handle_reset con 0x55d7bbb7a400 session 0x55d7bef6fc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1749037 data_alloc: 184549376 data_used: 6705152
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130342912 unmapped: 18055168 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 225 handle_osd_map epochs [225,226], i have 225, src has [1,226]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:05.248731+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 226 ms_handle_reset con 0x55d7bf048000 session 0x55d7bb7c7680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 226 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcdcde00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130367488 unmapped: 18030592 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:06.248872+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bf048000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 226 heartbeat osd_stat(store_statfs(0x1b6d38000/0x0/0x1bfc00000, data 0x2c75dc4/0x2df4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 130375680 unmapped: 18022400 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 226 ms_handle_reset con 0x55d7bf048000 session 0x55d7b9abcd20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:07.249425+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 227 ms_handle_reset con 0x55d7ba944400 session 0x55d7ba3c01e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 138788864 unmapped: 9609216 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:08.249616+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.629940987s of 10.177999496s, submitted: 230
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139542528 unmapped: 8855552 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:09.249753+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1981675 data_alloc: 184549376 data_used: 6701056
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 229 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7bcf441e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 229 ms_handle_reset con 0x55d7bb67b400 session 0x55d7bbade000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131145728 unmapped: 17252352 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:10.249954+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 229 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba5ae780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 230 heartbeat osd_stat(store_statfs(0x1b4d30000/0x0/0x1bfc00000, data 0x4c7c2c6/0x4dfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131227648 unmapped: 17170432 heap: 148398080 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:11.250122+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 230 ms_handle_reset con 0x55d7ba944400 session 0x55d7b967d4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131235840 unmapped: 25559040 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:12.250291+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 230 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7bb6be000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bf048000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131252224 unmapped: 25542656 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:13.250424+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 231 ms_handle_reset con 0x55d7bf048000 session 0x55d7ba87fa40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139681792 unmapped: 17113088 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:14.250585+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2206149 data_alloc: 184549376 data_used: 6721536
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131219456 unmapped: 25575424 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:15.250752+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131219456 unmapped: 25575424 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 231 heartbeat osd_stat(store_statfs(0x1b1928000/0x0/0x1bfc00000, data 0x7c80930/0x7e05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:16.250967+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131301376 unmapped: 25493504 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:17.251119+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 231 ms_handle_reset con 0x55d7bbb7b800 session 0x55d7bb7c7a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131301376 unmapped: 25493504 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:18.251295+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.505836487s of 10.027492523s, submitted: 122
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131325952 unmapped: 25468928 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:19.251438+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2430431 data_alloc: 184549376 data_used: 6721536
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 232 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bce070e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139739136 unmapped: 17055744 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:20.251608+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bf048000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 233 ms_handle_reset con 0x55d7bf048000 session 0x55d7bcdb2000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 233 heartbeat osd_stat(store_statfs(0x1b0123000/0x0/0x1bfc00000, data 0x9482bf1/0x960b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139821056 unmapped: 16973824 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:21.251780+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7bef6eb40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 ms_handle_reset con 0x55d7ba944400 session 0x55d7b911b0e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 131440640 unmapped: 25354240 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:22.251959+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7a400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 132513792 unmapped: 24281088 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fbc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 ms_handle_reset con 0x55d7bd9fbc00 session 0x55d7ba5b0d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:23.252457+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 133586944 unmapped: 23207936 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 235 ms_handle_reset con 0x55d7bbb7a400 session 0x55d7ba87f680
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:24.252855+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 235 heartbeat osd_stat(store_statfs(0x1ad113000/0x0/0x1bfc00000, data 0xc489ed0/0xc619000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 235 handle_osd_map epochs [235,236], i have 235, src has [1,236]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2834384 data_alloc: 184549376 data_used: 6746112
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 133586944 unmapped: 23207936 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:25.252995+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb90f2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb6bfc20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 133578752 unmapped: 23216128 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7ba92da40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:26.253142+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 handle_osd_map epochs [236,237], i have 237, src has [1,237]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 133586944 unmapped: 23207936 heap: 156794880 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:27.253272+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 142000128 unmapped: 23191552 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:28.253464+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 143073280 unmapped: 22118400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 238 heartbeat osd_stat(store_statfs(0x1ac10c000/0x0/0x1bfc00000, data 0xd490352/0xd621000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.565268517s of 10.432284355s, submitted: 210
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:29.253620+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 238 heartbeat osd_stat(store_statfs(0x1ac10c000/0x0/0x1bfc00000, data 0xd490352/0xd621000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3058605 data_alloc: 184549376 data_used: 6746112
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134668288 unmapped: 30523392 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:30.253766+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134668288 unmapped: 30523392 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:31.253932+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 238 heartbeat osd_stat(store_statfs(0x1aa909000/0x0/0x1bfc00000, data 0xec903a2/0xee23000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134676480 unmapped: 30515200 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bd9fbc00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:32.254053+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 238 ms_handle_reset con 0x55d7bd9fbc00 session 0x55d7ba92d860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134701056 unmapped: 30490624 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:33.254149+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 239 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba92d4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134733824 unmapped: 30457856 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:34.254291+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 239 heartbeat osd_stat(store_statfs(0x1a8107000/0x0/0x1bfc00000, data 0x114925a0/0x11625000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3445161 data_alloc: 184549376 data_used: 6758400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134733824 unmapped: 30457856 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 239 heartbeat osd_stat(store_statfs(0x1a7909000/0x0/0x1bfc00000, data 0x11c925a0/0x11e25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:35.254426+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 143163392 unmapped: 22028288 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:36.254585+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 239 heartbeat osd_stat(store_statfs(0x1a6909000/0x0/0x1bfc00000, data 0x12c925a0/0x12e25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134815744 unmapped: 30375936 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:37.254754+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 heartbeat osd_stat(store_statfs(0x1a6103000/0x0/0x1bfc00000, data 0x13494a66/0x1362b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134840320 unmapped: 30351360 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:38.254946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb1fad20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.490653038s of 10.004101753s, submitted: 94
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134864896 unmapped: 30326784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 heartbeat osd_stat(store_statfs(0x1a60fe000/0x0/0x1bfc00000, data 0x13496e37/0x1362f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:39.255115+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7a400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 ms_handle_reset con 0x55d7bbb7a400 session 0x55d7bb1fb860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 3678823 data_alloc: 184549376 data_used: 6782976
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134922240 unmapped: 30269440 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7b9b28d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:40.255269+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bf048000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134946816 unmapped: 30244864 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:41.255426+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 ms_handle_reset con 0x55d7bf048000 session 0x55d7bd3e4960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134963200 unmapped: 30228480 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:42.255562+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 134963200 unmapped: 30228480 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:43.255699+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136028160 unmapped: 29163520 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:44.255863+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bb1fad20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4008393 data_alloc: 184549376 data_used: 6795264
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 heartbeat osd_stat(store_statfs(0x1a28fa000/0x0/0x1bfc00000, data 0x16c98fbb/0x16e32000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136052736 unmapped: 29138944 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:45.256047+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb90f2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 20742144 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:46.256197+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7a400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 ms_handle_reset con 0x55d7bbb7a400 session 0x55d7bb6bef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136118272 unmapped: 29073408 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:47.256343+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 144531456 unmapped: 20660224 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:48.256520+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136167424 unmapped: 29024256 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:49.256664+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.889733315s of 10.519783020s, submitted: 79
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 4394540 data_alloc: 184549376 data_used: 6799360
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7bb71de00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:50.256788+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb6f5000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 ms_handle_reset con 0x55d7bb6f5000 session 0x55d7bcdb3a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 ms_handle_reset con 0x55d7bb67b800 session 0x55d7bef6e780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 heartbeat osd_stat(store_statfs(0x19f0fc000/0x0/0x1bfc00000, data 0x1a498f89/0x1a632000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136945664 unmapped: 28246016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:51.257096+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136945664 unmapped: 28246016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:52.257261+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137035776 unmapped: 28155904 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:53.257411+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcdb30e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 ms_handle_reset con 0x55d7bbc02000 session 0x55d7be1c0b40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb7c6960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 heartbeat osd_stat(store_statfs(0x19ee32000/0x0/0x1bfc00000, data 0x18f5c80b/0x190fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7a400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 ms_handle_reset con 0x55d7bbb7a400 session 0x55d7be1c10e0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136634368 unmapped: 28557312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:54.257564+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 heartbeat osd_stat(store_statfs(0x1b5e31000/0x0/0x1bfc00000, data 0x375c7d6/0x38fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 50
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 ms_handle_reset con 0x55d7b9740c00 session 0x55d7ba41ed20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1891991 data_alloc: 184549376 data_used: 6807552
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:55.259370+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:56.259517+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:57.259670+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 244 handle_osd_map epochs [244,245], i have 245, src has [1,245]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:58.259856+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 245 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb6bf2c0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:06:59.259983+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1897399 data_alloc: 184549376 data_used: 6819840
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.536033630s of 10.560031891s, submitted: 441
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 245 heartbeat osd_stat(store_statfs(0x1b68ef000/0x0/0x1bfc00000, data 0x2c9f92d/0x2e3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:00.260121+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:01.260239+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136200192 unmapped: 28991488 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:02.260380+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136224768 unmapped: 28966912 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:03.260532+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 246 ms_handle_reset con 0x55d7bb67b800 session 0x55d7bd458780
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbc02000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 246 heartbeat osd_stat(store_statfs(0x1b60ec000/0x0/0x1bfc00000, data 0x349f9da/0x3642000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [0,2])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136224768 unmapped: 28966912 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:04.260706+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 246 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7bbaded20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1965156 data_alloc: 184549376 data_used: 6848512
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 ms_handle_reset con 0x55d7bbc02000 session 0x55d7b99c3a40
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136249344 unmapped: 28942336 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:05.260855+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bcdcd4a0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 heartbeat osd_stat(store_statfs(0x1b68e4000/0x0/0x1bfc00000, data 0x2ca41b2/0x2e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136249344 unmapped: 28942336 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:06.261008+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 ms_handle_reset con 0x55d7ba944400 session 0x55d7bef6ef00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 ms_handle_reset con 0x55d7bb67b800 session 0x55d7bb90e960
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136298496 unmapped: 28893184 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:07.261166+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136298496 unmapped: 28893184 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:08.261366+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136306688 unmapped: 28884992 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:09.261543+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1907360 data_alloc: 184549376 data_used: 6844416
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136306688 unmapped: 28884992 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:10.261705+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.742587090s of 10.212919235s, submitted: 108
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 heartbeat osd_stat(store_statfs(0x1b68e8000/0x0/0x1bfc00000, data 0x2ca4084/0x2e46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136314880 unmapped: 28876800 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:11.261928+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bbb7b000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 ms_handle_reset con 0x55d7bbb7b000 session 0x55d7be1c1860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136314880 unmapped: 28876800 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:12.262071+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136314880 unmapped: 28876800 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:13.262437+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb6f4800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136314880 unmapped: 28876800 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:14.262613+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 249 ms_handle_reset con 0x55d7bb6f4800 session 0x55d7bd1a9860
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 249 heartbeat osd_stat(store_statfs(0x1b68e1000/0x0/0x1bfc00000, data 0x2ca62bd/0x2e4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1918443 data_alloc: 184549376 data_used: 6860800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136339456 unmapped: 28852224 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:15.262809+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7b9740c00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 249 ms_handle_reset con 0x55d7b9740c00 session 0x55d7bbade000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7ba944400
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136339456 unmapped: 28852224 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:16.263005+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 250 ms_handle_reset con 0x55d7ba944400 session 0x55d7bb944d20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136372224 unmapped: 28819456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:17.263156+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bb67b800
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 250 ms_handle_reset con 0x55d7bb67b800 session 0x55d7bcdcc000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136372224 unmapped: 28819456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:18.263343+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136372224 unmapped: 28819456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:19.263530+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1920709 data_alloc: 184549376 data_used: 6856704
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 250 heartbeat osd_stat(store_statfs(0x1b68d9000/0x0/0x1bfc00000, data 0x2caaa2b/0x2e52000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136372224 unmapped: 28819456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:20.263692+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.867943764s of 10.144845963s, submitted: 74
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136380416 unmapped: 28811264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:21.263844+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 136380416 unmapped: 28811264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:22.264035+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 250 handle_osd_map epochs [250,251], i have 250, src has [1,251]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137461760 unmapped: 27729920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:23.264186+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137494528 unmapped: 27697152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:24.264345+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b68d4000/0x0/0x1bfc00000, data 0x2cacd12/0x2e57000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1927567 data_alloc: 184549376 data_used: 6868992
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137494528 unmapped: 27697152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:25.264509+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137494528 unmapped: 27697152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:26.264675+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137494528 unmapped: 27697152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:27.264810+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137494528 unmapped: 27697152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:28.265001+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137502720 unmapped: 27688960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:29.265202+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1929803 data_alloc: 184549376 data_used: 6881280
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 252 heartbeat osd_stat(store_statfs(0x1b68d3000/0x0/0x1bfc00000, data 0x2caefe9/0x2e5a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137510912 unmapped: 27680768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:30.265391+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137510912 unmapped: 27680768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:31.265933+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.075522423s of 11.410874367s, submitted: 132
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:32.266137+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137510912 unmapped: 27680768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:33.266320+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137510912 unmapped: 27680768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:34.266531+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137519104 unmapped: 27672576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1929407 data_alloc: 184549376 data_used: 6881280
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:35.266683+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137519104 unmapped: 27672576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 252 heartbeat osd_stat(store_statfs(0x1b68d3000/0x0/0x1bfc00000, data 0x2caef4e/0x2e59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:36.266826+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137519104 unmapped: 27672576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:37.266998+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137707520 unmapped: 27484160 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:38.267278+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137928704 unmapped: 27262976 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:39.267471+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 137953280 unmapped: 27238400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1951575 data_alloc: 184549376 data_used: 6893568
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:40.267695+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 138117120 unmapped: 27074560 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:41.267942+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 138346496 unmapped: 26845184 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 254 heartbeat osd_stat(store_statfs(0x1b683f000/0x0/0x1bfc00000, data 0x2d400dc/0x2eef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.737618446s of 10.127841949s, submitted: 103
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:42.268157+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139501568 unmapped: 25690112 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:43.268360+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139542528 unmapped: 25649152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:44.268449+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139993088 unmapped: 25198592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1962369 data_alloc: 184549376 data_used: 6893568
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:45.268576+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140034048 unmapped: 25157632 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 254 heartbeat osd_stat(store_statfs(0x1b67a2000/0x0/0x1bfc00000, data 0x2dd8fcd/0x2f8a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:46.268738+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140189696 unmapped: 25001984 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:47.268929+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139681792 unmapped: 25509888 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:48.269150+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 139689984 unmapped: 25501696 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:49.269296+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141164544 unmapped: 24027136 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1964527 data_alloc: 184549376 data_used: 6905856
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:50.269444+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140558336 unmapped: 24633344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b6724000/0x0/0x1bfc00000, data 0x2e58f5c/0x300a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:51.269625+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140566528 unmapped: 24625152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.789386749s of 10.194364548s, submitted: 115
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:52.269795+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140673024 unmapped: 24518656 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b66b7000/0x0/0x1bfc00000, data 0x2ec31db/0x3075000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:53.269925+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140713984 unmapped: 24477696 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:54.270068+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140730368 unmapped: 24461312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1982719 data_alloc: 184549376 data_used: 6918144
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:55.270227+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 256 heartbeat osd_stat(store_statfs(0x1b667d000/0x0/0x1bfc00000, data 0x2efc6d6/0x30af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 140754944 unmapped: 24436736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:56.270393+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141205504 unmapped: 23986176 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:57.270591+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141385728 unmapped: 23805952 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:58.270789+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141246464 unmapped: 23945216 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 256 heartbeat osd_stat(store_statfs(0x1b65fc000/0x0/0x1bfc00000, data 0x2f7d34e/0x3130000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:07:59.270974+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141844480 unmapped: 23347200 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 256 heartbeat osd_stat(store_statfs(0x1b65c4000/0x0/0x1bfc00000, data 0x2fb719f/0x316a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1989853 data_alloc: 184549376 data_used: 6918144
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:00.271153+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141844480 unmapped: 23347200 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:01.271339+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 141901824 unmapped: 23289856 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 256 heartbeat osd_stat(store_statfs(0x1b65c4000/0x0/0x1bfc00000, data 0x2fb719f/0x316a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:02.271500+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.757401466s of 10.157071114s, submitted: 121
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 143269888 unmapped: 21921792 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:03.271709+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 256 handle_osd_map epochs [256,257], i have 256, src has [1,257]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 142303232 unmapped: 22888448 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:04.271968+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 143523840 unmapped: 21667840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b6126000/0x0/0x1bfc00000, data 0x3050484/0x3206000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 1997597 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:05.272169+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 143769600 unmapped: 21422080 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:06.272326+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 144039936 unmapped: 21151744 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:07.272533+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 144089088 unmapped: 21102592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b60be000/0x0/0x1bfc00000, data 0x30bb73e/0x3270000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:08.272752+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 145350656 unmapped: 19841024 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:09.272950+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 145375232 unmapped: 19816448 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2018467 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:10.273182+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 145375232 unmapped: 19816448 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:11.273335+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 20742144 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b6028000/0x0/0x1bfc00000, data 0x31511e0/0x3305000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:12.273528+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 144449536 unmapped: 20742144 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.679770470s of 10.122138977s, submitted: 111
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:13.273687+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 145620992 unmapped: 19570688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:14.273865+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b6003000/0x0/0x1bfc00000, data 0x31766ab/0x332a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 145981440 unmapped: 19210240 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2019991 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:15.274037+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 145981440 unmapped: 19210240 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:16.274190+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147120128 unmapped: 18071552 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:17.274404+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147415040 unmapped: 17776640 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:18.274637+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147152896 unmapped: 18038784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5f8f000/0x0/0x1bfc00000, data 0x31eaf12/0x339f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:19.274781+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147283968 unmapped: 17907712 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2039275 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:20.274964+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147603456 unmapped: 17588224 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:21.275149+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147726336 unmapped: 17465344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:22.275298+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 147726336 unmapped: 17465344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.014319420s of 10.401030540s, submitted: 85
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:23.275452+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 148914176 unmapped: 16277504 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5ebf000/0x0/0x1bfc00000, data 0x32b7864/0x346e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:24.275623+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 148922368 unmapped: 16269312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2051213 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:25.275824+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 148922368 unmapped: 16269312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5e74000/0x0/0x1bfc00000, data 0x32fe751/0x34b6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:26.275979+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 148324352 unmapped: 16867328 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5e39000/0x0/0x1bfc00000, data 0x333e9ce/0x34f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:27.276118+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 148332544 unmapped: 16859136 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5e2a000/0x0/0x1bfc00000, data 0x334da41/0x3504000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:28.276325+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 149651456 unmapped: 15540224 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:29.276499+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5dd6000/0x0/0x1bfc00000, data 0x339ff8d/0x3556000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,2])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150102016 unmapped: 15089664 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2053869 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:30.276663+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150102016 unmapped: 15089664 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:31.276866+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150118400 unmapped: 15073280 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:32.277000+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150429696 unmapped: 14761984 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.791955948s of 10.256890297s, submitted: 118
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:33.277168+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150118400 unmapped: 15073280 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:34.277325+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150265856 unmapped: 14925824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5d1c000/0x0/0x1bfc00000, data 0x345d61e/0x3612000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2060879 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:35.277449+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150732800 unmapped: 14458880 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:36.277604+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150831104 unmapped: 14360576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:37.277772+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150839296 unmapped: 14352384 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cbf000/0x0/0x1bfc00000, data 0x34b9ea3/0x366e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:38.277948+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150953984 unmapped: 14237696 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:39.278105+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cbc000/0x0/0x1bfc00000, data 0x34b9f9f/0x366f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150953984 unmapped: 14237696 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2063405 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:40.278247+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150953984 unmapped: 14237696 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:41.278404+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150962176 unmapped: 14229504 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cbe000/0x0/0x1bfc00000, data 0x34b9f68/0x366f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:42.278538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150962176 unmapped: 14229504 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:43.278692+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.856874466s of 10.179021835s, submitted: 73
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150962176 unmapped: 14229504 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:44.278829+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150970368 unmapped: 14221312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:45.278987+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2064225 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150970368 unmapped: 14221312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:46.279134+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150970368 unmapped: 14221312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:47.279946+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cbe000/0x0/0x1bfc00000, data 0x34b9ed4/0x366e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150970368 unmapped: 14221312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:48.280275+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150970368 unmapped: 14221312 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:49.280446+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:50.280785+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2063953 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:51.281340+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:52.281851+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cc0000/0x0/0x1bfc00000, data 0x34b9e0d/0x366d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:53.282112+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.857705116s of 10.000739098s, submitted: 27
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cc0000/0x0/0x1bfc00000, data 0x34b9e0d/0x366d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,1])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:54.282288+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:55.282523+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2061977 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cc0000/0x0/0x1bfc00000, data 0x34b9dda/0x366d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:56.282701+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:57.282955+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:58.283252+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150978560 unmapped: 14213120 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:08:59.283436+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150986752 unmapped: 14204928 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:00.283640+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2063937 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:01.284003+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cbf000/0x0/0x1bfc00000, data 0x34b9ea7/0x366e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:02.284175+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:03.284433+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.900843620s of 10.000195503s, submitted: 20
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:04.284621+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:05.284871+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2064951 data_alloc: 184549376 data_used: 6930432
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b5cbd000/0x0/0x1bfc00000, data 0x34b9ed0/0x366e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:06.285091+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:07.285302+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150994944 unmapped: 14196736 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:08.285529+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150724608 unmapped: 14467072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:09.285703+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150724608 unmapped: 14467072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:10.285846+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 258 heartbeat osd_stat(store_statfs(0x1b5cba000/0x0/0x1bfc00000, data 0x34bc33c/0x3673000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2070153 data_alloc: 184549376 data_used: 6942720
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 258 heartbeat osd_stat(store_statfs(0x1b5cba000/0x0/0x1bfc00000, data 0x34bc33c/0x3673000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150724608 unmapped: 14467072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:11.285980+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150724608 unmapped: 14467072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:12.286130+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150732800 unmapped: 14458880 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:13.286292+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.873456955s of 10.000396729s, submitted: 45
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150732800 unmapped: 14458880 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:14.286477+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150732800 unmapped: 14458880 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:15.286672+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2075939 data_alloc: 184549376 data_used: 6955008
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150740992 unmapped: 14450688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 259 heartbeat osd_stat(store_statfs(0x1b5cb6000/0x0/0x1bfc00000, data 0x34be5df/0x3677000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:16.286849+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150740992 unmapped: 14450688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:17.287001+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150740992 unmapped: 14450688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:18.287201+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150634496 unmapped: 14557184 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:19.287381+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150634496 unmapped: 14557184 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:20.287502+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2077987 data_alloc: 184549376 data_used: 6967296
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 260 heartbeat osd_stat(store_statfs(0x1b5cb1000/0x0/0x1bfc00000, data 0x34c092c/0x367b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150650880 unmapped: 14540800 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:21.287643+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150659072 unmapped: 14532608 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:22.287808+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150659072 unmapped: 14532608 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:23.288002+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.789683342s of 10.000141144s, submitted: 63
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 261 heartbeat osd_stat(store_statfs(0x1b5caf000/0x0/0x1bfc00000, data 0x34c2c6e/0x367e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150667264 unmapped: 14524416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:24.288184+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 261 heartbeat osd_stat(store_statfs(0x1b5cae000/0x0/0x1bfc00000, data 0x34c2d09/0x367f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150667264 unmapped: 14524416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:25.288520+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2083903 data_alloc: 184549376 data_used: 6979584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150667264 unmapped: 14524416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:26.288710+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150667264 unmapped: 14524416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:27.288855+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 150667264 unmapped: 14524416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:28.289036+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:29.289179+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:30.289319+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2088503 data_alloc: 184549376 data_used: 6979584
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 262 heartbeat osd_stat(store_statfs(0x1b5cab000/0x0/0x1bfc00000, data 0x34c4f27/0x3682000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:31.289515+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:32.289690+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 262 heartbeat osd_stat(store_statfs(0x1b5caa000/0x0/0x1bfc00000, data 0x34c4fc2/0x3683000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:33.289979+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.878720284s of 10.000403404s, submitted: 76
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:34.290163+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:35.290297+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 263 heartbeat osd_stat(store_statfs(0x1b5ca6000/0x0/0x1bfc00000, data 0x34c7393/0x3687000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2091663 data_alloc: 184549376 data_used: 6991872
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:36.290490+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 263 heartbeat osd_stat(store_statfs(0x1b5ca6000/0x0/0x1bfc00000, data 0x34c7393/0x3687000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:37.290682+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:38.290873+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 263 handle_osd_map epochs [263,264], i have 263, src has [1,264]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:39.291060+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:40.291177+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2094295 data_alloc: 184549376 data_used: 7004160
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 264 heartbeat osd_stat(store_statfs(0x1b5ca3000/0x0/0x1bfc00000, data 0x34c9546/0x368a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:41.291332+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:42.291494+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:43.291648+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.796175957s of 10.000386238s, submitted: 48
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 264 heartbeat osd_stat(store_statfs(0x1b5ca3000/0x0/0x1bfc00000, data 0x34c9546/0x368a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:44.291802+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 265 heartbeat osd_stat(store_statfs(0x1b5ca4000/0x0/0x1bfc00000, data 0x34c9546/0x368a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:45.292011+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2103085 data_alloc: 184549376 data_used: 7016448
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 265 heartbeat osd_stat(store_statfs(0x1b5c9c000/0x0/0x1bfc00000, data 0x34cba3d/0x3690000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151764992 unmapped: 13426688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:46.292223+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:47.292453+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 265 heartbeat osd_stat(store_statfs(0x1b5c9c000/0x0/0x1bfc00000, data 0x34cba3d/0x3690000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:48.292697+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:49.292861+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:50.293096+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 51
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2108629 data_alloc: 184549376 data_used: 7016448
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:51.293282+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151822336 unmapped: 13369344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:52.294726+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151822336 unmapped: 13369344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 266 heartbeat osd_stat(store_statfs(0x1b5c98000/0x0/0x1bfc00000, data 0x34cdf34/0x3695000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:53.294857+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.806436539s of 10.005609512s, submitted: 73
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 266 handle_osd_map epochs [266,267], i have 266, src has [1,267]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151822336 unmapped: 13369344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:54.295078+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151822336 unmapped: 13369344 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:55.295426+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c93000/0x0/0x1bfc00000, data 0x34d02b4/0x369a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2115087 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:56.295629+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:57.295978+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:58.296297+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:09:59.296619+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:00.296784+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2111621 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c95000/0x0/0x1bfc00000, data 0x34d0175/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:01.297102+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:02.297298+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:03.297654+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c94000/0x0/0x1bfc00000, data 0x34d0173/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:04.297992+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151830528 unmapped: 13361152 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:05.298386+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.142256737s of 12.224919319s, submitted: 27
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2111925 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:06.298618+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c94000/0x0/0x1bfc00000, data 0x34d0118/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:07.298851+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:08.299113+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:09.299300+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:10.299540+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2112457 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c94000/0x0/0x1bfc00000, data 0x34d0051/0x3698000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:11.299877+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:12.300183+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:13.300441+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:14.300594+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:15.300796+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2113425 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c95000/0x0/0x1bfc00000, data 0x34d00ec/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:16.300982+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c95000/0x0/0x1bfc00000, data 0x34d00ec/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:17.301163+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.165029526s of 12.236442566s, submitted: 15
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:18.301359+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:19.301574+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:20.301739+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2112911 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c96000/0x0/0x1bfc00000, data 0x34d0051/0x3698000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:21.301998+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:22.302209+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:23.302414+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:24.302662+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c96000/0x0/0x1bfc00000, data 0x34d0051/0x3698000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:25.302913+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2112735 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:26.303304+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:27.303587+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.492272377s of 10.534236908s, submitted: 8
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:28.304413+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:29.304567+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:30.304763+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c96000/0x0/0x1bfc00000, data 0x34d0051/0x3698000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2112735 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151838720 unmapped: 13352960 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:31.304980+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151846912 unmapped: 13344768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:32.305308+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151846912 unmapped: 13344768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:33.305470+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151846912 unmapped: 13344768 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:34.305692+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:35.305865+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2112237 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c97000/0x0/0x1bfc00000, data 0x34cffb6/0x3697000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:36.306086+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:37.306291+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:38.306465+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.951592445s of 10.005253792s, submitted: 11
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:39.306646+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c99000/0x0/0x1bfc00000, data 0x34cfeeb/0x3695000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:40.306843+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2112433 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:41.307018+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c98000/0x0/0x1bfc00000, data 0x34cff86/0x3696000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c98000/0x0/0x1bfc00000, data 0x34cff86/0x3696000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:42.307238+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c98000/0x0/0x1bfc00000, data 0x34cff86/0x3696000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:43.307429+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:44.307621+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:45.307857+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2111567 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c98000/0x0/0x1bfc00000, data 0x34cff86/0x3696000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:46.307997+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c99000/0x0/0x1bfc00000, data 0x34cfeeb/0x3695000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c99000/0x0/0x1bfc00000, data 0x34cfeeb/0x3695000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:47.308175+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:48.308340+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c99000/0x0/0x1bfc00000, data 0x34cfeeb/0x3695000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.922338486s of 10.966897011s, submitted: 11
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:49.308475+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c98000/0x0/0x1bfc00000, data 0x34cff86/0x3696000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:50.308653+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2115103 data_alloc: 184549376 data_used: 7028736
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:51.308814+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151855104 unmapped: 13336576 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:52.308974+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151863296 unmapped: 13328384 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b5c97000/0x0/0x1bfc00000, data 0x34d0021/0x3697000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:53.309125+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151863296 unmapped: 13328384 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b5c99000/0x0/0x1bfc00000, data 0x34cfeeb/0x3695000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:54.309309+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151863296 unmapped: 13328384 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 52
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:55.309450+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2117397 data_alloc: 184549376 data_used: 7041024
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:56.309599+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:57.309771+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:58.309954+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b5c94000/0x0/0x1bfc00000, data 0x34d22bc/0x3699000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 268 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 268 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:10:59.310113+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:00.310283+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2119885 data_alloc: 184549376 data_used: 7041024
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:01.310460+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:02.310586+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.873686790s of 14.028739929s, submitted: 52
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:03.310709+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:04.310942+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 269 heartbeat osd_stat(store_statfs(0x1b5c91000/0x0/0x1bfc00000, data 0x34d446f/0x369c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:05.311138+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2120773 data_alloc: 184549376 data_used: 7041024
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:06.311306+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:07.311510+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:08.311705+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 269 heartbeat osd_stat(store_statfs(0x1b5c91000/0x0/0x1bfc00000, data 0x34d450a/0x369d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 269 heartbeat osd_stat(store_statfs(0x1b5c91000/0x0/0x1bfc00000, data 0x34d450a/0x369d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:09.311874+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:10.312092+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 269 heartbeat osd_stat(store_statfs(0x1b5c91000/0x0/0x1bfc00000, data 0x34d450a/0x369d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2120773 data_alloc: 184549376 data_used: 7041024
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:11.312226+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:12.312370+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151724032 unmapped: 13467648 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 269 heartbeat osd_stat(store_statfs(0x1b5c8f000/0x0/0x1bfc00000, data 0x34d4640/0x369f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.966162682s of 10.000057220s, submitted: 7
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:13.312555+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 270 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:14.312695+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:15.312829+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2127661 data_alloc: 184549376 data_used: 7053312
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:16.313032+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:17.313185+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:18.313338+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 270 heartbeat osd_stat(store_statfs(0x1b5c8d000/0x0/0x1bfc00000, data 0x34d68db/0x36a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _renew_subs
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:19.313533+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b5c89000/0x0/0x1bfc00000, data 0x34d8a8e/0x36a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:20.313675+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b5c89000/0x0/0x1bfc00000, data 0x34d8a8e/0x36a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2128525 data_alloc: 184549376 data_used: 7065600
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:21.313875+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:22.314132+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b5c89000/0x0/0x1bfc00000, data 0x34d8a8e/0x36a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.879426956s of 10.004870415s, submitted: 56
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:23.314269+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151789568 unmapped: 13402112 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:24.314456+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 13082624 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:25.314635+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 13082624 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2129947 data_alloc: 184549376 data_used: 7065600
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:26.314805+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152109056 unmapped: 13082624 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 53
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:27.315046+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152264704 unmapped: 12926976 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b5c89000/0x0/0x1bfc00000, data 0x34d8b29/0x36a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:28.315199+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152264704 unmapped: 12926976 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:29.315352+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152264704 unmapped: 12926976 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:30.315538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152264704 unmapped: 12926976 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 19K writes, 73K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 19K writes, 6508 syncs, 2.94 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 43K keys, 11K commit groups, 1.0 writes per commit group, ingest: 32.13 MB, 0.05 MB/s
                                                          Interval WAL: 11K writes, 4855 syncs, 2.45 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2133459 data_alloc: 184549376 data_used: 7077888
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:31.315789+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:32.316019+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.883021355s of 10.004156113s, submitted: 224
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:33.316129+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 272 heartbeat osd_stat(store_statfs(0x1b5c86000/0x0/0x1bfc00000, data 0x34dae8f/0x36a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 272 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:34.316509+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:35.316980+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:36.317191+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:37.317665+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:38.318016+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:39.318188+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:40.318613+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:41.318794+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:42.319004+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:43.319222+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:44.319378+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:45.319585+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:46.319775+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:47.320044+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:48.320468+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:49.320615+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:50.320825+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152272896 unmapped: 12918784 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:51.321005+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:52.321182+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:53.321444+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:54.321586+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:55.321782+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:56.321950+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:57.322071+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:58.322260+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152281088 unmapped: 12910592 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:11:59.322403+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:00.322542+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:01.322741+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:02.322957+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:03.323147+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:04.323315+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:05.323449+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:06.323631+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152289280 unmapped: 12902400 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:07.323803+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:08.323995+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:09.324145+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:10.324298+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:11.324445+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:12.324797+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:13.324952+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:14.325189+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152297472 unmapped: 12894208 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:15.325383+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:16.325621+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:17.325836+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:18.326123+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:19.326281+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:20.326408+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:21.326570+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:22.326844+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152305664 unmapped: 12886016 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:23.327067+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 12877824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:24.327250+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 12877824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:25.327383+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 12877824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:26.327538+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 12877824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:27.327673+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 12877824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:28.327859+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152313856 unmapped: 12877824 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:29.327984+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152322048 unmapped: 12869632 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:30.328144+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152322048 unmapped: 12869632 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:31.328247+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:32.328408+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:33.328549+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:34.328772+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:35.328973+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:36.329351+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:37.329545+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:38.329728+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 152338432 unmapped: 12853248 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:39.329950+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:40.330188+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:41.330419+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:42.330723+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:43.330860+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:44.331146+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151764992 unmapped: 13426688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:45.331322+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151764992 unmapped: 13426688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:46.331580+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151764992 unmapped: 13426688 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:47.331861+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:48.332181+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:49.332407+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:50.332576+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c81000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:51.332801+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2136781 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:52.332994+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:53.333137+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 80.032791138s of 80.074943542s, submitted: 15
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 ms_handle_reset con 0x55d7bcfccc00 session 0x55d7b9b29e00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151543808 unmapped: 13647872 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:54.333282+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Got map version 54
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/2037590349,v1:172.18.0.106:6811/2037590349]
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:55.333464+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:56.333638+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:57.333953+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:58.334144+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:12:59.334304+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:00.334459+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _send_mon_message to mon.np0005532585 at v2:172.18.0.104:3300/0
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:01.334719+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:02.335043+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:03.335182+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:04.335506+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:05.335827+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:06.336340+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:07.336531+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:08.336748+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:09.336929+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:10.337115+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151691264 unmapped: 13500416 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:11.337324+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151699456 unmapped: 13492224 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:12.337500+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151707648 unmapped: 13484032 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:13.337820+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151707648 unmapped: 13484032 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:14.338015+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151707648 unmapped: 13484032 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:15.338192+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151707648 unmapped: 13484032 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:16.338440+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151707648 unmapped: 13484032 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:17.338659+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151707648 unmapped: 13484032 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:18.338929+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:19.339115+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:20.339288+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:21.339507+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:22.339709+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:23.339942+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:24.340114+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:25.340291+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151715840 unmapped: 13475840 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:26.340477+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:27.340674+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:28.340951+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:29.341168+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:30.341307+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:31.341454+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:32.341645+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 ms_handle_reset con 0x55d7bc438000 session 0x55d7b911be00
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: handle_auth_request added challenge on 0x55d7bc438000
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:33.341803+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:34.341958+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:35.342174+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151732224 unmapped: 13459456 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:36.342366+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:37.342507+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:38.342703+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:39.342929+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:40.343087+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:41.343300+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151740416 unmapped: 13451264 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:42.343529+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:43.343713+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:44.343869+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:45.343997+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:46.344136+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:47.344347+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:48.344579+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151748608 unmapped: 13443072 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:49.344793+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:50.345078+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:51.345313+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:52.345514+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:53.345686+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:54.345961+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151773184 unmapped: 13418496 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:55.346189+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:56.346361+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:57.346532+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151781376 unmapped: 13410304 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:58.346746+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:13:59.346925+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:00.347152+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:01.347337+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:02.347544+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:03.347695+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:04.347928+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:05.348059+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151797760 unmapped: 13393920 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:06.348207+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:07.348388+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:08.348683+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:09.349097+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:10.349258+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:11.349461+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:12.349628+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:13.349743+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:14.349936+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151805952 unmapped: 13385728 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:15.350091+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:16.350225+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:17.350370+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:18.350523+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:19.350682+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:20.350795+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:21.350955+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:22.351084+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:23.351202+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151814144 unmapped: 13377536 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b5c82000/0x0/0x1bfc00000, data 0x34dd0fd/0x36ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:24.351323+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'config diff' '{prefix=config diff}'
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'config show' '{prefix=config show}'
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'counter dump' '{prefix=counter dump}'
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151437312 unmapped: 13754368 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:25.351448+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'counter schema' '{prefix=counter schema}'
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151216128 unmapped: 13975552 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: tick
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_tickets
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-23T10:14:26.351559+0000)
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.307692
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0727273
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: bluestore.MempoolThread(0x55d7b81f7b60) _resize_shards cache_size: 2222054675 kv_alloc: 922746880 kv_used: 2144 kv_onode_alloc: 218103808 kv_onode_used: 464 meta_alloc: 855638016 meta_used: 2135901 data_alloc: 184549376 data_used: 7090176
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: prioritycache tune_memory target: 3561601228 mapped: 151511040 unmapped: 13680640 heap: 165191680 old mem: 2222054675 new mem: 2222054675
Nov 23 10:14:57 np0005532585.localdomain ceph-osd[31905]: do_command 'log dump' '{prefix=log dump}'
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/238444029' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain rsyslogd[760]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1454252422' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Nov 23 10:14:57 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1074650265' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1760421930' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2401713440' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain sudo[343278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 23 10:14:58 np0005532585.localdomain sudo[343278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:14:58 np0005532585.localdomain sudo[343278]: pam_unix(sudo:session): session closed for user root
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3752014094' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/238444029' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1472507372' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3179017315' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/789313433' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1454252422' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1806976131' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1074650265' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3710450185' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1386348520' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/4083050300' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1996105483' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1760421930' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2401713440' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain sudo[343317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 23 10:14:58 np0005532585.localdomain sudo[343317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd metadata"} v 0)
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3972546766' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Nov 23 10:14:58 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1116885043' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain sudo[343317]: pam_unix(sudo:session): session closed for user root
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd utilization"} v 0)
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1645843298' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2410477169' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain sudo[343519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 23 10:14:59 np0005532585.localdomain sudo[343519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 23 10:14:59 np0005532585.localdomain sudo[343519]: pam_unix(sudo:session): session closed for user root
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd utilization"} v 0)
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3818470369' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: pgmap v776: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/603213704' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1750636861' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1252139861' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1159776330' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3972546766' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1116885043' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3682782114' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1091272045' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/4180868075' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/449342979' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1645843298' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2410477169' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3818470369' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 23 10:14:59 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:14:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:14:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 10:14:59 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:14:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 10:15:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:15:00 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 10:15:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:15:00 np0005532585.localdomain openstack_network_exporter[242668]: ERROR   10:15:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 10:15:00 np0005532585.localdomain openstack_network_exporter[242668]: 
Nov 23 10:15:00 np0005532585.localdomain systemd[1]: Starting Hostname Service...
Nov 23 10:15:00 np0005532585.localdomain systemd[1]: Started Hostname Service.
Nov 23 10:15:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:00.287 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 10:15:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:00.289 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:15:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:00.289 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 23 10:15:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:00.289 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:15:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:00.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 23 10:15:00 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:00.296 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.49824 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.49818 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.69350 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.59377 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.59371 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/315890215' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.32:0/315890215' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 23 10:15:00 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3312852132' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "versions"} v 0)
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/386555267' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.49830 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.69356 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: pgmap v777: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.69362 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.49836 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.59383 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.59389 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.49842 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.69368 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.69374 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.59398 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.69392 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.49866 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1306439543' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3312852132' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1749679263' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/908857166' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/386555267' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Nov 23 10:15:01 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4009233458' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/137974925' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.59419 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.69401 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.49881 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.59434 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.69419 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.49893 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.59449 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1732530448' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/4009233458' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3972355834' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/137974925' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3994683674' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1804286699' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3422511344' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config dump"} v 0)
Nov 23 10:15:02 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1179237640' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: pgmap v778: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='client.69431 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='client.59461 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='client.49905 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/1179237640' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2571751432' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3696670753' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Nov 23 10:15:03 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3527446563' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df"} v 0)
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/558626968' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "fs dump"} v 0)
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2388337302' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.59524 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.49971 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.69494 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3527446563' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/221772974' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3296177126' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/558626968' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/3501950428' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 23 10:15:04 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3809090580' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "fs ls"} v 0)
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2111585222' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 23 10:15:05 np0005532585.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 23 10:15:05 np0005532585.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 23 10:15:05 np0005532585.localdomain kernel: cfg80211: failed to load regulatory.db
Nov 23 10:15:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:05.290 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:15:05 np0005532585.localdomain nova_compute[281952]: 2025-11-23 10:15:05.296 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: pgmap v779: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2388337302' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/1233460552' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2347069213' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2111585222' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2104826111' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 23 10:15:05 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3419578660' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mds stat"} v 0)
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2539135732' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump"} v 0)
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/458824040' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.50010 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.50016 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: pgmap v780: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.69530 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/2539135732' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2406866128' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/1359836781' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/458824040' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 23 10:15:06 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2837804128' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/137774513' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.59578 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/2656935356' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.50043 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/137774513' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.69557 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/33475535' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 23 10:15:07 np0005532585.localdomain ceph-mon[300199]: from='client.59590 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 10:15:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 10:15:07 np0005532585.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 10:15:08 np0005532585.localdomain podman[344924]: 2025-11-23 10:15:08.048639943 +0000 UTC m=+0.099663683 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 10:15:08 np0005532585.localdomain podman[344925]: 2025-11-23 10:15:08.101495287 +0000 UTC m=+0.143283543 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., vcs-type=git)
Nov 23 10:15:08 np0005532585.localdomain podman[344923]: 2025-11-23 10:15:08.142523768 +0000 UTC m=+0.192433844 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:15:08 np0005532585.localdomain podman[344923]: 2025-11-23 10:15:08.176580465 +0000 UTC m=+0.226490551 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 10:15:08 np0005532585.localdomain systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 10:15:08 np0005532585.localdomain podman[344925]: 2025-11-23 10:15:08.193645469 +0000 UTC m=+0.235433715 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 10:15:08 np0005532585.localdomain systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 10:15:08 np0005532585.localdomain podman[344924]: 2025-11-23 10:15:08.230163031 +0000 UTC m=+0.281186761 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 23 10:15:08 np0005532585.localdomain systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd dump"} v 0)
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3932864351' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.108:0/3329296311' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: pgmap v781: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.50058 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.59596 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.69575 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.50067 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.107:0/3932864351' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.69587 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: from='client.? 172.18.0.106:0/2128491971' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Nov 23 10:15:08 np0005532585.localdomain ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/674048425' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
Nov 23 10:15:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:15:09.312 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 10:15:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:15:09.312 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 10:15:09 np0005532585.localdomain ovn_metadata_agent[160434]: 2025-11-23 10:15:09.313 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
